Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology. Jeffrey R. Edwards University of North Carolina


2 Outline I. Types of Difference Scores II. Questions Difference Scores Are Intended To Address III. Problems With Difference Scores IV. An Alternative Procedure V. The Matrix Approach to Testing Constraints VI. Analyzing Quadratic Regression Equations Using Response Surface Methodology VII. Moderated Polynomial Regression VIII. Mediated Polynomial Regression IX. Difference Scores As Dependent Variables X. Answers to Frequently Asked Questions 2

3 Types of Difference Scores: Univariate Algebraic difference: (X - Y) Absolute difference: |X - Y| Squared difference: (X - Y)²

4 Types of Difference Scores: Multivariate Sum of algebraic differences: Σ(Xi - Yi) = D1 Sum of absolute differences: Σ|Xi - Yi| = |D| Sum of squared differences: Σ(Xi - Yi)² = D² Euclidean distance: √[Σ(Xi - Yi)²] = D Profile correlation: Σ(Xi - X̄)(Yi - Ȳ) / √[Σ(Xi - X̄)² Σ(Yi - Ȳ)²] = Q
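The indices above are straightforward to compute directly; a minimal sketch in Python, using made-up profile data rather than a statistics package:

```python
import math

def profile_indices(x, y):
    """Profile similarity indices for two equal-length score profiles."""
    d = [xi - yi for xi, yi in zip(x, y)]
    d1 = sum(d)                             # sum of algebraic differences, D1
    abs_d = sum(abs(di) for di in d)        # sum of absolute differences, |D|
    d2 = sum(di ** 2 for di in d)           # sum of squared differences, D2
    euclid = math.sqrt(d2)                  # Euclidean distance, D
    # Profile correlation Q: correlation across components, centering
    # each profile at its own mean.
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = math.sqrt(sum((xi - mx) ** 2 for xi in x) *
                    sum((yi - my) ** 2 for yi in y))
    return {"D1": d1, "absD": abs_d, "D2": d2, "D": euclid, "Q": num / den}
```

Note that perfectly opposite profiles (e.g., [1, 2, 3] versus [3, 2, 1]) give D1 = 0 even though the profiles disagree on every component, which illustrates why the indices are not interchangeable.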

5 Questions Difference Scores are Intended to Address How well do characteristics of the job fit the needs or desires of the employee? To what extent do job demands exceed or fall short of the abilities of the person? Are prior expectations of the employee met by actual job experiences? What is the degree of similarity between perceptions or beliefs of supervisors and subordinates? Do the values of the person match the culture of the organization? Can novices provide performance evaluations that agree with expert ratings? 5

6 Data Used for Running Illustration Data were collected from 373 MBA students who were engaged in the recruiting process. Respondents rated the actual and desired amounts of various job attributes and their anticipated satisfaction concerning a job for which they had recently interviewed. The actual and desired measures had three items each and used 7-point response scales ranging from none at all to a very great amount. The satisfaction measure had three items and used a 7-point response scale ranging from strongly disagree to strongly agree. The job attributes used for illustration are autonomy, prestige, span of control, and travel.

7 Problems with Difference Scores: Reliability When component measures are positively correlated, difference scores are often less reliable than either component. The formula for the reliability of an algebraic difference is (Johns, 1981): α(x-y) = (σx² rxx + σy² ryy - 2 rxy σx σy) / (σx² + σy² - 2 rxy σx σy)

8 Problems with Difference Scores: Reliability To illustrate, if X and Y have unit variances, have reliabilities of .75, and are correlated .50, the reliability of X - Y equals: α(x-y) = (.75 + .75 - 2(.50)) / (1 + 1 - 2(.50)) = .50 / 1.00 = .50
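Johns's (1981) formula is easy to check numerically; a small sketch (the function name and argument names are illustrative, not from the slides):

```python
import math

def alpha_diff(var_x, var_y, rel_x, rel_y, r_xy):
    """Reliability of the algebraic difference X - Y (Johns, 1981).

    var_x, var_y: variances of the components
    rel_x, rel_y: reliabilities of the components
    r_xy: correlation between the components
    """
    cov = r_xy * math.sqrt(var_x) * math.sqrt(var_y)
    return (var_x * rel_x + var_y * rel_y - 2 * cov) / (var_x + var_y - 2 * cov)

# Unit variances, reliabilities .75, correlation .50 -> reliability .50
r = alpha_diff(1.0, 1.0, 0.75, 0.75, 0.50)
```

Raising the correlation between the components while holding everything else fixed lowers the result, which is the point of the slide: more overlap between X and Y means a less reliable difference.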

9 Example: Reliability of the Algebraic Difference for Autonomy For autonomy, the actual amount (X) and desired amount (Y) measures had reliabilities of .89 and .85, variances of 1.16 and 0.88, and a correlation of .51. Hence, the reliability of the algebraic difference (X - Y) is: α(x-y) = (1.16(.89) + 0.88(.85) - 2(.51)(1.08)(.94)) / (1.16 + 0.88 - 2(.51)(1.08)(.94)) = .74 Note that this reliability is lower than the reliabilities of X and Y.

10 Reliabilities of Other Types of Difference Scores Reliabilities of other difference scores can be estimated using procedures for the reliabilities of squares, products, and linear combinations of variables. For example, a squared difference can be written as a linear combination of X², XY, and Y²: (X - Y)² = X² - 2XY + Y² The reliability of this expression can be derived by combining procedures described by Nunnally (1978) and Bohrnstedt and Marwell (1978).

11 Reliabilities of Other Types of Difference Scores The reliabilities of profile similarity indices such as D1, |D|, and D² can also be derived by applying Nunnally (1978) and Bohrnstedt and Marwell (1978). The reliabilities of the squared and product terms that constitute |X - Y|, (X - Y)², |D|, and D² involve the means of X and Y, which are arbitrary for measures that use interval rather than ratio scales. Profile similarity indices usually collapse conceptually distinct dimensions, which obscures the meaning of their true scores and, thus, their reliabilities.

12 Problems with Difference Scores: Conceptual Ambiguity It might seem that component variables are reflected equally in a difference score, given that the components are implicitly assigned the same weight when the difference score is constructed. However, the variance of a difference score depends on the variances and covariances of the component measures, which are sample dependent. When one component is a constant, the variance of a difference score is solely due to the other component, i.e., the one that varies. For instance, when P-O fit is assessed in a single organization, the P-O difference solely represents variation in the person scores. 12

13 Variance of an Algebraic Difference Score The variance of an algebraic difference score can be computed using the following formula for the variance of a weighted linear combination of random variables: V(aX + bY) = a²V(X) + b²V(Y) + 2abC(X,Y) For the algebraic difference score (X - Y), a = +1 and b = -1, which yields: V(X - Y) = V(X) + V(Y) - 2C(X,Y) Thus, X and Y contribute equally to V(X - Y) only when V(X) and V(Y) happen to be equal.
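The linear-combination formula in code, applied to the autonomy values from the running example (the covariance is taken as approximately .51, matching the rounding used in the slides):

```python
def var_linear_combo(a, b, var_x, var_y, cov_xy):
    """Variance of aX + bY: a^2 V(X) + b^2 V(Y) + 2ab C(X,Y)."""
    return a ** 2 * var_x + b ** 2 * var_y + 2 * a * b * cov_xy

# Algebraic difference X - Y: a = +1, b = -1, so the covariance
# enters with a negative sign.
v = var_linear_combo(1, -1, 1.16, 0.88, 0.51)
```

Because V(X) = 1.16 exceeds V(Y) = 0.88, the result is driven more by X than by Y, which is the asymmetry the slide describes.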

14 Example: Variance of the Algebraic Difference for Autonomy For autonomy, the variance of X is 1.16, and the variance of Y is 0.88. The covariance between X and Y is their correlation multiplied by the product of their standard deviations, which is approximately .51. Using these quantities, the variance of (X - Y) is: V(X - Y) = 1.16 + 0.88 - 2(.51) = 1.02 V(X - Y) depends more on V(X) than V(Y) and also incorporates C(X,Y). Thus, V(X - Y) does not reflect V(X) and V(Y) in equal proportions.

15 Variances of Other Types of Difference Scores Variances of difference scores involving higher-order terms, such as (X - Y)², can be computed using rules for the variances of products of random variables (Bohrnstedt & Goldberger, 1969; Goodman, 1960). These formulas involve the means of X and Y, which are arbitrary when X and Y are measured on interval rather than ratio scales. Nonetheless, it is reasonable to assume that the components do not contribute equally, particularly when the number of components becomes large.

16 Problems with Difference Scores: Confounded Effects Difference scores confound the effects of the components of the difference. For example, an equation using an algebraic difference as a predictor can be written as: Z = b0 + b1(X - Y) + e In this equation, b1 can reflect a positive relationship for X, a negative relationship for Y, or some combination thereof.

17 Problems with Difference Scores: Confounded Effects Some researchers have attempted to address this confound by controlling for one component of the difference. For an algebraic difference, this yields: Z = b0 + b1X + b2(X - Y) + e However, controlling for X simply transforms the algebraic difference into a partialled measure of Y (Wall & Payne, 1973): Z = b0 + (b1 + b2)X - b2Y + e Thus, b2 is not the effect of (X - Y), but instead is the negative of the effect of Y, controlling for X.

18 Problems with Difference Scores: Confounded Effects The effects of X and Y are easier to interpret if X and Y are used as separate predictors: Z = b0 + b1X + b2Y + e The R² from this equation is the same as that from the equation using (X - Y) and X as predictors, but its interpretation is more straightforward.

19 Example: Confounded Effects for the Algebraic Difference for Autonomy Results using (X Y): Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTALD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

20 Example: Confounded Effects for the Algebraic Difference for Autonomy Results using X and (X Y): Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTALD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

21 Example: Confounded Effects for the Algebraic Difference for Autonomy Results using X and Y: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

22 Problems with Difference Scores: Confounded Effects Other researchers have controlled for X, Y, or (X - Y) when using |X - Y| or (X - Y)² as predictors. For example: Z = b0 + b1(X - Y) + b2(X - Y)² + e Although this approach might seem to provide a conservative test for (X - Y)², the term b1(X - Y) merely shifts the minimum of the U-shaped curve captured by (X - Y)². Specifically, if b1 is positive, the minimum of the curve is shifted to the left, and if b1 is negative, the minimum is shifted to the right.

23 Example: Confounded Effects for the Squared Difference for Autonomy Results using (X Y) 2 : Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTSQD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

24 Example: Confounded Effects for the Squared Difference for Autonomy Plot of (X - Y)²: [plot of Satisfaction against Actual - Desired]

25 Example: Confounded Effects for the Squared Difference for Autonomy Results using (X Y) and (X Y) 2 : Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTALD AUTSQD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

26 Example: Confounded Effects for the Squared Difference for Autonomy Plot of (X - Y) and (X - Y)²: [plot of Satisfaction against Actual - Desired]

27 Problems with Difference Scores: Confounded Effects Analogously, X and Y have been controlled in equations using |X - Y| as a predictor: Z = b0 + b1X + b2Y + b3|X - Y| + e Controlling for X and Y does not provide a conservative test of |X - Y|. Rather, it alters the tilt of the V-shaped function indicated by |X - Y|. For example, if b1 = -b2 and b1 = b3, the left side is horizontal and the right side is positively sloped, and if b1 = -b2 and b2 = b3, the right side is horizontal and the left side is negatively sloped.

28 Example: Confounded Effects for the Absolute Difference for Autonomy Results using X Y : Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTABD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

29 Example: Confounded Effects for the Absolute Difference for Autonomy Plot of |X - Y|: [plot of Satisfaction against Actual - Desired]

30 Example: Confounded Effects for the Absolute Difference for Autonomy Results using (X - Y) and |X - Y|: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTALD AUTABD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

31 Example: Confounded Effects for the Absolute Difference for Autonomy Plot of (X - Y) and |X - Y|: [plot of Satisfaction against Actual - Desired]

32 Problems with Difference Scores: Untested Constraints Difference scores impose untested constraints on the coefficients relating X and Y to Z. The constraints imposed by an algebraic difference can be seen with the following equation: Z = b0 + b1(X - Y) + e Expansion yields: Z = b0 + b1X - b1Y + e

33 Problems with Difference Scores: Untested Constraints Now, consider an equation that uses X and Y as separate predictors: Z = b0 + b1X + b2Y + e Comparing this equation to the previous equation shows that using (X - Y) as a predictor constrains the coefficients on X and Y to be equal in magnitude but opposite in sign (i.e., b1 = -b2, or b1 + b2 = 0). This constraint should not be imposed on the data but instead should be treated as a hypothesis to be tested.

34 Example: Constrained and Unconstrained Algebraic Difference for Autonomy Results using (X Y): Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTALD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

35 Example: Constrained and Unconstrained Algebraic Difference for Autonomy Results using X and Y: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

36 Example: Constrained and Unconstrained Algebraic Difference for Autonomy Constrained and unconstrained results:
X Y R²
Constrained 0.39** -0.39** .12**
Unconstrained 0.45** ** .13**

37 Problems with Difference Scores: Untested Constraints The constraints imposed by an absolute difference can be seen using a piecewise linear equation: Z = b0 + b1(1 - 2W)(X - Y) + e When (X - Y) is positive or zero, W = 0, and the term (1 - 2W)(X - Y) becomes (X - Y). When (X - Y) is negative, W = 1, and (1 - 2W)(X - Y) equals -(X - Y). Thus, W switches the sign on (X - Y) only when it is negative, producing an absolute value transformation. Expanding the equation yields: Z = b0 + b1X - b1Y - 2b1WX + 2b1WY + e
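The switching variable W can be verified directly; a small sketch (names are illustrative):

```python
def piecewise_terms(x, y):
    """Indicator W and the term (1 - 2W)(X - Y) used to express |X - Y|.

    W = 0 when (X - Y) >= 0 and W = 1 when (X - Y) < 0, so the sign of
    (X - Y) is flipped only when the difference is negative.
    """
    d = x - y
    w = 1 if d < 0 else 0
    return w, (1 - 2 * w) * d

# (1 - 2W)(X - Y) reproduces the absolute difference on both sides:
assert piecewise_terms(5, 2) == (0, 3)   # X > Y: W = 0, term = X - Y
assert piecewise_terms(2, 5) == (1, 3)   # X < Y: W = 1, term = -(X - Y)
assert piecewise_terms(4, 4) == (0, 0)   # X = Y
```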

38 Problems with Difference Scores: Untested Constraints Now, consider a piecewise equation using X and Y: Z = b0 + b1X + b2Y + b3W + b4WX + b5WY + e Comparing this equation to the previous equation shows that |X - Y| imposes four constraints: b1 = -b2, or b1 + b2 = 0 b4 = -b5, or b4 + b5 = 0 b3 = 0 b4 = -2b1, or 2b1 + b4 = 0 These constraints should be treated as hypotheses to be tested empirically.

39 Example: Constrained and Unconstrained Absolute Difference for Autonomy Results using X Y : Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTABD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

40 Example: Constrained and Unconstrained Absolute Difference for Autonomy Results using X, Y, W, WX, and WY: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD AUTW AUTW*AUTCA AUTW*AUTCD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

41 Example: Constrained and Unconstrained Absolute Difference for Autonomy Constrained and unconstrained results: X Y W WX WY R 2 Constrained ** 0.53 ** ** **.11 ** Unconstrained ** **.16 ** 41

42 Problems with Difference Scores: Untested Constraints The constraints imposed by a squared difference can be seen with the following equation: Z = b0 + b1(X - Y)² + e Expansion yields: Z = b0 + b1X² - 2b1XY + b1Y² + e Thus, a squared difference implicitly treats Z as a function of X², XY, and Y².

43 Problems with Difference Scores: Untested Constraints Now, consider a quadratic equation using X and Y: Z = b0 + b1X + b2Y + b3X² + b4XY + b5Y² + e Comparing this equation to the previous equation shows that (X - Y)² imposes four constraints: b1 = 0 b2 = 0 b3 = b5, or b3 - b5 = 0 b3 + b4 + b5 = 0 Again, these constraints should be treated as hypotheses to be tested empirically, not simply imposed on the data.
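The four constraints can be checked mechanically against an estimated coefficient vector; a sketch (the function name and tolerance are illustrative):

```python
def satisfies_squared_diff(b1, b2, b3, b4, b5, tol=1e-8):
    """True when quadratic coefficients collapse to b(X - Y)^2:
    b1 = 0, b2 = 0, b3 = b5, and b3 + b4 + b5 = 0."""
    checks = (b1, b2, b3 - b5, b3 + b4 + b5)
    return all(abs(c) < tol for c in checks)

# The expansion (X - Y)^2 = X^2 - 2XY + Y^2 satisfies all four checks,
# while a surface with a linear term on X does not.
ok = satisfies_squared_diff(0, 0, 1, -2, 1)
bad = satisfies_squared_diff(0.3, 0, 1, -2, 1)
```

In practice the constraints are tested statistically rather than checked for exact equality, as the later slides on the matrix approach show; this sketch only verifies the algebra.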

44 Example: Constrained and Unconstrained Squared Difference for Autonomy Results using (X Y) 2 : Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTSQD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

45 Example: Constrained and Unconstrained Squared Difference for Autonomy Results using X, Y, X 2, XY, and Y 2 : Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD AUTCA AUTCAD AUTCD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

46 Example: Constrained and Unconstrained Squared Difference for Autonomy Constrained and unconstrained results: X Y X 2 XY Y 2 R 2 Constrained ** 0.36 ** **.10 ** Unconstrained 0.20 * ** ** ** 46

47 Problems with Difference Scores: Dimensional Reduction Difference scores reduce the three-dimensional relationship of X and Y with Z to two dimensions. The linear algebraic difference function represents a symmetric plane with equal but opposite slopes with respect to the X-axis and Y-axis. The U-shaped squared difference function represents a symmetric U-shaped surface with its minimum (or maximum) running along the X = Y line. The V-shaped absolute difference function represents a symmetric V-shaped surface with its minimum (or maximum) running along the X = Y line. 47

48 Two-Dimensional Algebraic Difference Function [plot of Z against (X - Y)]

49 Three-Dimensional Algebraic Difference Function 49

50 Two-Dimensional Absolute Difference Function [plot of Z against (X - Y)]

51 Three-Dimensional Absolute Difference Function 51

52 Two-Dimensional Squared Difference Function [plot of Z against (X - Y)]

53 Three-Dimensional Squared Difference Function 53

54 Two-Dimensional Algebraic Difference Function for Autonomy [plot of Satisfaction against Actual - Desired]

55 Three-Dimensional Algebraic Difference Function for Autonomy 55

56 Two-Dimensional Absolute Difference Function for Autonomy [plot of Satisfaction against Actual - Desired]

57 Three-Dimensional Absolute Difference Function for Autonomy 57

58 Two-Dimensional Squared Difference Function for Autonomy [plot of Satisfaction against Actual - Desired]

59 Three-Dimensional Squared Difference Function for Autonomy 59

60 Problems with Difference Scores: Dimensional Reduction These surfaces represent only three of the many possible surfaces depicting how X and Y may be related to Z. This problem is compounded by the use of profile similarity indices, which collapse a series of three-dimensional surfaces into a single two-dimensional function. 60

61 An Alternative Procedure The relationship of X and Y with Z should be viewed in three dimensions, with X and Y constituting the two horizontal axes and Z constituting the vertical axis. Analyses should focus not on two-dimensional functions relating the difference between X and Y to Z, but instead on three-dimensional surfaces depicting the joint relationship of X and Y with Z. Constraints should not be simply imposed on the data, but instead should be viewed as hypotheses that, if confirmed, lend support to the conceptual model upon which the difference score is based. 61

62 Confirmatory Approach When a difference score represents a hypothesis that is predicted a priori, the alternative procedure should be applied using the confirmatory approach. The R² for the unconstrained equation should be significant. The coefficients in the unconstrained equation should follow the pattern indicated by the difference score. The constraints implied by the difference score should not be rejected. The set of terms one order higher than those in the unconstrained equation should not be significant.

63 Example: Confirmatory Test of Algebraic Difference for Autonomy Unconstrained equation: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

64 Example: Confirmatory Test of Algebraic Difference for Autonomy Unconstrained surface: 64

65 Example: Confirmatory Test of Algebraic Difference for Autonomy The first condition is met, because the R 2 from the unconstrained equation is significant. The second condition is met, because the coefficients on X and Y are significant and in the expected direction. For the third condition, testing the constraints imposed by the algebraic difference is the same as testing the difference in R 2 between the constrained and unconstrained equations. 65

66 Example: Confirmatory Test of Algebraic Difference for Autonomy Constrained equation: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTALD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

67 Example: Confirmatory Test of Algebraic Difference for Autonomy Constrained surface: 67

68 Example: Confirmatory Test of Algebraic Difference for Autonomy The general formula for the difference in R² between two regression equations is: F = [(R²U - R²C) / (dfC - dfU)] / [(1 - R²U) / dfU] The test of the constraint imposed by the algebraic difference for autonomy, with R²U = .127 and dfU = 357, yields: F = 4.91, p < .05 The constraint is rejected, so the third condition is not satisfied.
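The F test for the difference in R² between nested equations is a one-liner in code; a sketch, illustrated with the linear and quadratic R² values and residual degrees of freedom that appear in the running autonomy example:

```python
def delta_r2_f(r2_u, r2_c, df_c, df_u):
    """F statistic for the R^2 difference between a constrained and an
    unconstrained (nested) regression; the numerator degrees of freedom
    equal df_c - df_u, the number of constraints tested."""
    return ((r2_u - r2_c) / (df_c - df_u)) / ((1 - r2_u) / df_u)

# Linear (R^2 = .127, residual df = 357) vs. quadratic (R^2 = .169,
# residual df = 354): three added terms.
f = delta_r2_f(0.169, 0.127, 357, 354)
```

To obtain a p-value, the statistic would be referred to an F distribution with (df_c - df_u, df_u) degrees of freedom (e.g., via scipy.stats.f).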

69 Example: Confirmatory Test of Algebraic Difference for Autonomy For the fourth condition, the unconstrained equation for the algebraic difference is linear, so the higher-order terms are the three quadratic terms X², XY, and Y². Testing the three quadratic terms as a set is the same as testing the difference in R² between the linear and quadratic equations.

70 Example: Confirmatory Test of Algebraic Difference for Autonomy Quadratic equation: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD AUTCA AUTCAD AUTCD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

71 Example: Confirmatory Test of Algebraic Difference for Autonomy The test of the higher-order terms associated with the algebraic difference for autonomy is: F = [(.169 - .127) / (357 - 354)] / [(1 - .169) / 354] = 5.96, p < .05 The higher-order terms are significant, so the fourth condition is not satisfied.

72 Example: Confirmatory Test of Absolute Difference for Autonomy Unconstrained equation: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD AUTW AUTCAW AUTCDW Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

73 Example: Confirmatory Test of Absolute Difference for Autonomy Unconstrained surface: 73

74 Example: Confirmatory Test of Absolute Difference for Autonomy The first condition is met, because the R² from the unconstrained equation is significant. The second condition is not met, because the coefficients on X and Y are not both significant and in the expected direction. For the third condition, testing the constraints imposed by the absolute difference is the same as testing the difference in R² between the constrained and unconstrained equations.

75 Example: Confirmatory Test of Absolute Difference for Autonomy Constrained equation: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTABD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

76 Example: Confirmatory Test of Absolute Difference for Autonomy Constrained surface: 76

77 Example: Confirmatory Test of Absolute Difference for Autonomy The test of the constraints imposed by the absolute difference for autonomy, with R²U = .159 and dfU = 354, yields: F = 5.68, p < .05 The constraints are rejected, so the third condition is not satisfied.

78 Example: Confirmatory Test of Absolute Difference for Autonomy For the fourth condition, the unconstrained equation for the absolute difference is piecewise linear, so the higher-order terms are the six quadratic terms X², XY, Y², WX², WXY, and WY². Testing the six quadratic terms as a set is the same as testing the difference in R² between the piecewise linear and piecewise quadratic equations.

79 Example: Confirmatory Test of Absolute Difference for Autonomy Piecewise quadratic equation: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD AUTW AUTCAW AUTCDW AUTCA AUTCAD AUTCD AUTCA2W AUTCADW AUTCD2W Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

80 Example: Confirmatory Test of Absolute Difference for Autonomy The test of the higher-order terms associated with the absolute difference for autonomy is: F = [(.185 - .159) / (354 - 348)] / [(1 - .185) / 348] = 1.85, p > .05 The higher-order terms are not significant, so the fourth condition is satisfied.

81 Example: Confirmatory Test of Squared Difference for Autonomy Unconstrained equation: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD AUTCA AUTCAD AUTCD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

82 Example: Confirmatory Test of Squared Difference for Autonomy Unconstrained surface: 82

83 Example: Confirmatory Test of Squared Difference for Autonomy The first condition is met, because the R 2 from the unconstrained equation is significant. The second condition is not met, because the coefficients on X and Y are significant, and the coefficients on X 2 and Y 2 are not significant. For the third condition, testing the constraints imposed by the squared difference is the same as testing the difference in R 2 between the constrained and unconstrained equations. 83

84 Example: Confirmatory Test of Squared Difference for Autonomy Constrained equation: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTSQD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

85 Example: Confirmatory Test of Squared Difference for Autonomy Constrained surface: 85

86 Example: Confirmatory Test of Squared Difference for Autonomy The test of the constraints imposed by the squared difference for autonomy, with R²U = .169 and dfU = 354, yields: F = 7.77, p < .05 The constraints are rejected, so the third condition is not satisfied.

87 Example: Confirmatory Test of Squared Difference for Autonomy For the fourth condition, the unconstrained equation for the squared difference is quadratic, so the higher-order terms are the four cubic terms X³, X²Y, XY², and Y³. Testing the four cubic terms as a set is the same as testing the difference in R² between the quadratic and cubic equations.

88 Example: Confirmatory Test of Squared Difference for Autonomy Cubic equation: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD AUTCA AUTCAD AUTCD AUTCA AUTCA2D AUTCAD AUTCD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

89 Example: Confirmatory Test of Squared Difference for Autonomy The test of the higher-order terms associated with the squared difference for autonomy is: F = [(.190 - .169) / (354 - 350)] / [(1 - .190) / 350] = 2.27, p > .05 The higher-order terms are not significant, so the fourth condition is satisfied.

90 Exploratory Approach When no a priori hypothesis is predicted, the alternative procedure can be applied using the exploratory approach. The analysis begins by using the linear terms X and Y as predictors. If the R² is not significant, the procedure stops, with the conclusion that X and Y are unrelated to Z. If the R² from the linear equation is significant, then the quadratic terms X², XY, and Y² are added as a set, and the increment in R² is tested. If the increment in R² is not significant, the linear equation is retained. If the increment in R² from the quadratic terms is significant, the four cubic terms X³, X²Y, XY², and Y³ are added, and the increment in R² is tested. If the increment is not significant, the quadratic equation is retained.

91 Exploratory Approach The foregoing procedure continues, adding higher-order terms in sets and stopping when the increment in R 2 is not significant. The tests of higher-order terms involved in the exploratory procedure are susceptible to outliers and influential cases, and therefore regression diagnostics should be applied. Like any exploratory analysis, the exploratory procedure described here can produce results that do not generalize beyond the sample in hand. Therefore, the obtained results should be considered tentative, pending cross-validation. It is folly to construct elaborate post-hoc interpretations of complex surfaces that are not both generalizable and conceptually meaningful (Edwards, 1994, p. 74). 91
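The stopping rule above can be sketched as a loop over nested models. For simplicity, this sketch takes each model's R² and residual df as given and compares every increment against a single supplied critical value, which is a simplification (the critical F actually changes with the degrees of freedom at each step):

```python
def highest_retained_order(steps, f_crit):
    """Walk nested models from lowest to highest order.

    steps: list of (r2, resid_df) pairs, e.g. linear, quadratic, cubic.
    Returns the index of the last model whose increment in R^2 yields
    F > f_crit (0 = the first model in the list).
    """
    retained = 0
    r2_prev, df_prev = steps[0]
    for i in range(1, len(steps)):
        r2, df = steps[i]
        f = ((r2 - r2_prev) / (df_prev - df)) / ((1 - r2) / df)
        if f <= f_crit:
            break
        retained = i
        r2_prev, df_prev = r2, df
    return retained

# Autonomy example: the quadratic increment is significant and the
# cubic increment is not, so the quadratic model (index 1) is retained.
order = highest_retained_order([(0.127, 357), (0.169, 354), (0.190, 350)], 2.6)
```

As the slide cautions, such a mechanical search should be paired with regression diagnostics and cross-validation before any surface is interpreted.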

92 Example: Exploratory Analyses for Autonomy Linear equation: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

93 Example: Exploratory Analyses for Autonomy Quadratic equation: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD AUTCA AUTCAD AUTCD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

94 Example: Exploratory Analyses for Autonomy The test of the quadratic terms for autonomy is: F = [(.169 - .127) / (357 - 354)] / [(1 - .169) / 354] = 5.96, p < .05 The quadratic terms are significant, so the cubic equation is estimated.

95 Example: Exploratory Analyses for Autonomy Cubic equation: Dep Var: SAT N: 360 Multiple R: Squared multiple R: Adjusted squared multiple R: Standard error of estimate: Effect Coefficient Std Error Std Coef Tolerance t P(2 Tail) CONSTANT AUTCA AUTCD AUTCA AUTCAD AUTCD AUTCA AUTCA2D AUTCAD AUTCD Analysis of Variance Source Sum-of-Squares df Mean-Square F-ratio P Regression Residual

96 Example: Exploratory Analyses for Autonomy The test of the cubic terms for autonomy is: F = [(.190 - .169) / (354 - 350)] / [(1 - .190) / 350] = 2.27, p > .05 The cubic terms are not significant, so the quadratic equation is retained.

97 The Matrix Approach to Testing Constraints The constraints imposed by difference scores and certain profile similarity indices (i.e., D 1, D, D 2 ) can be tested using statistical packages that permit linear constraints on regression coefficients (e.g., SAS, SPSS, SYSTAT). In SYSTAT, q constraints on k regression coefficients can be written as: AB = D where A is a q x k matrix of weights, B is a k x 1 column vector of regression coefficients, and D is a q x 1 vector of zeros. The values of A are chosen to express constraints as weighted linear combinations of regression coefficients set equal to zero. 97

98 Testing Constraints Imposed by an Algebraic Difference

Recall that, in a linear regression equation, the constraint imposed by an algebraic difference is b1 = -b2, or b1 + b2 = 0. The corresponding A and B matrices are:

A = [ 0  1  1 ]      B = [ b0  b1  b2 ]′

so that AB = [ b1 + b2 ] = [ 0 ]
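The product AB can be checked directly with NumPy; the coefficient values in B below are made up for illustration:

```python
import numpy as np

# A encodes the constraint b1 + b2 = 0 as a weighted combination of
# the coefficients B = [b0, b1, b2]' set equal to zero.
A = np.array([[0.0, 1.0, 1.0]])
B = np.array([1.50, 0.40, -0.40])  # hypothetical b0, b1, b2

AB = A @ B  # equals [b1 + b2]; zero when the constraint holds
```

With b1 = 0.40 and b2 = -0.40, the constraint is satisfied and AB is the zero vector.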

99 Example: Testing the Algebraic Difference Constraints for Autonomy

The following SYSTAT commands compute an algebraic difference, estimate constrained and unconstrained equations, and test the constraint:

MGLH
MOD SAT=CONSTANT+AUTALD
EST
MOD SAT=CONSTANT+AUTCA+AUTCD
EST
HYP
AMA [0 1 1]
TEST

100 Example: Testing the Algebraic Difference Constraints for Autonomy

Results from the constrained equation:

Dep Var: SAT   N: 360   Multiple R:   Squared multiple R:
Adjusted squared multiple R:   Standard error of estimate:

Effect      Coefficient   Std Error   Std Coef   Tolerance   t   P(2 Tail)
CONSTANT
AUTALD

Analysis of Variance
Source       Sum-of-Squares   df   Mean-Square   F-ratio   P
Regression
Residual

101 Example: Testing the Algebraic Difference Constraints for Autonomy

Results from the unconstrained equation:

Dep Var: SAT   N: 360   Multiple R:   Squared multiple R:
Adjusted squared multiple R:   Standard error of estimate:

Effect      Coefficient   Std Error   Std Coef   Tolerance   t   P(2 Tail)
CONSTANT
AUTCA
AUTCD

Analysis of Variance
Source       Sum-of-Squares   df   Mean-Square   F-ratio   P
Regression
Residual

102 Example: Testing the Algebraic Difference Constraints for Autonomy

Test of constraints:

Hypothesis
A Matrix

Test of Hypothesis
Source       SS   df   MS   F   P
Hypothesis
Error

103 Testing Constraints Imposed by an Absolute Difference

In a piecewise linear regression equation, the constraints imposed by an absolute difference are b1 + b2 = 0, b4 + b5 = 0, b3 = 0, and 2b1 + b4 = 0. The corresponding A and B matrices are:

A = [ 0  1  1  0  0  0 ]      B = [ b0  b1  b2  b3  b4  b5 ]′
    [ 0  0  0  0  1  1 ]
    [ 0  0  0  1  0  0 ]
    [ 0  2  0  0  1  0 ]

so that AB = [ b1 + b2 ]   [ 0 ]
             [ b4 + b5 ] = [ 0 ]
             [ b3      ]   [ 0 ]
             [ 2b1 + b4]   [ 0 ]

104 Example: Testing the Absolute Difference Constraints for Autonomy

The following SYSTAT commands compute an absolute difference, estimate constrained and unconstrained equations, and test the constraints:

MOD SAT=CONSTANT+AUTABD
EST
MOD SAT=CONSTANT+AUTCA+AUTCD+AUTW+AUTCAW+AUTCDW
EST
HYP
AMA [0 1 1 0 0 0; 0 0 0 0 1 1; 0 0 0 1 0 0; 0 2 0 0 1 0]
TEST

105 Example: Testing the Absolute Difference Constraints for Autonomy

Results from the constrained equation:

Dep Var: SAT   N: 360   Multiple R:   Squared multiple R:
Adjusted squared multiple R:   Standard error of estimate:

Effect      Coefficient   Std Error   Std Coef   Tolerance   t   P(2 Tail)
CONSTANT
AUTABD

Analysis of Variance
Source       Sum-of-Squares   df   Mean-Square   F-ratio   P
Regression
Residual

106 Example: Testing the Absolute Difference Constraints for Autonomy

Results from the unconstrained equation:

Dep Var: SAT   N: 360   Multiple R:   Squared multiple R:
Adjusted squared multiple R:   Standard error of estimate:

Effect      Coefficient   Std Error   Std Coef   Tolerance   t   P(2 Tail)
CONSTANT
AUTCA
AUTCD
AUTW
AUTCAW
AUTCDW

Analysis of Variance
Source       Sum-of-Squares   df   Mean-Square   F-ratio   P
Regression
Residual

107 Example: Testing the Absolute Difference Constraints for Autonomy

Test of constraints:

Hypothesis
A Matrix

Test of Hypothesis
Source       SS   df   MS   F   P
Hypothesis
Error

108 Testing Constraints Imposed by a Squared Difference

In a quadratic regression equation, the constraints imposed by a squared difference are b1 = 0, b2 = 0, b3 - b5 = 0, and b3 + b4 + b5 = 0. The corresponding A and B matrices are:

A = [ 0  1  0  0  0  0 ]      B = [ b0  b1  b2  b3  b4  b5 ]′
    [ 0  0  1  0  0  0 ]
    [ 0  0  0  1  0 -1 ]
    [ 0  0  0  1  1  1 ]

so that AB = [ b1           ]   [ 0 ]
             [ b2           ] = [ 0 ]
             [ b3 - b5      ]   [ 0 ]
             [ b3 + b4 + b5 ]   [ 0 ]
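As a quick check of this A matrix, any coefficient vector with the pattern implied by a squared difference (b1 = b2 = 0, b3 = b5, b4 = -2b3) should satisfy AB = 0. A NumPy sketch, with a hypothetical curvature value c:

```python
import numpy as np

# Rows encode b1 = 0, b2 = 0, b3 - b5 = 0, and b3 + b4 + b5 = 0
# for B = [b0, b1, b2, b3, b4, b5]'.
A = np.array([
    [0, 1, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 0],
    [0, 0, 0, 1, 0, -1],
    [0, 0, 0, 1, 1, 1],
], dtype=float)

# A coefficient vector of the form [b0, 0, 0, c, -2c, c] -- the pattern
# of c*(X - Y)^2 plus an intercept -- satisfies every constraint.
c = 0.7                                  # hypothetical curvature coefficient
B = np.array([2.0, 0.0, 0.0, c, -2 * c, c])
residuals = A @ B                        # all zeros when constraints hold
```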

109 Example: Testing the Squared Difference Constraints for Autonomy

The following SYSTAT commands compute a squared difference, estimate constrained and unconstrained equations, and test the constraints:

MOD SAT=CONSTANT+AUTSQD
EST
MOD SAT=CONSTANT+AUTCA+AUTCD+AUTCA2+AUTCAD+AUTCD2
EST
HYP
AMA [0 1 0 0 0 0; 0 0 1 0 0 0; 0 0 0 1 0 -1; 0 0 0 1 1 1]
TEST

110 Example: Testing the Squared Difference Constraints for Autonomy

Results from the constrained equation:

Dep Var: SAT   N: 360   Multiple R:   Squared multiple R:
Adjusted squared multiple R:   Standard error of estimate:

Effect      Coefficient   Std Error   Std Coef   Tolerance   t   P(2 Tail)
CONSTANT
AUTSQD

Analysis of Variance
Source       Sum-of-Squares   df   Mean-Square   F-ratio   P
Regression
Residual

111 Example: Testing the Squared Difference Constraints for Autonomy

Results from the unconstrained equation:

Dep Var: SAT   N: 360   Multiple R:   Squared multiple R:
Adjusted squared multiple R:   Standard error of estimate:

Effect      Coefficient   Std Error   Std Coef   Tolerance   t   P(2 Tail)
CONSTANT
AUTCA
AUTCD
AUTCA2
AUTCAD
AUTCD2

Analysis of Variance
Source       Sum-of-Squares   df   Mean-Square   F-ratio   P
Regression
Residual

112 Example: Testing the Squared Difference Constraints for Autonomy

Test of constraints:

Hypothesis
A Matrix

Test of Hypothesis
Source       SS   df   MS   F   P
Hypothesis
Error

113 Analyzing Quadratic Regression Equations Using Response Surface Methodology

Response surface methodology can be used to analyze features of surfaces corresponding to quadratic regression equations. These analyses are useful for two reasons:

Constraints imposed by difference scores are usually rejected, which makes it necessary to interpret unconstrained equations.

Many conceptually meaningful hypotheses cannot be expressed using difference scores.

114 Key Features of Response Surfaces: Stationary Point

The stationary point is the point at which the slope of the surface relating X and Y to Z is zero in all directions.

For convex (i.e., bowl-shaped) surfaces, the stationary point is the overall minimum of the surface with respect to the Z axis.

For concave (i.e., dome-shaped) surfaces, the stationary point is the overall maximum of the surface with respect to the Z axis.

For saddle-shaped surfaces, the stationary point is where the surface is flat with respect to the Z axis.

115 Key Features of Response Surfaces: Stationary Point

The coordinates of the stationary point can be computed using the following formulas:

X0 = (b2b4 - 2b1b5) / (4b3b5 - b4²)

Y0 = (b1b4 - 2b2b3) / (4b3b5 - b4²)

X0 and Y0 are the coordinates of the stationary point in the X,Y plane.

116 Example: Stationary Point for Autonomy

Applying these formulas to the equation for autonomy yields:

X0 = [(-0.293)(0.276) - 2(0.197)(-0.035)] / [4(-0.056)(-0.035) - (0.276)²] = 0.982

Y0 = [(0.197)(0.276) - 2(-0.293)(-0.056)] / [4(-0.056)(-0.035) - (0.276)²] = -0.315
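These formulas are easy to script. A Python sketch using the quadratic coefficients for autonomy:

```python
def stationary_point(b1, b2, b3, b4, b5):
    """Coordinates (X0, Y0) where the quadratic surface
    Z = b0 + b1*X + b2*Y + b3*X^2 + b4*X*Y + b5*Y^2 is flat
    in all directions."""
    denom = 4 * b3 * b5 - b4 ** 2
    x0 = (b2 * b4 - 2 * b1 * b5) / denom
    y0 = (b1 * b4 - 2 * b2 * b3) / denom
    return x0, y0

# Coefficients from the quadratic equation for autonomy
x0, y0 = stationary_point(b1=0.197, b2=-0.293, b3=-0.056, b4=0.276, b5=-0.035)
```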

117 Example: Stationary Point for Autonomy

[Surface plot marking the stationary point]

118 Key Features of Response Surfaces: Principal Axes

The principal axes describe the orientation of the surface with respect to the X,Y plane. The axes are perpendicular and intersect at the stationary point.

For convex surfaces, the upward curvature is greatest along the first principal axis and least along the second principal axis.

For concave surfaces, the downward curvature is greatest along the second principal axis and least along the first principal axis.

For saddle-shaped surfaces, upward curvature is greatest along the first principal axis, and downward curvature is greatest along the second principal axis.

119 Key Features of Response Surfaces: First Principal Axis

An equation for the first principal axis is:

Y = p10 + p11X

The formula for the slope of the first principal axis (i.e., p11) is:

p11 = [b5 - b3 + √((b3 - b5)² + b4²)] / b4

Using X0, Y0, and p11, the intercept of the first principal axis (i.e., p10) can be calculated as follows:

p10 = Y0 - p11X0

120 Example: First Principal Axis for Autonomy

Applying these formulas to the equation for autonomy yields:

p11 = [-0.035 - (-0.056) + √((-0.056 - (-0.035))² + (0.276)²)] / 0.276 = 1.079

p10 = -0.315 - (1.079)(0.982) = -1.375

121 Example: First Principal Axis for Autonomy

[Surface plot marking the first principal axis]

122 Key Features of Response Surfaces: Second Principal Axis

An equation for the second principal axis is:

Y = p20 + p21X

The formula for the slope of the second principal axis (i.e., p21) is:

p21 = [b5 - b3 - √((b3 - b5)² + b4²)] / b4

X0, Y0, and p21 can be used to obtain the intercept of the second principal axis (i.e., p20) as follows:

p20 = Y0 - p21X0

123 Example: Second Principal Axis for Autonomy

Applying these formulas to the equation for autonomy yields:

p21 = [-0.035 - (-0.056) - √((-0.056 - (-0.035))² + (0.276)²)] / 0.276 = -0.927

p20 = -0.315 - (-0.927)(0.982) = 0.595
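Both slopes can be computed together. A Python sketch using the autonomy coefficients; note that the product of the two slopes is -1, confirming that the axes are perpendicular:

```python
import math

def principal_axis_slopes(b3, b4, b5):
    """Slopes of the first (p11) and second (p21) principal axes
    of the surface Z = b0 + b1*X + b2*Y + b3*X^2 + b4*X*Y + b5*Y^2."""
    disc = math.sqrt((b3 - b5) ** 2 + b4 ** 2)
    p11 = (b5 - b3 + disc) / b4
    p21 = (b5 - b3 - disc) / b4
    return p11, p21

# Quadratic coefficients from the autonomy equation
p11, p21 = principal_axis_slopes(b3=-0.056, b4=0.276, b5=-0.035)
```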

124 Example: Second Principal Axis for Autonomy

[Surface plot marking the second principal axis]

125 Key Features of Response Surfaces: Shape Along the Y = X Line

The shape of the surface along a line in the X,Y plane can be estimated by substituting the expression for the line into the quadratic regression equation. To estimate the slope along the Y = X line, X is substituted for Y in the quadratic regression equation, which yields:

Z = b0 + b1X + b2X + b3X² + b4X² + b5X² + e
  = b0 + (b1 + b2)X + (b3 + b4 + b5)X² + e

The term (b3 + b4 + b5) represents the curvature of the surface along the Y = X line, and (b1 + b2) is the slope of the surface at the point X = 0.

126 Example: Shape Along Y = X Line for Autonomy

For autonomy, the shape of the surface along the Y = X line is:

Z = b0 + [0.197 + (-0.293)]X + [-0.056 + 0.276 + (-0.035)]X² + e

Simplifying this expression yields:

Z = b0 - 0.096X + 0.185X² + e

The surface is curved upward along the Y = X line and is negatively sloped at the point X = 0 (the curvature is significant at p < .05).

127 Example: Shape Along Y = X Line for Autonomy

[Contour plot; contours show the shape of the surface along the Y = X line]

128 Key Features of Response Surfaces: Shape Along the Y = -X Line

To estimate the slope along the Y = -X line, -X is substituted for Y in the quadratic regression equation, which yields:

Z = b0 + b1X - b2X + b3X² - b4X² + b5X² + e
  = b0 + (b1 - b2)X + (b3 - b4 + b5)X² + e

The term (b3 - b4 + b5) represents the curvature of the surface along the Y = -X line, and (b1 - b2) is the slope of the surface at the point X = 0.

129 Example: Shape Along the Y = -X Line for Autonomy

For autonomy, the shape of the surface along the Y = -X line is:

Z = b0 + [0.197 - (-0.293)]X + [-0.056 - 0.276 + (-0.035)]X² + e

Simplifying this expression yields:

Z = b0 + 0.490X - 0.367X² + e

The surface is curved downward along the Y = -X line and is positively sloped at the point X = 0 (both are significant at p < .05).
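The slopes and curvatures along both lines follow directly from the coefficients. A Python sketch using the autonomy estimates:

```python
# Quadratic coefficients from the autonomy equation
b1, b2, b3, b4, b5 = 0.197, -0.293, -0.056, 0.276, -0.035

# Along Y = X (congruence line): slope at X = 0 is (b1 + b2),
# curvature is (b3 + b4 + b5).
slope_congruence = b1 + b2
curv_congruence = b3 + b4 + b5

# Along Y = -X (incongruence line): slope at X = 0 is (b1 - b2),
# curvature is (b3 - b4 + b5).
slope_incongruence = b1 - b2
curv_incongruence = b3 - b4 + b5
```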

130 Example: Shape Along the Y = -X Line for Autonomy

[Contour plot; contours show the shape of the surface along the Y = -X line]

131 Key Features of Response Surfaces: Shape Along First Principal Axis

To estimate the slope along the first principal axis, p10 + p11X is substituted for Y:

Z = b0 + b1X + b2(p10 + p11X) + b3X² + b4X(p10 + p11X) + b5(p10 + p11X)² + e
  = (b0 + b2p10 + b5p10²) + (b1 + b2p11 + b4p10 + 2b5p10p11)X + (b3 + b4p11 + b5p11²)X² + e

The composite terms preceding X² and X are the curvature of the surface along the first principal axis and the slope of the surface at the point X = 0.
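The expansion above is an algebraic identity, so it can be verified numerically: evaluating the full quadratic at points on the line Y = p10 + p11X must match the collapsed coefficients. A Python sketch using the autonomy coefficients, with b0 set to zero for illustration:

```python
def surface(x, y, b0, b1, b2, b3, b4, b5):
    """Quadratic response surface Z = f(X, Y)."""
    return b0 + b1*x + b2*y + b3*x**2 + b4*x*y + b5*y**2

# Autonomy coefficients; b0 = 0 is an illustrative assumption
b0, b1, b2, b3, b4, b5 = 0.0, 0.197, -0.293, -0.056, 0.276, -0.035
p10, p11 = -1.375, 1.079  # first principal axis for autonomy

# Collapsed intercept, slope, and curvature along Y = p10 + p11*X
a0 = b0 + b2*p10 + b5*p10**2
a1 = b1 + b2*p11 + b4*p10 + 2*b5*p10*p11
a2 = b3 + b4*p11 + b5*p11**2

x = 1.3  # arbitrary point along the axis
direct = surface(x, p10 + p11*x, b0, b1, b2, b3, b4, b5)
collapsed = a0 + a1*x + a2*x**2  # should equal the direct evaluation
```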


More information

Question 1a 1b 1c 1d 1e 2a 2b 2c 2d 2e 2f 3a 3b 3c 3d 3e 3f M ult: choice Points

Question 1a 1b 1c 1d 1e 2a 2b 2c 2d 2e 2f 3a 3b 3c 3d 3e 3f M ult: choice Points Economics 102: Analysis of Economic Data Cameron Spring 2016 May 12 Department of Economics, U.C.-Davis Second Midterm Exam (Version A) Compulsory. Closed book. Total of 30 points and worth 22.5% of course

More information

Regression and correlation. Correlation & Regression, I. Regression & correlation. Regression vs. correlation. Involve bivariate, paired data, X & Y

Regression and correlation. Correlation & Regression, I. Regression & correlation. Regression vs. correlation. Involve bivariate, paired data, X & Y Regression and correlation Correlation & Regression, I 9.07 4/1/004 Involve bivariate, paired data, X & Y Height & weight measured for the same individual IQ & exam scores for each individual Height of

More information

Experimental Design and Data Analysis for Biologists

Experimental Design and Data Analysis for Biologists Experimental Design and Data Analysis for Biologists Gerry P. Quinn Monash University Michael J. Keough University of Melbourne CAMBRIDGE UNIVERSITY PRESS Contents Preface page xv I I Introduction 1 1.1

More information

Algebra 2 Summer Work Packet Review and Study Guide

Algebra 2 Summer Work Packet Review and Study Guide Algebra Summer Work Packet Review and Study Guide This study guide is designed to accompany the Algebra Summer Work Packet. Its purpose is to offer a review of the nine specific concepts covered in the

More information

Linear Regression. In this problem sheet, we consider the problem of linear regression with p predictors and one intercept,

Linear Regression. In this problem sheet, we consider the problem of linear regression with p predictors and one intercept, Linear Regression In this problem sheet, we consider the problem of linear regression with p predictors and one intercept, y = Xβ + ɛ, where y t = (y 1,..., y n ) is the column vector of target values,

More information

Three Factor Completely Randomized Design with One Continuous Factor: Using SPSS GLM UNIVARIATE R. C. Gardner Department of Psychology

Three Factor Completely Randomized Design with One Continuous Factor: Using SPSS GLM UNIVARIATE R. C. Gardner Department of Psychology Data_Analysis.calm Three Factor Completely Randomized Design with One Continuous Factor: Using SPSS GLM UNIVARIATE R. C. Gardner Department of Psychology This article considers a three factor completely

More information

Algebra II Vocabulary Word Wall Cards

Algebra II Vocabulary Word Wall Cards Algebra II Vocabulary Word Wall Cards Mathematics vocabulary word wall cards provide a display of mathematics content words and associated visual cues to assist in vocabulary development. The cards should

More information

ST430 Exam 1 with Answers

ST430 Exam 1 with Answers ST430 Exam 1 with Answers Date: October 5, 2015 Name: Guideline: You may use one-page (front and back of a standard A4 paper) of notes. No laptop or textook are permitted but you may use a calculator.

More information

Ch. 16: Correlation and Regression

Ch. 16: Correlation and Regression Ch. 1: Correlation and Regression With the shift to correlational analyses, we change the very nature of the question we are asking of our data. Heretofore, we were asking if a difference was likely to

More information

Multiple Linear Regression

Multiple Linear Regression Multiple Linear Regression Simple linear regression tries to fit a simple line between two variables Y and X. If X is linearly related to Y this explains some of the variability in Y. In most cases, there

More information

Spring 2014 Algebra 1 End-of-Course (EOC) Assessment Next Generation Sunshine State Standards (NGSSS) Form 1

Spring 2014 Algebra 1 End-of-Course (EOC) Assessment Next Generation Sunshine State Standards (NGSSS) Form 1 cautions should be considered when using Reports? on page 5 of this report. Spring 01 Algebra 1 End-of-Course (EOC) Assessment Form 1 MA.91.A.. Function notation; Identifying functions MA.91.A.. Domain/range

More information

Unit 12: Response Surface Methodology and Optimality Criteria

Unit 12: Response Surface Methodology and Optimality Criteria Unit 12: Response Surface Methodology and Optimality Criteria STA 643: Advanced Experimental Design Derek S. Young 1 Learning Objectives Revisit your knowledge of polynomial regression Know how to use

More information

Contents. Acknowledgments. xix

Contents. Acknowledgments. xix Table of Preface Acknowledgments page xv xix 1 Introduction 1 The Role of the Computer in Data Analysis 1 Statistics: Descriptive and Inferential 2 Variables and Constants 3 The Measurement of Variables

More information

Econometrics. 4) Statistical inference

Econometrics. 4) Statistical inference 30C00200 Econometrics 4) Statistical inference Timo Kuosmanen Professor, Ph.D. http://nomepre.net/index.php/timokuosmanen Today s topics Confidence intervals of parameter estimates Student s t-distribution

More information

ECON2228 Notes 2. Christopher F Baum. Boston College Economics. cfb (BC Econ) ECON2228 Notes / 47

ECON2228 Notes 2. Christopher F Baum. Boston College Economics. cfb (BC Econ) ECON2228 Notes / 47 ECON2228 Notes 2 Christopher F Baum Boston College Economics 2014 2015 cfb (BC Econ) ECON2228 Notes 2 2014 2015 1 / 47 Chapter 2: The simple regression model Most of this course will be concerned with

More information

One-sided and two-sided t-test

One-sided and two-sided t-test One-sided and two-sided t-test Given a mean cancer rate in Montreal, 1. What is the probability of finding a deviation of > 1 stdev from the mean? 2. What is the probability of finding 1 stdev more cases?

More information

Math 423/533: The Main Theoretical Topics

Math 423/533: The Main Theoretical Topics Math 423/533: The Main Theoretical Topics Notation sample size n, data index i number of predictors, p (p = 2 for simple linear regression) y i : response for individual i x i = (x i1,..., x ip ) (1 p)

More information

: The model hypothesizes a relationship between the variables. The simplest probabilistic model: or.

: The model hypothesizes a relationship between the variables. The simplest probabilistic model: or. Chapter Simple Linear Regression : comparing means across groups : presenting relationships among numeric variables. Probabilistic Model : The model hypothesizes an relationship between the variables.

More information

Lectures on Simple Linear Regression Stat 431, Summer 2012

Lectures on Simple Linear Regression Stat 431, Summer 2012 Lectures on Simple Linear Regression Stat 43, Summer 0 Hyunseung Kang July 6-8, 0 Last Updated: July 8, 0 :59PM Introduction Previously, we have been investigating various properties of the population

More information

Multilevel Models in Matrix Form. Lecture 7 July 27, 2011 Advanced Multivariate Statistical Methods ICPSR Summer Session #2

Multilevel Models in Matrix Form. Lecture 7 July 27, 2011 Advanced Multivariate Statistical Methods ICPSR Summer Session #2 Multilevel Models in Matrix Form Lecture 7 July 27, 2011 Advanced Multivariate Statistical Methods ICPSR Summer Session #2 Today s Lecture Linear models from a matrix perspective An example of how to do

More information

Independent Samples ANOVA

Independent Samples ANOVA Independent Samples ANOVA In this example students were randomly assigned to one of three mnemonics (techniques for improving memory) rehearsal (the control group; simply repeat the words), visual imagery

More information

using the beginning of all regression models

using the beginning of all regression models Estimating using the beginning of all regression models 3 examples Note about shorthand Cavendish's 29 measurements of the earth's density Heights (inches) of 14 11 year-old males from Alberta study Half-life

More information

ON THE USE OF POLYNOMIAL REGRESSION EQUATIONS AS AN ALTERNATIVE TO DIFFERENCE SCORES IN ORGANIZATIONAL RESEARCH

ON THE USE OF POLYNOMIAL REGRESSION EQUATIONS AS AN ALTERNATIVE TO DIFFERENCE SCORES IN ORGANIZATIONAL RESEARCH o Academy of Management Journal 1993, Vol. 36, No. 6, 1577-1613. ON THE USE OF POLYNOMIAL REGRESSION EQUATIONS AS AN ALTERNATIVE TO DIFFERENCE SCORES IN ORGANIZATIONAL RESEARCH JEFFREY R. EDWARDS University

More information

Regression Analysis: Exploring relationships between variables. Stat 251

Regression Analysis: Exploring relationships between variables. Stat 251 Regression Analysis: Exploring relationships between variables Stat 251 Introduction Objective of regression analysis is to explore the relationship between two (or more) variables so that information

More information

Correlation and Regression

Correlation and Regression Correlation and Regression October 25, 2017 STAT 151 Class 9 Slide 1 Outline of Topics 1 Associations 2 Scatter plot 3 Correlation 4 Regression 5 Testing and estimation 6 Goodness-of-fit STAT 151 Class

More information

Assumptions, Diagnostics, and Inferences for the Simple Linear Regression Model with Normal Residuals

Assumptions, Diagnostics, and Inferences for the Simple Linear Regression Model with Normal Residuals Assumptions, Diagnostics, and Inferences for the Simple Linear Regression Model with Normal Residuals 4 December 2018 1 The Simple Linear Regression Model with Normal Residuals In previous class sessions,

More information

DESIGNING EXPERIMENTS AND ANALYZING DATA A Model Comparison Perspective

DESIGNING EXPERIMENTS AND ANALYZING DATA A Model Comparison Perspective DESIGNING EXPERIMENTS AND ANALYZING DATA A Model Comparison Perspective Second Edition Scott E. Maxwell Uniuersity of Notre Dame Harold D. Delaney Uniuersity of New Mexico J,t{,.?; LAWRENCE ERLBAUM ASSOCIATES,

More information

Notebook Tab 6 Pages 183 to ConteSolutions

Notebook Tab 6 Pages 183 to ConteSolutions Notebook Tab 6 Pages 183 to 196 When the assumed relationship best fits a straight line model (r (Pearson s correlation coefficient) is close to 1 ), this approach is known as Linear Regression Analysis.

More information

Structural Equation Modeling and Confirmatory Factor Analysis. Types of Variables

Structural Equation Modeling and Confirmatory Factor Analysis. Types of Variables /4/04 Structural Equation Modeling and Confirmatory Factor Analysis Advanced Statistics for Researchers Session 3 Dr. Chris Rakes Website: http://csrakes.yolasite.com Email: Rakes@umbc.edu Twitter: @RakesChris

More information

Chapter 10. Regression. Understandable Statistics Ninth Edition By Brase and Brase Prepared by Yixun Shi Bloomsburg University of Pennsylvania

Chapter 10. Regression. Understandable Statistics Ninth Edition By Brase and Brase Prepared by Yixun Shi Bloomsburg University of Pennsylvania Chapter 10 Regression Understandable Statistics Ninth Edition By Brase and Brase Prepared by Yixun Shi Bloomsburg University of Pennsylvania Scatter Diagrams A graph in which pairs of points, (x, y), are

More information

Summer Packet for Students Taking Introduction to Calculus in the Fall

Summer Packet for Students Taking Introduction to Calculus in the Fall Summer Packet for Students Taking Introduction to Calculus in the Fall Algebra 2 Topics Needed for Introduction to Calculus Need to know: à Solve Equations Linear Quadratic Absolute Value Polynomial Rational

More information

THE ROYAL STATISTICAL SOCIETY 2008 EXAMINATIONS SOLUTIONS HIGHER CERTIFICATE (MODULAR FORMAT) MODULE 4 LINEAR MODELS

THE ROYAL STATISTICAL SOCIETY 2008 EXAMINATIONS SOLUTIONS HIGHER CERTIFICATE (MODULAR FORMAT) MODULE 4 LINEAR MODELS THE ROYAL STATISTICAL SOCIETY 008 EXAMINATIONS SOLUTIONS HIGHER CERTIFICATE (MODULAR FORMAT) MODULE 4 LINEAR MODELS The Society provides these solutions to assist candidates preparing for the examinations

More information

Key Algebraic Results in Linear Regression

Key Algebraic Results in Linear Regression Key Algebraic Results in Linear Regression James H. Steiger Department of Psychology and Human Development Vanderbilt University James H. Steiger (Vanderbilt University) 1 / 30 Key Algebraic Results in

More information

sociology 362 regression

sociology 362 regression sociology 36 regression Regression is a means of studying how the conditional distribution of a response variable (say, Y) varies for different values of one or more independent explanatory variables (say,

More information

LINEAR REGRESSION ANALYSIS. MODULE XVI Lecture Exercises

LINEAR REGRESSION ANALYSIS. MODULE XVI Lecture Exercises LINEAR REGRESSION ANALYSIS MODULE XVI Lecture - 44 Exercises Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur Exercise 1 The following data has been obtained on

More information