Estimation in SEM: A Concrete Example


Journal of Educational and Behavioral Statistics
March 2007, Vol. 32, No. 1, pp. 110-119
© AERA and ASA

John M. Ferron
Melinda R. Hess
University of South Florida

A concrete example is used to illustrate maximum likelihood estimation of a structural equation model with two unknown parameters. The fitting function is found for the example, as are the vector of first-order partial derivatives, the matrix of second-order partial derivatives, and the estimates obtained from each iteration of the Newton-Raphson algorithm. The goal is to provide a concrete illustration to help those learning structural equation modeling bridge the gap between the verbal descriptions of estimation procedures and the mathematical definitions of these procedures provided in the technical literature.

Keywords: structural equation modeling; teaching statistics; maximum likelihood; Newton-Raphson; parameter estimation

Concrete examples are often used in the teaching and learning of mathematics and are advocated at virtually all levels of mathematics education, from K-12 instruction, as evidenced by the standards of the National Council of Teachers of Mathematics (1989), to the acquisition of higher-level mathematics skills. One can easily extend the reasoning behind the need to provide students with concrete examples and representations beyond the purely mathematical realm into the world of statistics. For basic statistical concepts, like the standard deviation, worked-out examples using a definitional formula are provided in most introductory texts, and additional examples can be readily generated. For more advanced analyses, like multiple regression, canonical correlation, and multivariate analysis of variance, worked-out examples can be found in Pedhazur (1997), Tatsuoka and Lohnes (1988), and Stevens (2002), respectively.
As statistical analyses get more complicated, the availability of concrete examples diminishes. In the area of structural equation modeling, it is relatively difficult to find or generate concrete examples illustrating parameter estimation algorithms. Such examples, however, are valuable in helping learners bridge the gap between the verbal descriptions of estimation procedures and the mathematical definitions of these procedures provided in the technical literature. We have found the example illustrating maximum likelihood estimation of a single-parameter model given by Bollen (1989) to be useful to learners. Our purpose is to provide a more complex example for learners who benefit from concrete illustrations. We have chosen a two-parameter model for the example. Having multiple parameters enables us to maintain vectors and matrices in the computations, whereas limiting our model to two parameters keeps the computations relatively simple. Maximum likelihood estimation is illustrated because it is the method most commonly used to estimate structural equation models and the first method learners typically try to understand.

Context for the Example

In this example, we consider a single latent exogenous variable, ξ, being used to predict a single latent endogenous variable, η, both of which have single indicator variables (X and Y, respectively). The model is shown in a path diagram in Figure 1, where ovals indicate the latent variables of interest, rectangles indicate observed variables used as indicators of the latent variables, circles represent errors, single-headed arrows represent effects, and double-headed arrows originating and ending on the same variable indicate variances.

FIGURE 1. Path diagram for computational example.

For illustrative purposes, we have assumed known values for five of the model parameters and unknown values for the other two. The measurement error (δ) of indicator variable X is assumed to have a variance of 2, and the measurement error (ε) of indicator variable Y is assumed to have a variance of 4. Other parameters that we have assumed values for include the variance of latent variable ξ, set to 8, and the coefficients from the latent variables ξ and η to their single indicator variables, X and Y respectively, both set to 1. The two parameters that we are estimating in this example are the coefficient from latent variable ξ to latent variable η, represented by gamma (γ), and the variance of the disturbance of the endogenous latent variable η, represented by psi (ψ).
This model can be mathematically represented by a series of matrices. Using the eight-matrix notation that over the years has been made familiar by the LISREL (LInear Structural RELationships) software package (Jöreskog & Sörbom, 2004), the model parameters can be represented by the following series of matrices:

Λx = [1]   Λy = [1]   Θδ = [2]   Θε = [4]   Φ = [8]   Ψ = [ψ]   B = [0]   Γ = [γ]

where Λx contains the coefficient from ξ to its indicator variable X, Λy contains the coefficient from η to its indicator variable Y, Θδ contains the variance of the measurement error of X, Θε contains the variance of the measurement error of Y, Φ contains the variance of ξ, B is zero because this model has no effects between latent endogenous variables, Ψ contains the variance of the disturbance of η, and Γ contains the coefficient from ξ to η.

Implied Covariance Matrix

If a structural equation model is properly specified, and if we know all the parameter values, we can compute the model's implication for the covariance between all pairs of variables. Put another way, the model and parameter values, θ, imply a covariance matrix, Σ(θ). The general form of the implied covariance matrix for structural equation modeling has been discussed (Hayduk, 1987) and derived (Bollen, 1989) elsewhere, and is given by

Σ(θ) = | Λy(I − B)⁻¹(ΓΦΓ′ + Ψ)(I − B)⁻¹′Λy′ + Θε    Λy(I − B)⁻¹ΓΦΛx′ |
       | ΛxΦΓ′(I − B)⁻¹′Λy′                         ΛxΦΛx′ + Θδ       |    (1)

where I is the identity matrix. By substituting the parameters from our example, this general equation can be simplified, yielding

Σ(θ) = | 8γ² + ψ + 4    8γ |
       | 8γ             10 |    (2)

Observed Covariance Matrix

Assume the observed covariance matrix is

S = | 20    5 |
    | 5    10 |    (3)

where 20 is the observed variance of Y, 10 is the observed variance of X, and 5 is the observed covariance between Y and X.
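The substitution that takes Equation 1 to Equation 2 is easy to verify numerically. Because every matrix in this example is 1 × 1, the blocks of Equation 1 reduce to scalar arithmetic; the following minimal Python sketch (the function name implied_cov is purely illustrative) codes each block with the example's known parameter values:

```python
def implied_cov(gamma, psi):
    """Blocks of Equation 1 for the example's known parameter values.

    Every matrix in this model is 1 x 1, so scalars stand in for matrices.
    """
    lam_x = lam_y = 1.0      # loadings of X on xi and of Y on eta
    theta_delta = 2.0        # variance of the measurement error of X
    theta_eps = 4.0          # variance of the measurement error of Y
    phi = 8.0                # variance of xi
    beta = 0.0               # no effects among endogenous latents
    a = 1.0 / (1.0 - beta)   # (I - B)^-1, a scalar here
    s_yy = lam_y * a * (gamma * phi * gamma + psi) * a * lam_y + theta_eps
    s_yx = lam_y * a * gamma * phi * lam_x
    s_xx = lam_x * phi * lam_x + theta_delta
    return [[s_yy, s_yx], [s_yx, s_xx]]

# For any gamma and psi this reproduces Equation 2:
# [[8*gamma**2 + psi + 4, 8*gamma], [8*gamma, 10]]
```

Evaluating implied_cov(0.625, 12.875) returns [[20.0, 5.0], [5.0, 10.0]], the observed matrix of Equation 3, a fact that will matter at the end of the estimation.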

Maximum Likelihood Fitting Function

When we estimate parameter values for a structural equation model, we are trying to pick values for the parameters that make the implied covariance matrix as close as possible to the observed covariance matrix; in other words, we try to minimize a discrepancy function between the model-implied covariance matrix and the observed covariance matrix. There are a variety of ways to define the discrepancy function, and thus alternative ways to estimate the parameters. The most common estimation method, maximum likelihood, is based on a normality assumption that allows the likelihood of the observed covariance matrix to be considered for a given set of parameter values. Different sets of values are considered, and then the set that maximizes the likelihood of the observed covariance matrix is selected. It can be shown that the values that maximize the likelihood of the data are the values that minimize the maximum likelihood fitting function,

F_ML = log|Σ(θ)| + tr(SΣ⁻¹(θ)) − log|S| − (p + q),    (4)

where p + q is the number of observed variables, S is the observed covariance matrix, and Σ(θ) is the implied covariance matrix. Bollen (1989) provides a detailed derivation of the likelihood function and shows its relationship to the fitting function. The fitting function can be simplified for our example by substituting our implied and observed covariance matrices,

F_ML = log| 8γ² + ψ + 4   8γ | + tr( | 20   5 | | 8γ² + ψ + 4   8γ |⁻¹ ) − log(175) − (p + q).    (5)
          | 8γ            10 |       | 5   10 | | 8γ            10 |

This equation can be simplified to Equation 6 and further simplified to Equation 7 through matrix algebra (see Hayduk, 1987, or Bollen, 1989, for details about inverses, determinants, traces, and multiplication of matrices in the context of structural equation modeling). The minimization process is unaffected by the constants in the function [log(175) and 2], so these could be dropped from the equation.

F_ML = log(80γ² + 10ψ + 40 − 64γ²) + (200 − 40γ − 40γ + 80γ² + 10ψ + 40) / (80γ² + 10ψ + 40 − 64γ²) − log(175) − 2    (6)
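One way to make the fitting function concrete is to evaluate Equation 4 twice, once directly from the 2 × 2 matrix algebra and once from the fully simplified scalar form (Equation 7), and confirm that the two agree. A minimal Python sketch of that check (function names are illustrative):

```python
import math

S = [[20.0, 5.0], [5.0, 10.0]]          # observed covariance matrix (Eq. 3)

def fml_matrix(gamma, psi):
    """Equation 4 evaluated directly from 2 x 2 matrix algebra."""
    s00, s01, s11 = 8*gamma**2 + psi + 4, 8*gamma, 10.0  # implied matrix (Eq. 2)
    det = s00*s11 - s01*s01                              # |Sigma(theta)|
    inv = [[s11/det, -s01/det], [-s01/det, s00/det]]     # Sigma(theta)^-1
    tr = (S[0][0]*inv[0][0] + S[0][1]*inv[1][0] +
          S[1][0]*inv[0][1] + S[1][1]*inv[1][1])         # tr(S Sigma^-1)
    det_s = S[0][0]*S[1][1] - S[0][1]*S[1][0]            # |S| = 175
    return math.log(det) + tr - math.log(det_s) - 2      # p + q = 2

def fml_simplified(gamma, psi):
    """Equation 7, the fully simplified scalar form."""
    d = 16*gamma**2 + 10*psi + 40
    return math.log(d) + (80*gamma**2 - 80*gamma + 10*psi + 240)/d \
        - math.log(175) - 2
```

The two forms agree to rounding error at any trial values of γ and ψ, and both return zero at γ = .625, ψ = 12.875, where the implied and observed matrices coincide.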

F_ML = log(16γ² + 10ψ + 40) + (80γ² − 80γ + 10ψ + 240) / (16γ² + 10ψ + 40) − log(175) − 2    (7)

Minimizing the Fitting Function

Numerical methods are used to minimize the fitting function. The algorithm begins with an initial guess for each of the parameters to be estimated. These guesses become the start values, which are then adjusted by considering how the fitting function is changing at these particular values. The adjusted values are then considered and also adjusted. The iterative process of making adjustments continues until it appears that the minimum of the fitting function has been reached. There are a variety of algorithms for minimizing the fitting function (e.g., Newton-Raphson, Fletcher-Powell, Gauss-Newton, Levenberg-Marquardt). The algorithms differ from each other in the details of how the adjustments, or steps, in the process are defined. We will illustrate the Newton-Raphson algorithm because it is relatively straightforward, but complex enough to facilitate thinking about the first- and second-order partial derivatives of the fitting function. A step in the Newton-Raphson algorithm is generally defined by

θ⁽ⁱ⁺¹⁾ = θ⁽ⁱ⁾ − [∂²F_ML / ∂θ ∂θ′]⁻¹ [∂F_ML / ∂θ],    (8)

where the vector of parameter estimates at iteration i + 1, θ⁽ⁱ⁺¹⁾, involves an adjustment of the parameter estimates from the previous iteration, θ⁽ⁱ⁾. The size of the adjustment depends on the gradient, ∂F_ML/∂θ, and the Hessian matrix, ∂²F_ML/∂θ∂θ′. The gradient contains the partial slopes of the surface of the fitting function. As we get closer to the minimum, these slopes approach zero. The Hessian matrix contains information about the rate at which the partial slopes are changing. Using the gradient and inverse Hessian to adjust the estimates allows relatively large adjustments to be made when we are far from the minimum, and finer and finer adjustments to be made as we get closer to the minimum.

To illustrate the use of the Newton-Raphson algorithm in this example, we must first find the first-order and second-order partial derivatives of the fitting function. In the next section, we will find the first-order partial derivatives of the fitting function for our example, and in the following section, we will find the second-order partial derivatives. Once the partial derivatives have been found, we will return to the Newton-Raphson algorithm and illustrate its application to our example.

Vector of First-Order Partial Derivatives of F_ML

An element of the vector of first-order partial derivatives is found by taking the derivative of the fitting function with respect to a particular parameter. For our example, the gradient, or vector of first-order partial derivatives, is

∂F_ML/∂θ = | ∂F_ML/∂γ |
           | ∂F_ML/∂ψ |    (9)

The process of finding the first-order partial derivative of the fitting function with respect to γ for our example is illustrated in Equations 10 through 12. Equation 10 shows that we intend to take the derivative of the fitting function. Equation 11 shows the result of taking the derivative with respect to γ, and Equation 12 shows the result after simplifying the expression.

∂F_ML/∂γ = ∂/∂γ [log(16γ² + 10ψ + 40) + (80γ² − 80γ + 10ψ + 240) / (16γ² + 10ψ + 40) − log(175) − 2]    (10)

∂F_ML/∂γ = 32γ / (16γ² + 10ψ + 40) + [(16γ² + 10ψ + 40)(160γ − 80) − (80γ² − 80γ + 10ψ + 240)(32γ)] / (16γ² + 10ψ + 40)²    (11)

∂F_ML/∂γ = (512γ³ + 1280γ² + 1600γψ − 800ψ − 3200) / (16γ² + 10ψ + 40)²    (12)

We then find the first-order partial derivative of the fitting function with respect to ψ for our example, which is illustrated in Equations 13, 14, and 15.

∂F_ML/∂ψ = ∂/∂ψ [log(16γ² + 10ψ + 40) + (80γ² − 80γ + 10ψ + 240) / (16γ² + 10ψ + 40) − log(175) − 2]    (13)

∂F_ML/∂ψ = 10 / (16γ² + 10ψ + 40) + [(16γ² + 10ψ + 40)(10) − (80γ² − 80γ + 10ψ + 240)(10)] / (16γ² + 10ψ + 40)²    (14)

∂F_ML/∂ψ = (−480γ² + 800γ + 100ψ − 1600) / (16γ² + 10ψ + 40)²    (15)

Matrix of Second-Order Partial Derivatives of F_ML

The matrix of second-order partial derivatives of F_ML, the Hessian matrix, for our example is given by

∂²F_ML/∂θ∂θ′ = | ∂²F_ML/∂γ²    ∂²F_ML/∂γ∂ψ |
               | ∂²F_ML/∂ψ∂γ   ∂²F_ML/∂ψ²  |    (16)

Note that each element of the Hessian matrix is found by taking the derivative of one of the first-order partial derivatives with respect to one of the parameters. The first element of the first row of this matrix is obtained by taking the derivative of ∂F_ML/∂γ with respect to γ. This process is illustrated starting with Equation 17, which indicates that we will be taking the derivative with respect to γ of the result from Equation 12. The result of taking this partial derivative is shown in Equation 18 and then simplified to obtain Equation 19.

∂²F_ML/∂γ² = ∂/∂γ [(512γ³ + 1280γ² + 1600γψ − 800ψ − 3200) / (16γ² + 10ψ + 40)²]    (17)

∂²F_ML/∂γ² = [(16γ² + 10ψ + 40)²(1536γ² + 2560γ + 1600ψ) − (512γ³ + 1280γ² + 1600γψ − 800ψ − 3200)(2)(16γ² + 10ψ + 40)(32γ)] / (16γ² + 10ψ + 40)⁴    (18)

∂²F_ML/∂γ² = (−8192γ⁴ − 40960γ³ − 61440γ²ψ + 61440γ² + 76800γψ + 16000ψ² + 307200γ + 64000ψ) / (16γ² + 10ψ + 40)³    (19)

The second element of the second row is obtained by taking the derivative of ∂F_ML/∂ψ with respect to ψ.
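The simplified derivatives in Equations 12 and 15 are easy to check against a numerical differentiation of Equation 7. A short Python sketch of that comparison (function names are illustrative), using central finite differences:

```python
import math

def fml(gamma, psi):
    """The simplified fitting function (Equation 7)."""
    d = 16*gamma**2 + 10*psi + 40
    return math.log(d) + (80*gamma**2 - 80*gamma + 10*psi + 240)/d \
        - math.log(175) - 2

def gradient(gamma, psi):
    """Analytic first-order partials (Equations 12 and 15)."""
    d2 = (16*gamma**2 + 10*psi + 40)**2
    dg = (512*gamma**3 + 1280*gamma**2 + 1600*gamma*psi
          - 800*psi - 3200) / d2
    dp = (-480*gamma**2 + 800*gamma + 100*psi - 1600) / d2
    return dg, dp

def numeric_gradient(gamma, psi, h=1e-6):
    """Central finite differences of Equation 7, for comparison."""
    dg = (fml(gamma + h, psi) - fml(gamma - h, psi)) / (2*h)
    dp = (fml(gamma, psi + h) - fml(gamma, psi - h)) / (2*h)
    return dg, dp
```

At the start values γ = .5, ψ = 5 the analytic and numeric versions agree to several decimal places, and at γ = .625, ψ = 12.875 the analytic gradient is exactly zero, as it should be at the minimum.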

This process is shown starting with Equation 20, which relies on the result from Equation 15, then proceeds by finding the partial derivative (Equation 21), and concludes by simplifying (Equation 22).

∂²F_ML/∂ψ² = ∂/∂ψ [(−480γ² + 800γ + 100ψ − 1600) / (16γ² + 10ψ + 40)²]    (20)

∂²F_ML/∂ψ² = [(16γ² + 10ψ + 40)²(100) − (−480γ² + 800γ + 100ψ − 1600)(2)(16γ² + 10ψ + 40)(10)] / (16γ² + 10ψ + 40)⁴    (21)

∂²F_ML/∂ψ² = (11200γ² − 16000γ − 1000ψ + 36000) / (16γ² + 10ψ + 40)³    (22)

The off-diagonal elements can be found by either taking the derivative of ∂F_ML/∂ψ with respect to γ or the derivative of ∂F_ML/∂γ with respect to ψ. The process of taking the derivative of ∂F_ML/∂ψ with respect to γ is shown in Equations 23 through 25.

∂²F_ML/∂γ∂ψ = ∂/∂γ [(−480γ² + 800γ + 100ψ − 1600) / (16γ² + 10ψ + 40)²]    (23)

∂²F_ML/∂γ∂ψ = [(16γ² + 10ψ + 40)²(−960γ + 800) − (−480γ² + 800γ + 100ψ − 1600)(2)(16γ² + 10ψ + 40)(32γ)] / (16γ² + 10ψ + 40)⁴    (24)

∂²F_ML/∂γ∂ψ = (15360γ³ − 38400γ² − 16000γψ + 64000γ + 8000ψ + 32000) / (16γ² + 10ψ + 40)³    (25)

Results of Each Iteration

Now that we have solved for the first-order and second-order partial derivatives, we can illustrate the use of the Newton-Raphson algorithm for our example, where each step is defined by

| γ⁽ⁱ⁺¹⁾ |   | γ⁽ⁱ⁾ |   | ∂²F_ML/∂γ²    ∂²F_ML/∂γ∂ψ |⁻¹ | ∂F_ML/∂γ |
| ψ⁽ⁱ⁺¹⁾ | = | ψ⁽ⁱ⁾ | − | ∂²F_ML/∂ψ∂γ   ∂²F_ML/∂ψ²  |    | ∂F_ML/∂ψ |    (26)
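As with the gradient, the simplified Hessian elements in Equations 19, 22, and 25 can be checked by differencing the analytic gradient numerically. A Python sketch of that check (function names are illustrative):

```python
def gradient(gamma, psi):
    """Analytic first-order partials (Equations 12 and 15)."""
    d2 = (16*gamma**2 + 10*psi + 40)**2
    dg = (512*gamma**3 + 1280*gamma**2 + 1600*gamma*psi
          - 800*psi - 3200) / d2
    dp = (-480*gamma**2 + 800*gamma + 100*psi - 1600) / d2
    return dg, dp

def hessian(gamma, psi):
    """Analytic second-order partials (Equations 19, 22, and 25)."""
    d3 = (16*gamma**2 + 10*psi + 40)**3
    hgg = (-8192*gamma**4 - 40960*gamma**3 - 61440*gamma**2*psi
           + 61440*gamma**2 + 76800*gamma*psi + 16000*psi**2
           + 307200*gamma + 64000*psi) / d3
    hpp = (11200*gamma**2 - 16000*gamma - 1000*psi + 36000) / d3
    hgp = (15360*gamma**3 - 38400*gamma**2 - 16000*gamma*psi
           + 64000*gamma + 8000*psi + 32000) / d3
    return [[hgg, hgp], [hgp, hpp]]

def numeric_hessian(gamma, psi, h=1e-6):
    """Central differences of the analytic gradient, for comparison."""
    hgg = (gradient(gamma + h, psi)[0] - gradient(gamma - h, psi)[0]) / (2*h)
    hpp = (gradient(gamma, psi + h)[1] - gradient(gamma, psi - h)[1]) / (2*h)
    hgp = (gradient(gamma + h, psi)[1] - gradient(gamma - h, psi)[1]) / (2*h)
    return [[hgg, hgp], [hgp, hpp]]
```

The symmetry of the off-diagonal elements also provides a quick sanity check: hessian returns the same value whether the cross-derivative is taken in the γψ or ψγ order.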

TABLE 1
Summary of the Results of Each Iteration of the Maximum Likelihood Estimation

To start the estimation process, we need some initial estimates for γ and ψ. There are alternative ways of doing this, ranging from subjective estimates based on the researcher's understanding of the model to noniterative estimation approaches like the instrumental variables method (e.g., Bollen, 1989). For simplicity, we chose to provide subjective estimates for the start values, γ⁽¹⁾ = .5 and ψ⁽¹⁾ = 5. By substituting these values into the gradient and Hessian elements (Equations 12, 15, 19, 22, and 25) and solving, Equation 26 yields the second-iteration estimates, γ⁽²⁾ and ψ⁽²⁾. These estimates are in turn substituted into the same expressions, and Equation 26 then yields the third-iteration estimates, γ⁽³⁾ and ψ⁽³⁾. To conserve space, the results of the iterations are summarized in Table 1. In this table, we list the estimates of γ and ψ, the elements of the gradient, and the value of the maximum likelihood fitting function for each iteration. Note that the changes in γ and ψ are relatively large during the first few iterations and get smaller as the iterations continue. Also notice that the elements of the gradient and the maximum likelihood fitting function get smaller as the iterations increase. For this example, the elements of the gradient become zero on the seventh iteration. In addition, the value for the fitting function becomes zero, which implies
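The iteration history summarized in Table 1 can be reproduced by coding Equation 26 directly. The sketch below (an illustrative implementation, with our own function name) runs Newton-Raphson from the start values γ⁽¹⁾ = .5 and ψ⁽¹⁾ = 5, stopping when both gradient elements fall below a small tolerance in absolute value:

```python
def newton_raphson(gamma, psi, tol=1e-6, max_iter=50):
    """Iterate Equation 26 until both gradient elements are below tol."""
    for iteration in range(1, max_iter + 1):
        d = 16*gamma**2 + 10*psi + 40
        # Gradient (Equations 12 and 15).
        dg = (512*gamma**3 + 1280*gamma**2 + 1600*gamma*psi
              - 800*psi - 3200) / d**2
        dp = (-480*gamma**2 + 800*gamma + 100*psi - 1600) / d**2
        if abs(dg) < tol and abs(dp) < tol:
            return gamma, psi, iteration
        # Hessian (Equations 19, 22, and 25).
        hgg = (-8192*gamma**4 - 40960*gamma**3 - 61440*gamma**2*psi
               + 61440*gamma**2 + 76800*gamma*psi + 16000*psi**2
               + 307200*gamma + 64000*psi) / d**3
        hpp = (11200*gamma**2 - 16000*gamma - 1000*psi + 36000) / d**3
        hgp = (15360*gamma**3 - 38400*gamma**2 - 16000*gamma*psi
               + 64000*gamma + 8000*psi + 32000) / d**3
        # Invert the 2 x 2 Hessian and take the Newton-Raphson step.
        det = hgg*hpp - hgp*hgp
        step_g = (hpp*dg - hgp*dp) / det
        step_p = (hgg*dp - hgp*dg) / det
        gamma, psi = gamma - step_g, psi - step_p
    return gamma, psi, max_iter

gamma_hat, psi_hat, iters = newton_raphson(0.5, 5.0)
```

The run converges to γ̂ = .625 and ψ̂ = 12.875, the values at which the gradient vanishes and the fitting function is zero.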

the estimates of .625 for γ and 12.875 for ψ lead to an implied covariance matrix, Σ(θ̂), that does not differ from the sample covariance matrix, S.

Alternative Estimation Methods

For this small example, we could have arrived at estimates for γ and ψ by equating the implied covariance matrix to the sample covariance matrix,

| 8γ² + ψ + 4    8γ |   | 20    5 |
| 8γ             10 | = | 5    10 |    (29)

Separate equations can be written for each element, leading to two equations containing two unknowns,

8γ = 5    (30)

8γ² + ψ + 4 = 20.    (31)

Solving Equation 30 for γ leads to .625, and substituting for γ in Equation 31 and solving for ψ gives 12.875. In more complex situations, there are typically more equations than unknowns. Attempts to solve the equations for the unknowns will often lead to multiple estimates of the same unknown because sampling error keeps the elements of S from being identical to the elements of Σ(θ). It is these situations that motivate more complex estimation methods. Different methods (e.g., maximum likelihood, weighted least squares) are based on different assumptions (e.g., maximum likelihood assumes multivariate normality) and will often lead to different estimates, so care should be taken to choose an estimation method appropriate for the application. Finally, it should be noted that iterative methods will often not reach a fitting function value of zero. We say that we have met the convergence criterion when all elements of the gradient become less than some small set value (e.g., .000001), which implies the estimates are changing trivially from one iteration to the next.

Closing Comments

For the purpose of illustrating maximum likelihood estimation in the context of structural equation modeling, it was useful to focus on a relatively simple model. As models and data become more complex, researchers should be aware that estimation does not always proceed as smoothly as it did with this example.
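Because the model is just-identified, Equations 30 and 31 can be solved in two lines of arithmetic, a useful cross-check on the iterative results:

```python
# Solve the moment equations of the just-identified example directly.
gamma = 5 / 8                     # Equation 30: 8*gamma = 5
psi = 20 - 4 - 8 * gamma**2       # Equation 31: 8*gamma**2 + psi + 4 = 20
print(gamma, psi)                 # 0.625 12.875
```

These are exactly the values the Newton-Raphson iterations approach, which is why the fitting function reaches zero in this example.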
If estimation problems are encountered, it may be useful to consider the following strategies: (a) changing the start values, (b) changing the estimation algorithm, (c) increasing the maximum number of iterations, (d) screening the data for outliers and other anomalies, (e) using boundary constraints, (f) rescaling any variable with a variance many orders of magnitude larger than the variances of the other variables, (g) checking the identification status of the model, (h) checking the consistency between the model specified in the structural equation modeling software and the path diagram, and (i) reexamining the consistency between the path diagram and theoretical expectations. For details on the problems that can be encountered, potential solutions, and limitations to these potential solutions, see Chen, Bollen, Paxton, Curran, and Kirby (2001); Wothke (1993); and Bentler and Chou (1987).

References

Bentler, P. M., & Chou, C. P. (1987). Practical issues in structural equation modeling. Sociological Methods and Research, 16.

Bollen, K. A. (1989). Structural equations with latent variables. New York: John Wiley.

Chen, F., Bollen, K. A., Paxton, P., Curran, P. J., & Kirby, J. B. (2001). Improper solutions in structural equation models: Causes, consequences, and strategies. Sociological Methods and Research, 29.

Hayduk, L. (1987). Structural equation modeling with LISREL: Essentials and advances. Baltimore: Johns Hopkins University Press.

Jöreskog, K. G., & Sörbom, D. (2004). LISREL (Version 8.7) [Computer software]. Lincolnwood, IL: Scientific Software International.

National Council of Teachers of Mathematics. (1989). Curriculum and evaluation standards for school mathematics. Reston, VA: Author.

Pedhazur, E. J. (1997). Multiple regression in behavioral research: Explanation and prediction (3rd ed.). Fort Worth, TX: Harcourt Brace.

Stevens, J. P. (2002). Applied multivariate statistics for the social sciences (4th ed.). Mahwah, NJ: Lawrence Erlbaum.

Tatsuoka, M. M., & Lohnes, P. R. (1988). Multivariate analysis: Techniques for educational and psychological research (2nd ed.). New York: Macmillan.

Wothke, W. (1993). Nonpositive definite matrices in structural equation modeling. In K. A. Bollen & J. S. Long (Eds.), Testing structural equation models (chap. 11). Newbury Park, CA: Sage.

Authors

JOHN M. FERRON is Professor of Educational Measurement and Research at the University of South Florida, 4202 East Fowler Ave., EDU 162, Tampa, FL 33620. His areas of interest include structural equation modeling, multilevel modeling, and the analysis of interrupted time-series data.

MELINDA R. HESS is Director of the Center for Research, Evaluation, Assessment, and Measurement at the University of South Florida, 4202 East Fowler Ave., EDU 162, Tampa, FL 33620. Her areas of interest include multilevel modeling, structural equation modeling, research reporting protocols and practices, and program assessment.

Manuscript received December 1, 2004
Accepted November 21



More information

Texas A & M University

Texas A & M University Capraro & Capraro Commonality Analysis: Understanding Variance Contributions to Overall Canonical Correlation Effects of Attitude Toward Mathematics on Geometry Achievement Robert M. Capraro Mary Margaret

More information

An Introduction to Matrix Algebra

An Introduction to Matrix Algebra An Introduction to Matrix Algebra EPSY 905: Fundamentals of Multivariate Modeling Online Lecture #8 EPSY 905: Matrix Algebra In This Lecture An introduction to matrix algebra Ø Scalars, vectors, and matrices

More information

Lecture 4: Types of errors. Bayesian regression models. Logistic regression

Lecture 4: Types of errors. Bayesian regression models. Logistic regression Lecture 4: Types of errors. Bayesian regression models. Logistic regression A Bayesian interpretation of regularization Bayesian vs maximum likelihood fitting more generally COMP-652 and ECSE-68, Lecture

More information

Key Algebraic Results in Linear Regression

Key Algebraic Results in Linear Regression Key Algebraic Results in Linear Regression James H. Steiger Department of Psychology and Human Development Vanderbilt University James H. Steiger (Vanderbilt University) 1 / 30 Key Algebraic Results in

More information

miivfind: A command for identifying model-implied instrumental variables for structural equation models in Stata

miivfind: A command for identifying model-implied instrumental variables for structural equation models in Stata The Stata Journal (yyyy) vv, Number ii, pp. 1 16 miivfind: A command for identifying model-implied instrumental variables for structural equation models in Stata Shawn Bauldry University of Alabama at

More information

Running head: AUTOCORRELATION IN THE COFM. The Effects of Autocorrelation on the Curve-of-Factors Growth Model

Running head: AUTOCORRELATION IN THE COFM. The Effects of Autocorrelation on the Curve-of-Factors Growth Model Autocorrelation in the COFM 1 Running head: AUTOCORRELATION IN THE COFM The Effects of Autocorrelation on the Curve-of-Factors Growth Model Daniel L. Murphy Pearson S. Natasha Beretvas and Keenan A. Pituch

More information

Longitudinal and Panel Data: Analysis and Applications for the Social Sciences. Table of Contents

Longitudinal and Panel Data: Analysis and Applications for the Social Sciences. Table of Contents Longitudinal and Panel Data Preface / i Longitudinal and Panel Data: Analysis and Applications for the Social Sciences Table of Contents August, 2003 Table of Contents Preface i vi 1. Introduction 1.1

More information

INSTRUCTIONAL FOCUS DOCUMENT HS/Algebra 1

INSTRUCTIONAL FOCUS DOCUMENT HS/Algebra 1 Possible Lesson 01 (12 days) State Resources: Algebra 1 End of Course Success: Representations and Support: Objective 1 Lesson 1 Problem Solving Boards, Equation Representation, On Your Own: Writing an

More information

Improper Solutions in Exploratory Factor Analysis: Causes and Treatments

Improper Solutions in Exploratory Factor Analysis: Causes and Treatments Improper Solutions in Exploratory Factor Analysis: Causes and Treatments Yutaka Kano Faculty of Human Sciences, Osaka University Suita, Osaka 565, Japan. email: kano@hus.osaka-u.ac.jp Abstract: There are

More information

Prentice Hall Mathematics, Algebra 1, South Carolina Edition 2011

Prentice Hall Mathematics, Algebra 1, South Carolina Edition 2011 Prentice Hall Mathematics, Algebra 1, South Carolina Edition 2011 C O R R E L A T E D T O South Carolina Academic Standards for Mathematics 2007, Elementary Algebra Elementary Algebra Overview The academic

More information

FACTOR ANALYSIS AND MULTIDIMENSIONAL SCALING

FACTOR ANALYSIS AND MULTIDIMENSIONAL SCALING FACTOR ANALYSIS AND MULTIDIMENSIONAL SCALING Vishwanath Mantha Department for Electrical and Computer Engineering Mississippi State University, Mississippi State, MS 39762 mantha@isip.msstate.edu ABSTRACT

More information

Lecture Notes Part 2: Matrix Algebra

Lecture Notes Part 2: Matrix Algebra 17.874 Lecture Notes Part 2: Matrix Algebra 2. Matrix Algebra 2.1. Introduction: Design Matrices and Data Matrices Matrices are arrays of numbers. We encounter them in statistics in at least three di erent

More information

Gramian Matrices in Covariance Structure Models

Gramian Matrices in Covariance Structure Models Gramian Matrices in Covariance Structure Models P. M. Bentler, University of California, Los Angeles Mortaza Jamshidian, Isfahan University of Technology Covariance structure models frequently contain

More information

A Introduction to Matrix Algebra and the Multivariate Normal Distribution

A Introduction to Matrix Algebra and the Multivariate Normal Distribution A Introduction to Matrix Algebra and the Multivariate Normal Distribution PRE 905: Multivariate Analysis Spring 2014 Lecture 6 PRE 905: Lecture 7 Matrix Algebra and the MVN Distribution Today s Class An

More information

Using Power Tables to Compute Statistical Power in Multilevel Experimental Designs

Using Power Tables to Compute Statistical Power in Multilevel Experimental Designs A peer-reviewed electronic journal. Copyright is retained by the first or sole author, who grants right of first publication to the Practical Assessment, Research & Evaluation. Permission is granted to

More information

Fairfield Public Schools

Fairfield Public Schools Mathematics Fairfield Public Schools AP Calculus AB AP Calculus AB BOE Approved 04/08/2014 1 AP CALCULUS AB Critical Areas of Focus Advanced Placement Calculus AB consists of a full year of college calculus.

More information

Instrumental variables regression on the Poverty data

Instrumental variables regression on the Poverty data Instrumental variables regression on the Poverty data /********************** poverty2.sas **************************/ options linesize=79 noovp formdlim='-' nodate; title 'UN Poverty Data: Instrumental

More information

ECNS 561 Multiple Regression Analysis

ECNS 561 Multiple Regression Analysis ECNS 561 Multiple Regression Analysis Model with Two Independent Variables Consider the following model Crime i = β 0 + β 1 Educ i + β 2 [what else would we like to control for?] + ε i Here, we are taking

More information

The classifier. Theorem. where the min is over all possible classifiers. To calculate the Bayes classifier/bayes risk, we need to know

The classifier. Theorem. where the min is over all possible classifiers. To calculate the Bayes classifier/bayes risk, we need to know The Bayes classifier Theorem The classifier satisfies where the min is over all possible classifiers. To calculate the Bayes classifier/bayes risk, we need to know Alternatively, since the maximum it is

More information

The classifier. Linear discriminant analysis (LDA) Example. Challenges for LDA

The classifier. Linear discriminant analysis (LDA) Example. Challenges for LDA The Bayes classifier Linear discriminant analysis (LDA) Theorem The classifier satisfies In linear discriminant analysis (LDA), we make the (strong) assumption that where the min is over all possible classifiers.

More information

2012 Assessment Report. Mathematics with Calculus Level 3 Statistics and Modelling Level 3

2012 Assessment Report. Mathematics with Calculus Level 3 Statistics and Modelling Level 3 National Certificate of Educational Achievement 2012 Assessment Report Mathematics with Calculus Level 3 Statistics and Modelling Level 3 90635 Differentiate functions and use derivatives to solve problems

More information

Multilevel Models in Matrix Form. Lecture 7 July 27, 2011 Advanced Multivariate Statistical Methods ICPSR Summer Session #2

Multilevel Models in Matrix Form. Lecture 7 July 27, 2011 Advanced Multivariate Statistical Methods ICPSR Summer Session #2 Multilevel Models in Matrix Form Lecture 7 July 27, 2011 Advanced Multivariate Statistical Methods ICPSR Summer Session #2 Today s Lecture Linear models from a matrix perspective An example of how to do

More information

Journal of Geoscience Education, v. 46, n. 3, p , May 1998 (edits, June 2005)

Journal of Geoscience Education, v. 46, n. 3, p , May 1998 (edits, June 2005) Journal of Geoscience Education, v. 46, n. 3, p. 292-295, May 1998 (edits, June 2005) Computational Geology 1 Significant Figures! H.L. Vacher, Department of Geology, University of South Florida, 4202

More information

Do not copy, quote, or cite without permission LECTURE 4: THE GENERAL LISREL MODEL

Do not copy, quote, or cite without permission LECTURE 4: THE GENERAL LISREL MODEL LECTURE 4: THE GENERAL LISREL MODEL I. QUICK REVIEW OF A LITTLE MATRIX ALGEBRA. II. A SIMPLE RECURSIVE MODEL IN LATENT VARIABLES. III. THE GENERAL LISREL MODEL IN MATRIX FORM. A. SPECIFYING STRUCTURAL

More information

Chapter 4: Factor Analysis

Chapter 4: Factor Analysis Chapter 4: Factor Analysis In many studies, we may not be able to measure directly the variables of interest. We can merely collect data on other variables which may be related to the variables of interest.

More information

Parametric Unsupervised Learning Expectation Maximization (EM) Lecture 20.a

Parametric Unsupervised Learning Expectation Maximization (EM) Lecture 20.a Parametric Unsupervised Learning Expectation Maximization (EM) Lecture 20.a Some slides are due to Christopher Bishop Limitations of K-means Hard assignments of data points to clusters small shift of a

More information

Variables and Functions: Using Geometry to Explore Important Concepts in Algebra

Variables and Functions: Using Geometry to Explore Important Concepts in Algebra Variables and Functions: Using Geometry to Explore Important Concepts in Algebra Scott Steketee KCP Technologies University of Pennsylvania Graduate School of Education stek@kcptech.com Abstract: Students

More information

2/26/2017. PSY 512: Advanced Statistics for Psychological and Behavioral Research 2

2/26/2017. PSY 512: Advanced Statistics for Psychological and Behavioral Research 2 PSY 512: Advanced Statistics for Psychological and Behavioral Research 2 What is SEM? When should we use SEM? What can SEM tell us? SEM Terminology and Jargon Technical Issues Types of SEM Models Limitations

More information

A Matrix Theoretic Derivation of the Kalman Filter

A Matrix Theoretic Derivation of the Kalman Filter A Matrix Theoretic Derivation of the Kalman Filter 4 September 2008 Abstract This paper presents a matrix-theoretic derivation of the Kalman filter that is accessible to students with a strong grounding

More information

OAKLYN PUBLIC SCHOOL MATHEMATICS CURRICULUM MAP EIGHTH GRADE

OAKLYN PUBLIC SCHOOL MATHEMATICS CURRICULUM MAP EIGHTH GRADE OAKLYN PUBLIC SCHOOL MATHEMATICS CURRICULUM MAP EIGHTH GRADE STANDARD 8.NS THE NUMBER SYSTEM Big Idea: Numeric reasoning involves fluency and facility with numbers. Learning Targets: Students will know

More information

Mathematical Methods for Numerical Analysis and Optimization

Mathematical Methods for Numerical Analysis and Optimization Biyani's Think Tank Concept based notes Mathematical Methods for Numerical Analysis and Optimization (MCA) Varsha Gupta Poonam Fatehpuria M.Sc. (Maths) Lecturer Deptt. of Information Technology Biyani

More information

Lecture 25: November 27

Lecture 25: November 27 10-725: Optimization Fall 2012 Lecture 25: November 27 Lecturer: Ryan Tibshirani Scribes: Matt Wytock, Supreeth Achar Note: LaTeX template courtesy of UC Berkeley EECS dept. Disclaimer: These notes have

More information

RESMA course Introduction to LISREL. Harry Ganzeboom RESMA Data Analysis & Report #4 February

RESMA course Introduction to LISREL. Harry Ganzeboom RESMA Data Analysis & Report #4 February RESMA course Introduction to LISREL Harry Ganzeboom RESMA Data Analysis & Report #4 February 17 2009 LISREL SEM: Simultaneous [Structural] Equations Model: A system of linear equations ( causal model )

More information

1 Motivation for Instrumental Variable (IV) Regression

1 Motivation for Instrumental Variable (IV) Regression ECON 370: IV & 2SLS 1 Instrumental Variables Estimation and Two Stage Least Squares Econometric Methods, ECON 370 Let s get back to the thiking in terms of cross sectional (or pooled cross sectional) data

More information

Restricted Maximum Likelihood in Linear Regression and Linear Mixed-Effects Model

Restricted Maximum Likelihood in Linear Regression and Linear Mixed-Effects Model Restricted Maximum Likelihood in Linear Regression and Linear Mixed-Effects Model Xiuming Zhang zhangxiuming@u.nus.edu A*STAR-NUS Clinical Imaging Research Center October, 015 Summary This report derives

More information

Model fit evaluation in multilevel structural equation models

Model fit evaluation in multilevel structural equation models Model fit evaluation in multilevel structural equation models Ehri Ryu Journal Name: Frontiers in Psychology ISSN: 1664-1078 Article type: Review Article Received on: 0 Sep 013 Accepted on: 1 Jan 014 Provisional

More information

Testing Structural Equation Models: The Effect of Kurtosis

Testing Structural Equation Models: The Effect of Kurtosis Testing Structural Equation Models: The Effect of Kurtosis Tron Foss, Karl G Jöreskog & Ulf H Olsson Norwegian School of Management October 18, 2006 Abstract Various chi-square statistics are used for

More information

The 3 Indeterminacies of Common Factor Analysis

The 3 Indeterminacies of Common Factor Analysis The 3 Indeterminacies of Common Factor Analysis James H. Steiger Department of Psychology and Human Development Vanderbilt University James H. Steiger (Vanderbilt University) The 3 Indeterminacies of Common

More information

chapter 11 ALGEBRAIC SYSTEMS GOALS

chapter 11 ALGEBRAIC SYSTEMS GOALS chapter 11 ALGEBRAIC SYSTEMS GOALS The primary goal of this chapter is to make the reader aware of what an algebraic system is and how algebraic systems can be studied at different levels of abstraction.

More information

CS281 Section 4: Factor Analysis and PCA

CS281 Section 4: Factor Analysis and PCA CS81 Section 4: Factor Analysis and PCA Scott Linderman At this point we have seen a variety of machine learning models, with a particular emphasis on models for supervised learning. In particular, we

More information

Package semgof. February 20, 2015

Package semgof. February 20, 2015 Package semgof February 20, 2015 Version 0.2-0 Date 2012-08-06 Title Goodness-of-fit indexes for structural equation models Author Elena Bertossi Maintainer Elena Bertossi

More information

Network data in regression framework

Network data in regression framework 13-14 July 2009 University of Salerno (Italy) Network data in regression framework Maria ProsperinaVitale Department of Economics and Statistics University of Salerno (Italy) mvitale@unisa.it - Theoretical

More information

Eco517 Fall 2004 C. Sims MIDTERM EXAM

Eco517 Fall 2004 C. Sims MIDTERM EXAM Eco517 Fall 2004 C. Sims MIDTERM EXAM Answer all four questions. Each is worth 23 points. Do not devote disproportionate time to any one question unless you have answered all the others. (1) We are considering

More information

An Introduction to Mplus and Path Analysis

An Introduction to Mplus and Path Analysis An Introduction to Mplus and Path Analysis PSYC 943: Fundamentals of Multivariate Modeling Lecture 10: October 30, 2013 PSYC 943: Lecture 10 Today s Lecture Path analysis starting with multivariate regression

More information

Lecture: Simultaneous Equation Model (Wooldridge s Book Chapter 16)

Lecture: Simultaneous Equation Model (Wooldridge s Book Chapter 16) Lecture: Simultaneous Equation Model (Wooldridge s Book Chapter 16) 1 2 Model Consider a system of two regressions y 1 = β 1 y 2 + u 1 (1) y 2 = β 2 y 1 + u 2 (2) This is a simultaneous equation model

More information

Estimating Coefficients in Linear Models: It Don't Make No Nevermind

Estimating Coefficients in Linear Models: It Don't Make No Nevermind Psychological Bulletin 1976, Vol. 83, No. 2. 213-217 Estimating Coefficients in Linear Models: It Don't Make No Nevermind Howard Wainer Department of Behavioral Science, University of Chicago It is proved

More information

Unit 4 Patterns and Algebra

Unit 4 Patterns and Algebra Unit 4 Patterns and Algebra In this unit, students will solve equations with integer coefficients using a variety of methods, and apply their reasoning skills to find mistakes in solutions of these equations.

More information

Inference using structural equations with latent variables

Inference using structural equations with latent variables This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike License. Your use of this material constitutes acceptance of that license and the conditions of use of materials on this

More information

RAJASTHAN PUBLIC SERVICE COMMISSION, AJMER

RAJASTHAN PUBLIC SERVICE COMMISSION, AJMER RAJASTHAN PUBLIC SERVICE COMMISSION, AJMER SYLLABUS FOR EXAMINATION FOR THE POST OF LECTURER - MATHEMATICS, (SCHOOL EDUCATION) Paper - II Part I (Senior Secondary Standard) 1 Sets, Relations and Functions

More information

UCLA Department of Statistics Papers

UCLA Department of Statistics Papers UCLA Department of Statistics Papers Title Can Interval-level Scores be Obtained from Binary Responses? Permalink https://escholarship.org/uc/item/6vg0z0m0 Author Peter M. Bentler Publication Date 2011-10-25

More information

Structural Equations with Latent Variables

Structural Equations with Latent Variables Structural Equations with Latent Variables Structural Equations with Latent Variables KENNETH A. BOLLEN Department of Sociology The University of North Carolina at Chapel Hill Chapel Hill, North Carolina

More information

STRUCTURAL EQUATION MODELS WITH LATENT VARIABLES

STRUCTURAL EQUATION MODELS WITH LATENT VARIABLES STRUCTURAL EQUATION MODELS WITH LATENT VARIABLES Albert Satorra Departament d Economia i Empresa Universitat Pompeu Fabra Structural Equation Modeling (SEM) is widely used in behavioural, social and economic

More information

LINEAR MODELS FOR CLASSIFICATION. J. Elder CSE 6390/PSYC 6225 Computational Modeling of Visual Perception

LINEAR MODELS FOR CLASSIFICATION. J. Elder CSE 6390/PSYC 6225 Computational Modeling of Visual Perception LINEAR MODELS FOR CLASSIFICATION Classification: Problem Statement 2 In regression, we are modeling the relationship between a continuous input variable x and a continuous target variable t. In classification,

More information

INTRODUCTION TO STRUCTURAL EQUATION MODELS

INTRODUCTION TO STRUCTURAL EQUATION MODELS I. Description of the course. INTRODUCTION TO STRUCTURAL EQUATION MODELS A. Objectives and scope of the course. B. Logistics of enrollment, auditing, requirements, distribution of notes, access to programs.

More information

PLEASANTON UNIFIED SCHOOL DISTRICT 8 Course Outline Form

PLEASANTON UNIFIED SCHOOL DISTRICT 8 Course Outline Form PLEASANTON UNIFIED SCHOOL DISTRICT 8 Course Outline Form Course Title: Math 8 Course Number/CBED Number: Grade Levels: Length of Course: Eighth Grade One Year Credit: 10 Meets Graduation Requirements:

More information

Equating Tests Under The Nominal Response Model Frank B. Baker

Equating Tests Under The Nominal Response Model Frank B. Baker Equating Tests Under The Nominal Response Model Frank B. Baker University of Wisconsin Under item response theory, test equating involves finding the coefficients of a linear transformation of the metric

More information

Part 8: GLMs and Hierarchical LMs and GLMs

Part 8: GLMs and Hierarchical LMs and GLMs Part 8: GLMs and Hierarchical LMs and GLMs 1 Example: Song sparrow reproductive success Arcese et al., (1992) provide data on a sample from a population of 52 female song sparrows studied over the course

More information

Constructing and solving linear equations

Constructing and solving linear equations Key Stage 3 National Strategy Guidance Curriculum and Standards Interacting with mathematics in Key Stage 3 Constructing and solving linear equations Teachers of mathematics Status: Recommended Date of

More information

Exercises * on Principal Component Analysis

Exercises * on Principal Component Analysis Exercises * on Principal Component Analysis Laurenz Wiskott Institut für Neuroinformatik Ruhr-Universität Bochum, Germany, EU 4 February 207 Contents Intuition 3. Problem statement..........................................

More information

Introduction to Matrix Algebra and the Multivariate Normal Distribution

Introduction to Matrix Algebra and the Multivariate Normal Distribution Introduction to Matrix Algebra and the Multivariate Normal Distribution Introduction to Structural Equation Modeling Lecture #2 January 18, 2012 ERSH 8750: Lecture 2 Motivation for Learning the Multivariate

More information

UNIVERSITY OF TORONTO MISSISSAUGA April 2009 Examinations STA431H5S Professor Jerry Brunner Duration: 3 hours

UNIVERSITY OF TORONTO MISSISSAUGA April 2009 Examinations STA431H5S Professor Jerry Brunner Duration: 3 hours Name (Print): Student Number: Signature: Last/Surname First /Given Name UNIVERSITY OF TORONTO MISSISSAUGA April 2009 Examinations STA431H5S Professor Jerry Brunner Duration: 3 hours Aids allowed: Calculator

More information

Multilevel Structural Equation Model with. Gifi System in Understanding the. Satisfaction of Health Condition at Java

Multilevel Structural Equation Model with. Gifi System in Understanding the. Satisfaction of Health Condition at Java Applied Mathematical Sciences, Vol., 07, no. 6, 773-78 HIKARI Ltd, www.m-hikari.com https://doi.org/0.988/ams.07.77 Multilevel Structural Equation Model with Gifi System in Understanding the Satisfaction

More information

Lecture Notes: Geometric Considerations in Unconstrained Optimization

Lecture Notes: Geometric Considerations in Unconstrained Optimization Lecture Notes: Geometric Considerations in Unconstrained Optimization James T. Allison February 15, 2006 The primary objectives of this lecture on unconstrained optimization are to: Establish connections

More information

Content Descriptions Based on the Common Core Georgia Performance Standards (CCGPS) CCGPS Coordinate Algebra

Content Descriptions Based on the Common Core Georgia Performance Standards (CCGPS) CCGPS Coordinate Algebra Content Descriptions Based on the Common Core Georgia Performance Standards (CCGPS) CCGPS Coordinate Algebra Introduction The State Board of Education is required by Georgia law (A+ Educational Reform

More information

Nesting and Equivalence Testing

Nesting and Equivalence Testing Nesting and Equivalence Testing Tihomir Asparouhov and Bengt Muthén August 13, 2018 Abstract In this note, we discuss the nesting and equivalence testing (NET) methodology developed in Bentler and Satorra

More information