Neuendorf

Factor Analysis

Assumptions:

1. Metric (interval/ratio) data.

2. Linearity (in the relationships among the variables).
-Factors are linear constructions of the set of variables (see #8 under Statistics).
-The critical source of information for the factor analysis is typically the correlation matrix among all variables, and correlations are linear tests.

3. Univariate and multivariate normal distributions.

4. There are no dependent variables specified. Instead, the model is:

   F1 -> VAR1, VAR2, VAR3
   F2 -> VAR4, VAR5

Or, in terms of basic measurement theory, we could model it as:

   VAR1 = F1 + U1
   VAR2 = F1 + U2
   VAR3 = F1 + U3
   VAR4 = F2 + U4
   VAR5 = F2 + U5

Meaning that each variable is composed of the common variance it shares with other variables (F1 and F2) and its own unique variance that is not shared (e.g., U1). The proportion of a variable's variance that is common is called its communality. (This is the same as 1 - U.)
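As a concrete illustration of the communality idea, here is a minimal numpy sketch for the five-variable, two-factor model above; the loading values are invented purely for illustration:

```python
import numpy as np

# Hypothetical orthogonal loadings for the model above:
# VAR1-VAR3 load on F1, VAR4-VAR5 on F2 (values invented for illustration).
loadings = np.array([
    [0.8, 0.0],   # VAR1
    [0.7, 0.0],   # VAR2
    [0.6, 0.0],   # VAR3
    [0.0, 0.9],   # VAR4
    [0.0, 0.7],   # VAR5
])

# With orthogonal factors, a variable's communality is the sum of its
# squared loadings; the unique variance (U) is whatever is left of 1.0.
communality = (loadings ** 2).sum(axis=1)
uniqueness = 1 - communality

print(communality)  # VAR1: 0.8^2 = 0.64
print(uniqueness)   # VAR1: 1 - 0.64 = 0.36
```

Note that communality and uniqueness always sum to 1.0 for each variable, which is exactly the "1 - U" relation above.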

Decisions to make:

1. Intent: Exploratory vs. Confirmatory (this is essentially SPSS vs. AMOS).

2. Factor extraction (how factors represent variables):

Principal Component Factoring (the most typical):
-Factors are derived on the basis of the total variance of each variable (both the variance accounted for by the F's and by the U's (unique variances) above).
-Uses the "regular" correlation matrix.
-More widely used than common factoring.

Common Factoring:
-Factors are derived on the basis of common variance only (variance accounted for by the F's, but not the U's above).
-Uses a correlation matrix with estimated communalities, not 1.0s, in the diagonal.
-There are several options for common factor extraction: principal axis factoring (most typical), unweighted least squares, generalized least squares, maximum likelihood, alpha, and image (see an SPSS manual or SPSS help for more info).

3. Rotation: Rotating the factors leads to greater interpretability of factors; since there is no true DV or criterion, there is no reason not to rotate in factor analysis. Rotation will maximize simple structure, where each item hopefully has a high loading on one factor and low loadings on all others. There are two rotation options:

Orthogonal Factors
-Factors will be completely uncorrelated (orthogonal = at right angles to each other).
-SPSS provides 3 types: varimax (by far the most typical), quartimax, and equamax (your book calls this equimax). See Hair p. 115 for comparisons.

Oblique Factors
-Factors are allowed to correlate (but don't have to).
-SPSS provides two types: direct oblimin (most typical) and promax.

4. How many factors?
A. Theory (a priori).
B. Latent root criterion: eigenvalues of 1+ (a factor must account for at least the amount of variance associated with one hypothetical variable).
C. Scree test: plots eigenvalues sequentially. Look for a "natural break" between the mountain and the scree.
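The latent root criterion can be tried out by computing the eigenvalues of the inter-item correlation matrix directly. A sketch on simulated data (all values invented; two underlying factors are built in, so two eigenvalues should exceed 1):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 6 items, the first three driven by one latent variable and
# the last three by another, so roughly two factors should emerge.
f1 = rng.normal(size=300)
f2 = rng.normal(size=300)
X = np.column_stack([
    f1 + rng.normal(scale=0.5, size=300),
    f1 + rng.normal(scale=0.5, size=300),
    f1 + rng.normal(scale=0.5, size=300),
    f2 + rng.normal(scale=0.5, size=300),
    f2 + rng.normal(scale=0.5, size=300),
    f2 + rng.normal(scale=0.5, size=300),
])

R = np.corrcoef(X, rowvar=False)             # inter-item correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]  # descending, for a scree plot

# Latent root criterion: retain factors with eigenvalue > 1.
n_retain = (eigvals > 1).sum()
print(eigvals.round(2), n_retain)
```

Plotting `eigvals` in order gives the scree plot; the "natural break" here falls after the second eigenvalue.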

Statistics:

1. Remember that the inter-item correlation matrix is the basis, the "raw material," for the typical factor analysis. Two statistics help assess whether the items are indeed correlated enough to proceed with factor analysis. Be reminded of Hair et al.'s and others' caution that large samples and a large number of items will contribute to strong and significant statistics here:

A. Bartlett's test of sphericity--indicates whether the correlation matrix is significantly different from an identity matrix (a matrix with all 1's on the diagonal, 0's everywhere else). Bartlett's calculation is based on a chi-square transformation of the determinant of the correlation matrix. Statistical significance is good here (note that this is the only statistical significance test in the Factor Analysis output!). It tells us that this set of measures is intercorrelated enough to be appropriate for factor analysis.

B. KMO (Kaiser-Meyer-Olkin) measure of sampling adequacy (MSA)--assesses whether your sample of items (not people) is adequate by comparing r's with pr's (partial correlations). We look for large values for the KMO (approaching 1.0). Note that Hair et al. call the statistic MSA while SPSS calls it KMO. Although there is no significance test, Hair (p. 104) lists some pretty fun rules of thumb for the stat (i.e., .80 or above is "meritorious"; .70 or above is "middling"; .60 or above is "mediocre"; .50 or above is "miserable"; below .50 is "unacceptable").

2. Coefficients predicting variables from factors--found in the rotated component matrix in SPSS for orthogonal rotation, and in the pattern matrix for oblique rotation. The β's are as below:

   VAR1 = β1 F1 + β2 F2 + β3 F3
   VAR2 = β4 F1 + β5 F2 + β6 F3
   VAR3 = β7 F1 + β8 F2 + β9 F3
   etc.

3. Factor loadings, i.e., correlations between the variables and the factors. In orthogonal rotation, these are the same as the coefficients predicting variables from factors; in oblique rotation they are not, and are contained in the structure matrix. In order for an item to have "loaded" on a factor, Hair says a .30 coefficient is minimal (others use a .50 cutoff). Hair (p. 117) provides a chart for assessing the statistical significance of each loading, something that is not provided by SPSS.

4. Communality (h²)--the total amount of variance a variable shares with all factors (and, therefore, the amount it shares with all other variables in the factor analysis). In an orthogonal rotation, the communality is the sum of all squared loadings for one variable. Rules of thumb are actually few, but Costello and Osborne (2005) do indicate .40 as a minimum.
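Both diagnostics in #1 can be computed straight from the correlation matrix. A sketch using the standard textbook chi-square approximation for Bartlett's test and the partial-correlation-based KMO formula (this mirrors, but is not, SPSS's own implementation; the example data are invented):

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(R, n):
    """Chi-square test that R differs from an identity matrix
    (R = correlation matrix, n = sample size)."""
    p = R.shape[0]
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return stat, chi2.sf(stat, df)

def kmo(R):
    """Overall KMO/MSA: squared r's versus squared partial r's."""
    inv = np.linalg.inv(R)
    partial = -inv / np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    off = ~np.eye(R.shape[0], dtype=bool)     # off-diagonal mask
    r2 = (R[off] ** 2).sum()
    pr2 = (partial[off] ** 2).sum()
    return r2 / (r2 + pr2)

# Example on invented data: three items driven by one common factor.
rng = np.random.default_rng(1)
f = rng.normal(size=200)
X = np.column_stack([f + rng.normal(scale=0.6, size=200) for _ in range(3)])
R = np.corrcoef(X, rowvar=False)
stat, p_value = bartlett_sphericity(R, n=200)
print(round(kmo(R), 2), p_value < .05)
```

With genuinely intercorrelated items like these, Bartlett's test comes out significant and the KMO is comfortably above the .50 "unacceptable" floor.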

h² = how much (%) of the variable's variance is explained by all the factors; 1 - h² = what is unique to the variable. E.g., β1² + β2² + β3² from #2 above would be the communality of VAR1, assuming orthogonal rotation. Here, the communality is comparable to the R² from predicting VAR1 from F1, F2, and F3.

5. Eigenvalue--the sum of all squared loadings for one factor. The eigenvalue is the amount (not expressed as a %) of the variables' total variance that is accounted for by one factor. E.g., β1² + β4² + β7², etc. from #2 above would be the eigenvalue for F1.

6. Proportion of total variance accounted for by F1 = (eigenvalue for F1) / k, where k = # of variables.

7. Proportion of common variance accounted for by F1 = (eigenvalue for F1) / (sum of eigenvalues).

8. Factor score coefficients (betas predicting factors from variables, β's as below; these are not the same as loadings or coefficients predicting variables from factors). The factor score coefficients are found in the component score coefficient matrix in SPSS. These values are used by SPSS to create "factor scores"--values on new scales created for each factor. These new scales must be saved in the FACTOR procedure if you want to use them for further analysis. There are three options--we typically use the regression method. SPSS will name the new scales automatically and place them at the end of the data set.

   F1 = β1 VAR1z + β2 VAR2z + β3 VAR3z + β4 VAR4z + etc.
   F2 = β5 VAR1z + β6 VAR2z + β7 VAR3z + β8 VAR4z + etc.
   F3 = β9 VAR1z + β10 VAR2z + β11 VAR3z + β12 VAR4z + etc.

   (The z subscript indicates the standardized variable.)

9. Correlations among factors--for oblique rotation only! Found in the component correlation matrix, this indicates how much the various factors (and therefore their scales, if created) relate to one another in linear fashion.

References

Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research & Evaluation, 10(7), 1-9. Accessed from: http://pareonline.net/getvn.asp?v=10&n=7

Kim, J.-O., & Mueller, C. W. (1978). Factor analysis: Statistical methods and practical issues. Newbury Park, CA: Sage Publications.

WORKSHEET--Making Sense of Factor Loadings

For this exercise:
-Assume all measures are 0-10 indicators of enjoyment of different TV/film genres.
-Assume an orthogonal rotation.
-Don't forget that loadings must be squared before adding to get communalities and eigenvalues.
-Notice that the loadings are not ordered by size. Such an ordering makes it easier to interpret the matrix. SPSS will do this if you click "Sorted by size" under Factor Options.

Hypothetical Factor Loadings:

                                                       Communalities  Communalities
Label                  FACTOR 1  FACTOR 2  FACTOR 3    at 3 Factors   at 10 Factors
Q1--News                .25691    .24989    .80751         .78            1.0
Q2--Westerns            .61936    .18150    .10120         .43            1.0
Q3--Action              .61937   -.01767    .32450         .49            1.0
Q4--Weepies             .01612    .75095   -.10217         .57            1.0
Q5--Game shows         -.62921    .09360    .24968         .47            1.0
Q6--Soaps              -.42346    .62972   -.00855         .58            1.0
Q7--Talk shows         -.54741    .41574    .01161         .47            1.0
Q8--Comedies            .48948    .31404   -.45143         .54            1.0
Q9--Horror              .74784    .18525   -.05745         .60            1.0
Q10--Cop shows          .68488    .12134   -.06675          ?             1.0

Eigenvalues:             2.98      1.39      1.05        sum=?         sum=10.0
% of Total Variance:       %?        %?        %?           %?
% of Common Variance:      %?        %?        %?                         100%

2/17
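The worksheet arithmetic (square the loadings, then sum down a column for an eigenvalue or across a row for a communality, per #4-#7 above) can be checked mechanically. A sketch with a small invented loading matrix, deliberately not the Q1-Q10 values, so the blanks stay yours to fill:

```python
import numpy as np

# Invented orthogonal loading matrix: 4 variables, 2 factors.
L = np.array([
    [0.80, 0.10],
    [0.75, 0.05],
    [0.15, 0.70],
    [0.10, 0.65],
])
k = L.shape[0]                                 # number of variables

eigenvalues = (L ** 2).sum(axis=0)             # #5: column sums of squared loadings
communalities = (L ** 2).sum(axis=1)           # #4: row sums of squared loadings
prop_total = eigenvalues / k                   # #6: share of total variance
prop_common = eigenvalues / eigenvalues.sum()  # #7: share of common variance

print(eigenvalues.round(3))    # [1.235 0.925]
print(prop_common.round(3))    # the shares of common variance sum to 1
```

Running the same row and column sums on the Q1-Q10 matrix above reproduces the printed communalities and eigenvalues and fills in every "?".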