Statistical Techniques II
- Leona Jacobs
1 Statistical Techniques II EXST7015 Regression with Matrix Algebra 06a_Matrix SLR
2 Matrix Algebra We will not be doing our regressions with matrix algebra, except that the computer does employ matrices. In fact, there is really no other way to do the basic calculations. You will be responsible for knowing about matrices only to the extent that PROC REG or PROC GLM produces information. This is primarily the initial and final matrices.
3 So, what is a matrix? A matrix is a rectangular arrangement of numbers, usually represented by an upper case letter (A, B, C, D, etc.), for example
A = [ a11  a12 ]      D = [ d11  d12  d13 ]
    [ a21  a22 ]          [ d21  d22  d23 ]
                          [ d31  d32  d33 ]
                          [ d41  d42  d43 ]
4 The dimensions of a matrix are given by the number of rows and columns in the matrix (i.e. the dimensions are r by c). For the matrices above, A is 2 by 2 and D is 4 by 3.
5 For a simple linear regression the matrices of initial interest would be the data matrices: a Y matrix of values of the dependent variable and an X matrix of values of the independent variable. The X matrix also has a column of ones added to fit the intercept.
6 Y = [ Y1 ]        X = [ 1  X1 ]
      [ Y2 ]            [ 1  X2 ]
      [ .. ]            [ .. .. ]
      [ Yn ]            [ 1  Xn ]
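As an illustration of how these data matrices are set up, here is a minimal sketch in Python/NumPy. The data values are hypothetical (a small made-up set, not the lecture's tree data):

```python
import numpy as np

# Hypothetical data: n = 5 observations of the independent variable X
# and the dependent variable Y (made-up values, not the tree data).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.0, 4.0, 5.0, 4.0, 6.0]).reshape(-1, 1)  # n x 1 column vector

# The X matrix: a column of ones (to fit the intercept) beside the X values
X = np.column_stack([np.ones_like(x), x])

print(Y.shape)  # (5, 1)
print(X.shape)  # (5, 2)
```

The column of ones is what lets a single matrix product handle the intercept and slope together.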
7 As with our algebraic calculations we need some intermediate values: sums, sums of squares and cross-products. These are obtained by first calculating a transpose matrix for both X and Y. This is simply the matrix turned on its side, so the rows of the original matrix become the columns of the transpose. These are denoted X' and Y'.
8 Y' = [ Y1  Y2  ...  Yn ]        X' = [ 1   1   ...  1  ]
                                       [ X1  X2  ...  Xn ]
9 We now calculate 3 matrices: X'X, X'Y and Y'Y. This requires matrix multiplication.
X'X = [ 1   1   ...  1  ] [ 1  X1 ]
      [ X1  X2  ...  Xn ] [ 1  X2 ]
                          [ .. .. ]
                          [ 1  Xn ]
10 Calculate X'X, X'Y and Y'Y (continued).
X'Y = [ 1   1   ...  1  ] [ Y1 ]
      [ X1  X2  ...  Xn ] [ Y2 ]
                          [ .. ]
                          [ Yn ]
11 Calculate X'X, X'Y and Y'Y (continued).
Y'Y = [ Y1  Y2  ...  Yn ] [ Y1 ]
                          [ Y2 ]
                          [ .. ]
                          [ Yn ]
12 The results of these 3 calculations are:
X'X = [ n    ΣXi  ]        X'Y = [ ΣYi   ]        Y'Y = [ ΣYi² ]
      [ ΣXi  ΣXi² ]              [ ΣXiYi ]
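These intermediate sums can be checked numerically. A minimal NumPy sketch with the same hypothetical data set; the comments show which classical sums each entry holds:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])          # hypothetical X values
Y = np.array([2.0, 4.0, 5.0, 4.0, 6.0]).reshape(-1, 1)
X = np.column_stack([np.ones_like(x), x])        # column of ones + X

XtX = X.T @ X   # [[n,       sum(Xi) ], [sum(Xi), sum(Xi^2)]]
XtY = X.T @ Y   # [[sum(Yi)], [sum(Xi*Yi)]]
YtY = Y.T @ Y   # [[sum(Yi^2)]]

print(XtX)   # [[ 5. 15.] [15. 55.]] for this data
```

For these values n = 5, ΣXi = 15, ΣXi² = 55, ΣYi = 21, ΣXiYi = 71 and ΣYi² = 97, exactly the six intermediate values of the algebraic solution.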
13 Notice that the contents of these 3 matrices are the same as the values we used for the algebraic solution. Normal equations: when the equations needed to solve a simple linear regression are derived, the result is two equations with two unknowns that must be solved. These are called the normal equations.
14 The normal equations are
b0·n   + b1·ΣXi  = ΣYi
b0·ΣXi + b1·ΣXi² = ΣXiYi
If you solve these algebraically, you get the two equations we use to solve for b0 and b1.
15 When expressed as matrices this factors out to
[ n    ΣXi  ] [ b0 ]   [ ΣYi   ]
[ ΣXi  ΣXi² ] [ b1 ] = [ ΣXiYi ]
16 In simple matrix notation, the X'X matrix times a B matrix (vector) equals the X'Y: (X'X)B = X'Y. As with the algebraic equations we need to solve for B (i.e. b0 and b1). If we do this with algebra, we get the usual equations. Solving the matrix equation we get B = (X'X)⁻¹X'Y.
17 This equation is the matrix algebra solution for a simple linear regression: B = (X'X)⁻¹X'Y. Note that there is no such thing as matrix "division". As with the algebraic values, if we multiply the X values (X matrix) by the B values (B matrix) we get the predicted values: XB = X(X'X)⁻¹X'Y = Ŷ (the hat vector).
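A sketch of this solution in NumPy, with the same hypothetical data (in numerical practice np.linalg.solve is preferred to forming the inverse, but the inverse form mirrors the formula above):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])          # hypothetical data
Y = np.array([2.0, 4.0, 5.0, 4.0, 6.0]).reshape(-1, 1)
X = np.column_stack([np.ones_like(x), x])

# B = (X'X)^-1 X'Y  -- the matrix solution; there is no matrix "division",
# so we multiply by the inverse instead.
B = np.linalg.inv(X.T @ X) @ (X.T @ Y)

Yhat = X @ B                                     # predicted values (hat vector)
print(B.ravel())   # approx. [1.8, 0.8] for this data: b0 = 1.8, b1 = 0.8
```

The same two numbers come out of the familiar algebraic formulas b1 = (nΣXiYi - ΣXiΣYi)/(nΣXi² - (ΣXi)²) and b0 = (ΣYi - b1·ΣXi)/n.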
18 What do we need to know about these matrix calculations? We need to know that the solution to the problem using matrix algebra involves the same values as for the simple linear regression. We need to know that the (X'X)⁻¹ is a key component of this solution. We need to know that the predicted values require the matrix segment X(X'X)⁻¹X' times the Y vector (MAIN DIAGONAL).
19 Why? We can get the matrices from SAS, but we want to understand what we have. (X'X)⁻¹ is a key component not only of the solution for the regression coefficients, but also of the variance-covariance matrix. The X(X'X)⁻¹X' matrix main diagonal is a diagnostic that we will use (hat diag).
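The hat matrix and its diagonal can be sketched the same way (same hypothetical data; the diagonal values are the leverages that SAS reports as hat diag):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])          # hypothetical data
X = np.column_stack([np.ones_like(x), x])

H = X @ np.linalg.inv(X.T @ X) @ X.T   # the "hat" matrix X(X'X)^-1 X'
hat_diag = np.diag(H)                  # leverage of each observation

# The diagonal always sums to the number of coefficients (2 here),
# and extreme X values get the largest leverage.
print(hat_diag)   # approx. [0.6, 0.3, 0.2, 0.3, 0.6]
```

Note how the observations at the ends of the X range (x = 1 and x = 5) have the highest leverage, which is exactly why the hat diagonal is useful as a diagnostic.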
20 But the most important reason for using matrices is that the solutions for simple linear and multiple regression are the same. Basically, matrix algebra is the ONLY way to solve multiple regressions. So, what do we get from SAS? If the options XPX and I are placed on the model statement, we can get the X'X matrix and the (X'X)⁻¹ matrix.
21 For the simple linear regression that we saw for the tree weights and diameters, these options produce the following output (the numeric entries did not survive transcription; the layout is shown).
Model Crossproducts X'X X'Y Y'Y
          INTERCEP   DBH   WEIGHT
INTERCEP
DBH
WEIGHT
X'X Inverse, Parameter Estimates, and SSE
          INTERCEP   DBH   WEIGHT
INTERCEP
DBH
WEIGHT
22 The first two rows and columns of numbers (the INTERCEP and DBH rows and columns of the Model Crossproducts output) contain the X'X matrix, which has the values for n, ΣXi and ΣXi².
23 The last column (WEIGHT) has X'Y (the values for ΣYi, 7359 in this output, and for ΣXiYi), and the last value is Y'Y (ΣYi²).
24 In the X'X inverse matrix section, the first two rows and columns of numbers contain the (X'X)⁻¹ matrix, and the value in the third row and third column is the SSE. The other values in the last row and column are b0 and b1.
25 You will be responsible only for knowing where the 6 intermediate values (n, ΣXi, ΣXi², ΣYi, ΣXiYi and ΣYi²) are for simple linear regression, and where to find the (X'X)⁻¹ matrix.
26 Multiple Regression The only difference between simple linear regression and multiple regression is the fact that multiple regression has several independent variables (Xi variables). Therefore the matrix X'X will be larger. For a simple linear regression, X'X is 2x2. For a 3 factor multiple regression (X1, X2, X3 and an intercept) the X'X matrix will be 4x4.
27 Y = [ Y1 ]        X = [ 1  X11  X21  X31 ]
       [ Y2 ]            [ 1  X12  X22  X32 ]
       [ .. ]            [ ..  ..   ..   .. ]
       [ Yn ]            [ 1  X1n  X2n  X3n ]
28 Multiple Regression (continued) The values contained in the X'X matrix include all of the sums, sums of squares and cross-products for all of the Xi variables (plus n in the upper left hand corner).
29 Multiple Regression (continued)
X'X = [ n     ΣX1i     ΣX2i     ΣX3i    ]
      [ ΣX1i  ΣX1i²    ΣX1iX2i  ΣX1iX3i ]
      [ ΣX2i  ΣX1iX2i  ΣX2i²    ΣX2iX3i ]
      [ ΣX3i  ΣX1iX3i  ΣX2iX3i  ΣX3i²   ]
30 Multiple Regression (continued)
X'Y = [ ΣYi    ]
      [ ΣX1iYi ]
      [ ΣX2iYi ]
      [ ΣX3iYi ]
31 For the multiple regression the solution is still given by B = (X'X)⁻¹X'Y. The predicted values are still given by XB = X(X'X)⁻¹X'Y = Ŷ (the hat vector). The residuals are given by e = Y - Ŷ = Y - X(X'X)⁻¹X'Y. The X(X'X)⁻¹X' matrix main diagonal is a diagnostic that we will use (hat diag).
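A sketch showing that the identical formulas carry over to multiple regression. The data here are randomly generated and purely hypothetical (three predictors plus an intercept); the shape and residual checks are the properties the slide describes:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
X1, X2, X3 = rng.normal(size=(3, n))             # three hypothetical predictors
X = np.column_stack([np.ones(n), X1, X2, X3])    # intercept column + 3 X's
Y = (3 + 1.5 * X1 - 2.0 * X2 + 0.5 * X3
     + rng.normal(scale=0.1, size=n)).reshape(-1, 1)

B = np.linalg.inv(X.T @ X) @ (X.T @ Y)   # exactly the same solution as for SLR
Yhat = X @ B                             # predicted values
e = Y - Yhat                             # residuals

print((X.T @ X).shape)           # (4, 4): X'X for 3 factors + intercept
print(np.allclose(X.T @ e, 0))   # least squares forces X'e = 0 -> True
```

The X'e = 0 check is just the normal equations restated: the residuals are orthogonal to every column of X, including the column of ones (so they sum to zero).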
32 So basically everything works the same in multiple regression as in simple linear regression if we use matrix algebra. One last piece of the puzzle: we will need some estimates of variances and covariances. As usual these will all involve the MSE (Mean Square Error). We need all of the variances and covariances for the regression coefficients.
33 These variances and covariances are obtained by multiplying the (X'X)⁻¹ matrix by the MSE. The resulting matrix contains all of the variances and covariances for the regression coefficients. The variances of the regression coefficients are on the main diagonal. The square root of these values gives the standard errors used for confidence intervals and testing of the bi values.
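A sketch of this calculation with the earlier hypothetical SLR data: MSE times (X'X)⁻¹ gives the variance-covariance matrix, and the standard errors come off its main diagonal:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])          # hypothetical data
Y = np.array([2.0, 4.0, 5.0, 4.0, 6.0]).reshape(-1, 1)
X = np.column_stack([np.ones_like(x), x])

XtX_inv = np.linalg.inv(X.T @ X)
B = XtX_inv @ (X.T @ Y)
e = Y - X @ B
n, p = X.shape
MSE = (e.T @ e).item() / (n - p)         # SSE / df, here df = n - 2

var_cov = MSE * XtX_inv                  # variance-covariance of the b's
se = np.sqrt(np.diag(var_cov))           # standard errors of b0 and b1

print(np.round(var_cov, 4))   # [[ 0.88 -0.24] [-0.24  0.08]]
```

Note the off-diagonal entry is not zero: the estimates of b0 and b1 covary, which is the point made on the independence slide below.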
34 Multiple Regression (continued) The (X'X)⁻¹ matrix can be obtained from SAS. When multiplied by the MSE value this gives the Variance-Covariance matrix. This can also be obtained from SAS.
35 Multiple Regression (continued) A note on the assumption of independence. We assume that the ei values are independent of each other. We assume that the ei are independent of the Ŷ values (b0 + b1Xi). But WE DO NOT ASSUME THAT THE VARIOUS REGRESSION COEFFICIENTS ARE INDEPENDENT OF EACH OTHER. Calculations do not assume zero covariances; they employ the Variance-Covariance matrix.
36 Summary Matrix solutions to regression begin with the matrices X'X, X'Y and Y'Y. The values in these matrices are the sums, sums of squares and cross-products, just as with simple linear regression. The normal equations are solved just as for SLR. The solution is B = (X'X)⁻¹X'Y. Using matrix algebra we can obtain predicted values and residuals, diagnostics, variances and covariances, and all of the other values needed for testing and interpretation of the multiple regression.
2 Matrix Algebra 2.1 MATRIX OPERATIONS MATRIX OPERATIONS m n If A is an matrixthat is, a matrix with m rows and n columnsthen the scalar entry in the ith row and jth column of A is denoted by a ij and
More informationWritten as per the revised G Scheme syllabus prescribed by the Maharashtra State Board of Technical Education (MSBTE) w.e.f. academic year
Written as per the revised G Scheme syllabus prescribed by the Maharashtra State Board of Technical Education (MSBTE) w.e.f. academic year 2012-2013 Basic MATHEMATICS First Year Diploma Semester - I First
More informationRegression Analysis: Exploring relationships between variables. Stat 251
Regression Analysis: Exploring relationships between variables Stat 251 Introduction Objective of regression analysis is to explore the relationship between two (or more) variables so that information
More informationLinear Regression. In this problem sheet, we consider the problem of linear regression with p predictors and one intercept,
Linear Regression In this problem sheet, we consider the problem of linear regression with p predictors and one intercept, y = Xβ + ɛ, where y t = (y 1,..., y n ) is the column vector of target values,
More informationStatistics 910, #5 1. Regression Methods
Statistics 910, #5 1 Overview Regression Methods 1. Idea: effects of dependence 2. Examples of estimation (in R) 3. Review of regression 4. Comparisons and relative efficiencies Idea Decomposition Well-known
More informationIntroduction to Confirmatory Factor Analysis
Introduction to Confirmatory Factor Analysis Multivariate Methods in Education ERSH 8350 Lecture #12 November 16, 2011 ERSH 8350: Lecture 12 Today s Class An Introduction to: Confirmatory Factor Analysis
More informationLecture Notes Part 2: Matrix Algebra
17.874 Lecture Notes Part 2: Matrix Algebra 2. Matrix Algebra 2.1. Introduction: Design Matrices and Data Matrices Matrices are arrays of numbers. We encounter them in statistics in at least three di erent
More informationReview Packet 1 B 11 B 12 B 13 B = B 21 B 22 B 23 B 31 B 32 B 33 B 41 B 42 B 43
Review Packet. For each of the following, write the vector or matrix that is specified: a. e 3 R 4 b. D = diag{, 3, } c. e R 3 d. I. For each of the following matrices and vectors, give their dimension.
More informationProperties of the least squares estimates
Properties of the least squares estimates 2019-01-18 Warmup Let a and b be scalar constants, and X be a scalar random variable. Fill in the blanks E ax + b) = Var ax + b) = Goal Recall that the least squares
More information