Homework 6


1. Wife (X) and Husband (Y) data: 5 7 5 7 7 3 3 9 9 5 9 5 3 3 9; Sum 5, 7; Mean 7.5, .375; SS .5, 37.75.

r = SP / √(SS_X · SS_Y) = 55.5 / 7.7 = .9

With r-crit = .77, we would reject H0: ρ = 0. Thus, it would make sense to compute the regression equation to allow us to make predictions: Ŷ = .73X + 3. We would also want to compute the standard error of estimate in order to have a sense of the accuracy of predictions made with the regression equation: SEE = √3.7 = .99.

2. Anxiety (X) and Exam Score (Y) data: 5 7 7 5 7 79 553 3 5 5 5; Sum 3, 9, 5; Mean 5, 3; SS 7.

r = SP / √(SS_X · SS_Y) = −3 / 3 = −.9
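The computations used throughout these solutions (Pearson r from SP and the SS terms, the least-squares line, and the standard error of estimate) can be sketched in Python. The data below are hypothetical stand-ins, not the homework's actual scores:

```python
import math

def pearson_r(x, y):
    """Computational formula: r = SP / sqrt(SSx * SSy)."""
    n = len(x)
    sp = sum(a * b for a, b in zip(x, y)) - sum(x) * sum(y) / n   # sum of products
    ss_x = sum(a * a for a in x) - sum(x) ** 2 / n                # SS for X
    ss_y = sum(b * b for b in y) - sum(y) ** 2 / n                # SS for Y
    return sp / math.sqrt(ss_x * ss_y)

def regression_line(x, y):
    """Least-squares line Y-hat = b*X + a, with b = SP/SSx and a = My - b*Mx."""
    n = len(x)
    sp = sum(a * b for a, b in zip(x, y)) - sum(x) * sum(y) / n
    ss_x = sum(a * a for a in x) - sum(x) ** 2 / n
    b = sp / ss_x
    a = sum(y) / n - b * sum(x) / n
    return b, a

def see(x, y):
    """Standard error of estimate: sqrt((1 - r^2) * SSy / (n - 2))."""
    n = len(x)
    r = pearson_r(x, y)
    ss_y = sum(b * b for b in y) - sum(y) ** 2 / n
    return math.sqrt((1 - r ** 2) * ss_y / (n - 2))

# Hypothetical paired scores (illustrative only):
wife = [1, 2, 3, 4, 5]
husband = [2, 1, 4, 3, 5]
print(pearson_r(wife, husband))       # → 0.8
print(regression_line(wife, husband)) # (slope b, intercept a)
print(see(wife, husband))             # typical size of a prediction error
```

A large SEE warns that predictions from Ŷ will be imprecise even when r is significant, which is why the solutions compute it alongside each regression equation.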

With r-crit = ., we would reject H0: ρ = 0. Thus, we should construct the regression equation in order to make predictions: Ŷ = −.7X + 9.9. To get a sense of the accuracy of predictions, we should compute the standard error of estimate: SEE = √.97 = .93.

3. The Spearman correlation is computationally identical to the Pearson correlation. The only difference is that r is computed on the ranked data rather than on the actual scores.

X, Rank X, Y, Rank Y data: 7 9 3 3 5 3 3 5 5 5; Sum 7, 9, 5, 5, 5; SS 55., 59.

r = SP / √(SS_X · SS_Y) = .9

With r-crit(5) = ., we would actually retain H0 in this case.

4. a. Y = $X + $ (Company A) and Y = $5X + $ (Company B)

b. $7 from Company A and $7 from Company B, so the costs would be equal for that many rats.

c. $3 from Company A and $ from Company B, so Company B is the better deal.

5. X, Y, Ŷ, and Y − Ŷ data: 7 9 −3 5 5 −3 − 5 7 3 7; Sum 3, 3; Mean 3; SS.
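The point that Spearman's correlation is just Pearson's r applied to ranks can be shown directly. This sketch uses hypothetical scores; ties get the mean of their rank positions, as is standard:

```python
import math

def ranks(scores):
    """Rank from smallest (1) to largest; tied scores share the mean of their rank positions."""
    ordered = sorted(scores)
    return [sum(i + 1 for i, v in enumerate(ordered) if v == s) / ordered.count(s)
            for s in scores]

def pearson_r(x, y):
    """Pearson r via the computational formula r = SP / sqrt(SSx * SSy)."""
    n = len(x)
    sp = sum(a * b for a, b in zip(x, y)) - sum(x) * sum(y) / n
    ss_x = sum(a * a for a in x) - sum(x) ** 2 / n
    ss_y = sum(b * b for b in y) - sum(y) ** 2 / n
    return sp / math.sqrt(ss_x * ss_y)

def spearman_rho(x, y):
    """Spearman rho: Pearson r computed on the ranked data."""
    return pearson_r(ranks(x), ranks(y))

# A monotonic but nonlinear relationship: rho reaches 1 even though Pearson r on
# the raw scores would be below 1.
x = [1, 2, 3, 4, 5]
y = [1, 4, 9, 16, 100]
print(spearman_rho(x, y))   # → 1.0
```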

r = SP / √(SS_X · SS_Y) = .9

With r-crit = ., we would reject H0: ρ = 0. Thus, we should construct the regression equation in order to make predictions: Ŷ = 3X − 3.

If we sum the differences between the observed and predicted values, we will always get 0, as seen above. However, if we square the differences before adding, we get the SS Error. Note that if you compute r² and then the coefficient of alienation, 1 − r² = .7, you can multiply the coefficient of alienation by SS_Y and also get SS Error. (We actually get .3, due to rounding.)

Software output for the regression of Y on X:

Regression Summary, Y vs. X (R, R Squared, Adjusted R Squared, RMS Residual): .9, .3, .7, .59

ANOVA, Y vs. X (DF, Sum of Squares, Mean Square, F-Value, P-Value): Regression .33, .33, .9, .; Residual 9, .5, .55; Total 7, 5.75

[Regression plot: Y = .5 − .7 · X; R² = .3]

Regression Coefficients, Y vs. X (Coefficient, Std. Error, Std. Coeff., t-Value, P-Value): Intercept .5, ., .5, .39, < .; X −.7, .5, −.9, −5., .

As you can see, there is a significant correlation between X and Y, with p < .05.
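Both claims (least-squares residuals sum to 0, and SS Error equals the coefficient of alienation times SS_Y) are easy to verify numerically. A minimal check with hypothetical data:

```python
import math

x = [1, 2, 3, 4]        # hypothetical data, not the homework table
y = [1, 3, 2, 4]
n = len(x)

mx, my = sum(x) / n, sum(y) / n
sp = sum((a - mx) * (b - my) for a, b in zip(x, y))   # sum of products of deviations
ss_x = sum((a - mx) ** 2 for a in x)
ss_y = sum((b - my) ** 2 for b in y)

b_slope = sp / ss_x                 # least-squares slope
a_int = my - b_slope * mx           # least-squares intercept

residuals = [b - (b_slope * a + a_int) for a, b in zip(x, y)]
print(sum(residuals))               # ~0, up to floating-point rounding

ss_error_direct = sum(e ** 2 for e in residuals)      # sum of squared residuals
r_sq = sp ** 2 / (ss_x * ss_y)                        # r squared
ss_error_via_r = (1 - r_sq) * ss_y                    # coefficient of alienation * SSy
print(ss_error_direct, ss_error_via_r)                # the two routes agree
```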

Software output for Y vs. X+5:

Regression Summary, Y vs. X+5 (R, R Squared, Adjusted R Squared, RMS Residual): .9, .3, .7, .59

[Regression plot: Y = .55 − .7 · X; R² = .3]

ANOVA, Y vs. X+5 (DF, Sum of Squares, Mean Square, F-Value, P-Value): Regression .33, .33, .9, .; Residual 9, .5, .55; Total 7, 5.75

Regression Coefficients, Y vs. X+5 (Coefficient, Std. Error, Std. Coeff., t-Value, P-Value): Intercept .55, .75, .55, .7, .; X+5 −.7, .5, −.9, −5., .

As you can see, adding 5 to each of the X values has no impact on the correlation coefficient. If you think of r as the mean product of z-scores, that should make sense: adding 5 to each X value has no effect on the z-scores, so r remains the same. Adding a constant to a set of scores also leaves the SS intact, so SS_X stays the same after adding a constant of 5.

As you can see in the output below, multiplying each value of X by a constant (3) also has no impact on the correlation coefficient. SS_X goes from 3.5 to 57.5 (3 × 3, or 9 times larger), while SS_Y remains unchanged. SP becomes 3 times larger after multiplying each X by 3.
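The invariance of r under these linear transformations of X can be confirmed in a few lines, again with hypothetical data:

```python
import math

def pearson_r(x, y):
    """Pearson r via r = SP / sqrt(SSx * SSy)."""
    n = len(x)
    sp = sum(a * b for a, b in zip(x, y)) - sum(x) * sum(y) / n
    ss_x = sum(a * a for a in x) - sum(x) ** 2 / n
    ss_y = sum(b * b for b in y) - sum(y) ** 2 / n
    return sp / math.sqrt(ss_x * ss_y)

def ss(v):
    """Sum of squared deviations from the mean."""
    n = len(v)
    return sum(a * a for a in v) - sum(v) ** 2 / n

x = [1, 2, 3, 4, 5]      # hypothetical data
y = [2, 1, 4, 3, 5]

r = pearson_r(x, y)
r_shift = pearson_r([a + 5 for a in x], y)   # add a constant: r unchanged, SSx unchanged
r_scale = pearson_r([a * 3 for a in x], y)   # multiply by 3: r unchanged, SSx 9x larger

print(r, r_shift, r_scale)
print(ss(x), ss([a + 5 for a in x]), ss([a * 3 for a in x]))
```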

Software output for Y vs. X*3:

Regression Summary, Y vs. X*3 (R, R Squared, Adjusted R Squared, RMS Residual): .9, .3, .7, .59

ANOVA, Y vs. X*3 (DF, Sum of Squares, Mean Square, F-Value, P-Value): Regression .33, .33, .9, .; Residual 9, .5, .55; Total 7, 5.75

[Regression plot: Y = .5 − .9 · X; R² = .3]

Regression Coefficients, Y vs. X*3 (Coefficient, Std. Error, Std. Coeff., t-Value, P-Value): Intercept .5, ., .5, .39, < .; X*3 −.9, .53, −.9, −5., .

6. Software output for Errors vs. Reaction Time:

Regression Summary, Errors vs. Reaction Time (R, R Squared, Adjusted R Squared, RMS Residual): .773, .597, .53, .5

ANOVA, Errors vs. Reaction Time (DF, Sum of Squares, Mean Square, F-Value, P-Value): Regression 53.7, 53.7, .97, .5; Residual 3.99, .33; Total 7, 9.75

[Regression plot: Errors = 35.37 − .33 · Reaction Time; R² = .597]

Regression Coefficients, Errors vs. Reaction Time (Coefficient, Std. Error, Std. Coeff., t-Value, P-Value): Intercept 35.37, 9.3, 35.37, 3.7, .9; Reaction Time −.33, .5, −.773, −.93, .5

As you can see, there is a significant negative linear relationship (r = −.773) between speed and accuracy: as speed increases, errors decrease, and vice versa. You could describe the relationship precisely by using the regression equation.

7. Software output for Doctor Visits vs. LCU:

Regression Summary, Doctor Visits vs. LCU (R, R Squared, Adjusted R Squared, RMS Residual): .77, .77, .7, .5

[Regression plot: Doctor Visits = .39 + .3 · LCU; R² = .77]

ANOVA, Doctor Visits vs. LCU (DF, Sum of Squares, Mean Square, F-Value, P-Value): Regression 7.3, 7.3, 3., .; Residual 9.3, .37; Total 9.77

Regression Coefficients, Doctor Visits vs. LCU (Coefficient, Std. Error, Std. Coeff., t-Value, P-Value): Intercept .39, .739, .39, 3.33; LCU .3, .3, ., .77, 5.3, .

As you can see, there is a significant positive linear relationship between Doctor Visits and LCUs. Placing the same data in the Spearman Rank Correlation analysis (under Nonparametric analyses) yields the following:

Spearman Rank Correlation for LCU, Doctor Visits
Sum of Squared Differences: .5
Rho: .
Z-Value: .5
P-Value: .99
Rho corrected for ties: .5
Tied Z-Value: .577
Tied P-Value: .
# Ties, LCU; # Ties, Doctor Visits
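The Rho and Z-Value lines in this output correspond to the classic shortcut formula based on the sum of squared rank differences and the usual large-sample test of H0: ρ_s = 0. A sketch with a hypothetical sum of squared differences (the tie correction the package applies is not implemented here):

```python
import math

def spearman_from_d(d_squared_sum, n):
    """No-ties shortcut: rho = 1 - 6 * Sum(d^2) / (n * (n^2 - 1)),
    where d is the difference between each pair of ranks."""
    return 1 - 6 * d_squared_sum / (n * (n ** 2 - 1))

def spearman_z(rho, n):
    """Large-sample test of H0: rho_s = 0 via z ≈ rho * sqrt(n - 1)."""
    return rho * math.sqrt(n - 1)

# Hypothetical: 10 pairs with Sum of Squared Differences = 40
rho = spearman_from_d(40, 10)
print(rho)               # ≈ .758
print(spearman_z(rho, 10))
```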