Lecture 16: Again on Regression
1 Lecture 16: Again on Regression. S. Massa, Department of Statistics, University of Oxford, 10 February 2016.
2 The Normality Assumption. Body weights (kg) and brain weights (g) of 62 mammals. [Table of species with columns: body weight (kg), body rank, brain weight (g), brain rank; the numeric entries were lost in transcription. Species range from the big brown bat and golden hamster to the Asian and African elephants.]
3 The Normality Assumption. Suppose we want to test for correlation between body weight X and brain weight Y. Look at the scatterplot of brain weight against body weight and compute r_xy. We cannot see any obvious linear relationship: the regression line is completely dominated by the extreme values. Is our data really normal?
4 The Normality Assumption. Here you can see histograms of the body and brain weights. The data don't really look normal. What if we apply some transformation? Take logarithms, for example.
5 The Normality Assumption. Here you can see histograms of the logarithms of the body and brain weights. The situation is much better now. What about the scatterplot?
6 The Normality Assumption. [Scatterplot of log(brain) against log(body).] After taking logarithms there is an obvious linear relationship. We then compute r_{log(x),log(y)} = 0.96.
7 The Normality Assumption. We can actually compute the regression line to be log10(y) = 0.93 + 0.75 log10(x), or, after exponentiating, y ≈ 8.5 x^(3/4).
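The log-log fit above can be sketched with a hand-computed least-squares slope. This is a minimal sketch: the five (body, brain) pairs below are illustrative values in the spirit of the dataset, not the actual 62-species table, so the fitted coefficients differ from the lecture's.

```python
import math

# Illustrative (body kg, brain g) pairs; hypothetical values, not the
# actual 62-species dataset from the lecture.
data = [(0.023, 0.3), (3.3, 25.6), (52.16, 440.0), (62.0, 1320.0), (2547.0, 4603.0)]

xs = [math.log10(body) for body, _ in data]
ys = [math.log10(brain) for _, brain in data]
n = len(data)

mx, my = sum(xs) / n, sum(ys) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
sxx = sum((x - mx) ** 2 for x in xs)

slope = sxy / sxx              # least-squares slope on the log-log scale
intercept = my - slope * mx
print(f"log10(brain) = {intercept:.2f} + {slope:.2f} * log10(body)")
```

Exponentiating the fitted line gives a power law brain ≈ 10^intercept × body^slope, the same shape as the lecture's y ≈ 8.5 x^(3/4).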
8 Spearman Rank Correlation Coefficient. A non-parametric alternative to the correlation coefficient is the Spearman rank correlation coefficient: replace the actual observations by their ranks. Drawback: this correlation coefficient will not give us a regression line! It can be used to test H0: ρ = 0.
9 Spearman Rank Correlation Coefficient. To compute the Spearman rank correlation coefficient:
1. Order and rank the x's and the y's. Each pair (x_i, y_i) then has two ranks (R_i^x, R_i^y).
2. Compute the absolute difference of the two ranks, d_i = |R_i^x − R_i^y|.
3. Compute the sum of the squared differences, D = Σ_i d_i² = Σ_i (R_i^x − R_i^y)².
The Spearman rank correlation coefficient is r_s = 1 − 6D / (n(n² − 1)).
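The three steps above can be sketched directly. This minimal version assumes no ties in either variable (ties would need average ranks):

```python
def ranks(values):
    # Rank from 1 (smallest) to n (largest); assumes no ties.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    # r_s = 1 - 6D / (n(n^2 - 1)), with D the sum of squared rank differences
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    D = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * D / (n * (n ** 2 - 1))

x = [10, 20, 30, 40, 50]
y = [1, 4, 9, 16, 25]          # monotone in x, so r_s is exactly 1
print(spearman(x, y))
```

Note that y here is a nonlinear function of x, yet r_s = 1: the rank correlation measures monotone association, not linearity.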
10 Hypothesis Testing. Having computed r_s we can now conduct the hypothesis test of H0: there is no association between the ranks, against H1: there is association between the ranks. This is the two-sided test. One can also conduct a one-sided test if there is reason to believe that any association present should be positive (or negative). Under the null hypothesis H0,
T := r_s √(n − 2) / √(1 − r_s²) ∼ t_{n−2}.
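The statistic is a one-liner. In this sketch, r_s = 0.954 is an assumed value chosen to roughly match the worked example that follows, not a figure stated on this slide:

```python
import math

def spearman_t(r_s, n):
    # T = r_s * sqrt(n - 2) / sqrt(1 - r_s^2), compared with t_{n-2} under H0
    return r_s * math.sqrt(n - 2) / math.sqrt(1 - r_s ** 2)

t_obs = spearman_t(0.954, 62)   # r_s = 0.954 assumed for illustration
print(round(t_obs, 2))          # far above the 5% critical value of about 2.00
```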
11 Example. We compute the rank pairs for our dataset. [Table of rank pairs (R^x, R^y) and differences d; the entries were lost in transcription.] Then the sum of the squared differences is D = Σ d_i², and Spearman's rank correlation coefficient is r_s = 1 − 6D / (n(n² − 1)).
12 Example. The observed value of the statistic is
t_obs := r_s √(62 − 2) / √(1 − r_s²) = 24.64,
and as usual we compare it with a t-distribution with n − 2 = 60 degrees of freedom. The critical value at the 5% level is 2.00, therefore we reject the null hypothesis (24.64 > 2.00).
13 Residuals and R², I. Let (x_i, y_i) be observations of normally distributed variables (X, Y). Suppose that X and Y are related through Y = α + βX + ε, where ε is independent of X and has zero mean. Since ε is independent of X,
var(Y) = β² var(X) + var(ε).
This means that our uncertainty about Y consists of two parts: our uncertainty about X, and our uncertainty about ε. The first part should disappear once we know x.
14 Residuals and R², II. The residual uncertainty is due to ε and thus independent of X. In fact we can quantify the proportion of uncertainty in Y explained by X. Since β = r · sd(Y)/sd(X), we have
var(Y) = β² var(X) + var(ε) = r² (var(Y)/var(X)) var(X) + var(ε) = r² var(Y) + var(ε).
Therefore var(ε) = (1 − r²) var(Y): r² is precisely the proportion of the variability in Y that is explained by the variability of X.
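The identity var(ε) = (1 − r²) var(Y) can be checked by simulation. A sketch with β = 0.75 and var(ε) = 1, so the recovered residual variance should come out near 1:

```python
import random

random.seed(1)
n = 50_000
beta = 0.75
xs = [random.gauss(0, 2) for _ in range(n)]
ys = [beta * x + random.gauss(0, 1) for x in xs]   # true var(eps) = 1

# Sample moments (population-style divisors; the difference is negligible here)
mx, my = sum(xs) / n, sum(ys) / n
var_x = sum((x - mx) ** 2 for x in xs) / n
var_y = sum((y - my) ** 2 for y in ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
r2 = cov ** 2 / (var_x * var_y)

resid_var = (1 - r2) * var_y   # should be close to var(eps) = 1
print(resid_var)
```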
15 Example: the Galton Dataset again. [Scatterplot of child height (in.) against parent average height (in.).]
16 Example: the Galton Dataset again. Recall the summary statistics. [Table of SDs and variances for Parent, Child, their Sum, and their Diff; the numeric entries were lost in transcription.] Then we have
s_xy = (1/4)(s²_{x+y} − s²_{x−y}) = 2.07,
and r_xy = s_xy / (s_x s_y) = 0.459.
17 Example: the Galton dataset revisited. Therefore we quickly compute
b = (s_y/s_x) r_xy = 0.649,
a = ȳ − b x̄ = 23.8,
so the regression line is y = 23.8 + 0.649x. If the parents' average height is 72 inches we can predict the child's height as y_pred = 23.8 + 0.649 × 72 = 70.5.
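The prediction step above is simple arithmetic, using the slope b = 0.649 and the means x̄ = 68.3, ȳ = 68.1 from the worked example:

```python
b = 0.649                      # slope from the worked example
x_bar, y_bar = 68.3, 68.1      # parents' and children's mean heights (in.)

a = y_bar - b * x_bar          # intercept a = y_bar - b * x_bar
y_pred = a + b * 72            # prediction for parents averaging 72 in.
print(round(a, 1), round(y_pred, 1))
```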
18 Example: Prediction. What is the probability that the child's height Y is over 70 inches? We know that Y ∼ N(68.1, 2.53²). Then, standardising and using the standard normal table, we find
P(Y > 70) = P((Y − 68.1)/2.53 > (70 − 68.1)/2.53) = P(Z > 0.75) ≈ 0.227.
[Summary table: means and SDs of parents' and child heights; corr = 0.459.]
19 Example: Prediction. If we know the parents' height is average, i.e., 68.3 inches, what is the probability that the child's height Y is over 70 inches? Now we are looking only within the corresponding vertical slice of the scatterplot. There is significantly less variability there than across the whole sample: knowing X helps us predict Y.
20 Example: Prediction. Recall that var(Y) = β² var(X) + var(ε). Thus if we know X, the residual variance is just
var(ε) = (1 − r²) var(Y) = (1 − 0.459²) × 2.53² ≈ 5.0.
21 Example: Prediction. The height of a child whose parents' average height is 68.3 will be approximately N(68.1, 5.0). Then
P(Y > 70 | X = 68.3) = P((Y − 68.1)/2.24 > (70 − 68.1)/2.24) = P(Z > 0.85) ≈ 0.198 < 0.227.
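Both tail probabilities can be computed with the standard normal CDF. A sketch; the SDs below (2.53 unconditional, 2.24 conditional) are approximate values backed out from the slides' z-scores of 0.75 and 0.85:

```python
from math import erf, sqrt

def norm_sf(x, mu, sd):
    # Upper-tail probability P(N(mu, sd^2) > x) via the error function
    z = (x - mu) / sd
    return 0.5 * (1 - erf(z / sqrt(2)))

p_uncond = norm_sf(70, 68.1, 2.53)   # P(Y > 70)
p_cond = norm_sf(70, 68.1, 2.24)     # P(Y > 70 | X = 68.3)
print(round(p_uncond, 3), round(p_cond, 3))
```

Conditioning on X shrinks the SD by the factor √(1 − r²), so the conditional tail probability is smaller.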
22 Homoscedasticity. Regression only makes sense when the data are homoscedastic. Homoscedasticity means that the y-values show the same dispersion at different x-values; the opposite is known as heteroscedasticity. Roughly, it means that in our model Y = α + βX + ε the dispersion of ε does not depend on X. The scatterplot should be roughly oval shaped.
23 Homoscedasticity. Here you can see simulated data sets. The left one is homoscedastic: you can see similar variability across all the different vertical slices. The right one is heteroscedastic: the variability of Y depends on X.
24 Regression to the mean I. Recall the general form of the regression line,
y − ȳ = r_xy (s_y/s_x)(x − x̄).
Standardising, we can rewrite this as
(y − ȳ)/s_y = r_xy (x − x̄)/s_x.
If r_xy ≠ ±1, then we can observe what is known as regression to the mean: an extremely high (or low) value of x will usually occur with a less extreme value of y.
25 Regression to the mean II. Suppose for example that s_x = s_y. Then
y − ȳ = r_xy (x − x̄),
and since |r_xy| < 1 we have
|y − ȳ| < |x − x̄|.
In other words, if the standardised deviation of x from its mean is extremely high, then we expect that y, standardised, will be closer to its mean.
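Regression to the mean is easy to see in simulation. A sketch with synthetic bivariate normal data, equal SDs and correlation r = 0.5: points with extreme x-values have y-values much closer to the mean on average.

```python
import random

random.seed(7)
r = 0.5
pairs = []
for _ in range(100_000):
    x = random.gauss(0, 1)
    # Construct y with corr(x, y) = r and sd(y) = sd(x) = 1
    y = r * x + random.gauss(0, (1 - r ** 2) ** 0.5)
    pairs.append((x, y))

extreme = [(x, y) for x, y in pairs if x > 2]   # upper tail of x
avg_y = sum(y for _, y in extreme) / len(extreme)
print(avg_y)   # well below 2: y has regressed toward its mean
```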
26 Regression to the mean III. Very important in practice: a test score of 100/100 will invariably be followed by a score ≤ 100, most likely < 100. An extremely high value will usually be followed by a less extreme one. You have to be careful to account for this regression to the mean before reaching any conclusions. Recall the example with the traffic cameras: cameras were installed in locations where large clusters of accidents were observed, and a year later the number of accidents was much lower. How much of this was due to the cameras, and how much to regression to the mean?
27 Summary. If the data do not look normal, we use Spearman's rank correlation coefficient. If r is the correlation, then the standard deviation of Y when X = x is known is √(1 − r²) SD(Y). Regression only makes sense if the data are homoscedastic, in other words if the variability of Y does not depend on X. If (X, Y) have correlation |ρ| < 1 then there is regression to the mean, which has to be accounted for in order to reach safe conclusions.
More information