MATH 204 ASSIGNMENT 2: SOLUTIONS


(a) Fitting the simple linear regression model to each of the variables in turn yields the following results; we look at the t-tests for the individual coefficients, and at the F-test statistics.

Variable   β         s(β)     t        p       F        p       Conclusion
age        4.055     1.088    3.726    0.001   13.880   0.001   INFLUENTIAL
height     0.932     0.260    3.590    0.002   12.885   0.002   INFLUENTIAL
weight     1.187     0.301    3.944    0.001   15.559   0.001   INFLUENTIAL
bmp        0.639     0.565    1.131    0.270    1.279   0.270   NOT INFLUENTIAL
fev        1.354     0.555    2.439    0.023    5.951   0.023   MARGINALLY INFLUENTIAL
rv        -0.123     0.077   -1.595    0.124    2.543   0.124   NOT INFLUENTIAL
frc       -0.319     0.145   -2.202    0.038    4.847   0.038   MARGINALLY INFLUENTIAL
tlc       -0.358     0.404   -0.886    0.385    0.785   0.385   NOT INFLUENTIAL
sex=0     19.045    13.176    1.445    0.162    2.089   0.162   NOT INFLUENTIAL
(baseline, sex=1: 98.455, s.e. 9.860, t = 9.985, p = 0.000)

The variables declared Marginally Influential are not significant at the α = 0.05 significance level once a multiple testing correction is introduced; we are performing 9 tests, so we should test each hypothesis at the α = 0.05/9 = 0.0056 significance level. Note that the coefficient estimates for sex are the baseline level (in the subgroup with sex=1) and the difference from baseline (in the subgroup with sex=0). The fact that the baseline subgroup coefficient is significantly different from zero does not mean that this variable is influential in predicting y. It is possible to fit the model to the log-transformed response variable. This was not asked for, but is an acceptable approach.
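(The equality between the two tests in the table, namely that each componentwise F statistic is the square of the corresponding t statistic with identical p-values, can be checked numerically. A minimal sketch, not part of the original solutions, using scipy and the age row as the example:)

```python
from scipy import stats

# In a simple linear regression the overall F statistic equals the square of
# the slope t statistic, and the two tests give the same p-value.
# Example: the age row, with n - 2 = 23 error degrees of freedom.
t, df = 3.726, 23
F = t ** 2                            # approximately 13.88, matching the reported F
p_t = 2 * stats.t.sf(abs(t), df)      # two-sided p-value from the t-test
p_F = stats.f.sf(F, 1, df)            # p-value from the F(1, 23) test

print(round(F, 2), round(p_t, 3), round(p_F, 3))
```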
For the log-transformed data, the results are as follows:

Variable   β         s(β)     t        p       F        p       Conclusion
age        0.033     0.009    3.515    0.002   12.355   0.002   INFLUENTIAL
height     0.008     0.002    3.498    0.002   12.234   0.002   INFLUENTIAL
weight     0.010     0.003    3.656    0.001   13.364   0.001   INFLUENTIAL
bmp        0.005     0.005    0.943    0.356    0.888   0.356   NOT INFLUENTIAL
fev        0.011     0.005    2.207    0.038    4.872   0.038   MARGINALLY INFLUENTIAL
rv        -0.001     0.001   -1.595    0.124    2.544   0.124   NOT INFLUENTIAL
frc       -0.003     0.001   -2.151    0.042    4.625   0.042   MARGINALLY INFLUENTIAL
tlc       -0.003     0.003   -0.830    0.415    0.689   0.415   NOT INFLUENTIAL
sex=0      0.154     0.113    1.366    0.185    1.865   0.185   NOT INFLUENTIAL
(baseline, sex=1: 4.566, s.e. 0.084, t = 54.371, p = 0.000)

Overall, inspecting the R² values, it seems that the prediction of y rather than log(y) is better, but the difference is marginal. Thus, on the basis of simple linear regression one variable at a time, only age, height and weight are influential variables in the model predicting y. 10 Marks

MATH 204 ASSIGNMENT 2 SOLUTIONS Page 1 of 2
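(The Bonferroni adjustment used above is easy to apply mechanically. A small sketch, not part of the original solutions, in plain Python with the p-values from the original-scale table:)

```python
# Bonferroni correction for the 9 componentwise tests in part (a):
# each test is carried out at level 0.05/9 rather than 0.05.
alpha, m = 0.05, 9
alpha_adj = alpha / m                  # approximately 0.0056

# p-values from the original-scale simple regressions
p_values = {"age": 0.001, "height": 0.002, "weight": 0.001,
            "bmp": 0.270, "fev": 0.023, "rv": 0.124,
            "frc": 0.038, "tlc": 0.385, "sex": 0.162}

influential = [v for v, p in p_values.items() if p < alpha_adj]
print(influential)   # fev and frc do not survive the corrected threshold
```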

(b) The results of the multiple regression analysis, with all variables included in linear form without interaction, as specified by the model

    y = β0 + β1 x1 + β2 x2 + ... + β9 x9 + ε,

can be found on pages 19-22 of the printout. The following results can be discerned (only the first is needed for full marks).

No variables are individually influential: all the p-values for the coefficients and in the componentwise tests are greater than 0.05, which contradicts the results from (a). 5 Marks

However, we can go further. Using the ANOVA table, we can construct a global test of the hypotheses

    H0 : β1 = β2 = ... = β8 = β9^(0) = 0
    Ha : at least one βj ≠ 0

to assess whether the variables are worth including at all, or whether the null model described by H0 fits as well. Here β9^(0) is the coefficient for the sex=0 subgroup. We construct the test statistic F by

    F = MSR / MSE = [(SST - SSE)/k] / [SSE/(n - k - 1)]

where k = 9 is the total number of predictors and MSR is the mean square due to the regression; recall that SST = SSR + SSE. Here, for the original-scale data,

    F = [(26832.640 - 9731.250)/9] / [9731.250/15] = 2.929.

In the usual way, we compare this with the Fisher F(9, 15) distribution. We find that F0.95(9, 15) = 2.587, so H0 is rejected; hence it is worth fitting the collection of variables, as this model fits better than the model with all variables excluded. This type of analysis, with a factor (sex) and continuous covariates, is sometimes called ANCOVA.

We also have

    R² = 1 - SSE/SST = 1 - 9731.250/26832.640 = 0.637
    Adjusted R² = 1 - [SSE/(n - k - 1)] / [SST/(n - 1)] = 1 - [9731.250/15] / [26832.640/24] = 0.420

so the overall explanatory power is not very good.

(c) The correlation table (page 23) indicates that many of the variables are strongly correlated. In particular, age, height and weight have pairwise correlations above 0.9, so are strongly positively correlated.
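(The global F test and the R² calculations in part (b) can be reproduced directly from the ANOVA sums of squares. A sketch, not part of the original solutions, using scipy, with the sums of squares from page 19 of the printout:)

```python
from scipy import stats

# Global F test for H0: all slope coefficients are zero.
n, k = 25, 9                      # 25 patients, 9 predictors
SST = 26832.640                   # corrected total sum of squares
SSE = 9731.250                    # error sum of squares from the full model
SSR = SST - SSE

F = (SSR / k) / (SSE / (n - k - 1))       # approximately 2.93
crit = stats.f.ppf(0.95, k, n - k - 1)    # approximately 2.59, so H0 is rejected
p = stats.f.sf(F, k, n - k - 1)           # approximately 0.032, as in the printout

R2 = 1 - SSE / SST                                   # approximately 0.637
adjR2 = 1 - (SSE / (n - k - 1)) / (SST / (n - 1))    # approximately 0.420
print(round(F, 3), round(R2, 3), round(adjR2, 3))
```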
3 Marks

(d) The results in (c) partly explain those in (a) and (b): the strong positive correlation between the apparently influential variables age, height and weight makes the results of the multiple regression hard to interpret. In the multiple regression we are not truly observing the effect of each variable manifested through its estimated coefficient; the variables are borrowing strength from each other in the prediction of the response, and when they are fitted together the influence of each covariate is diluted. 2 Marks

MATH 204 ASSIGNMENT 2 SOLUTIONS Page 2 of 2
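(The "dilution" effect described in (d), collinear predictors inflating each other's standard errors, can be illustrated by simulation. A minimal sketch, not part of the original solutions, on synthetic hypothetical data rather than the assignment data:)

```python
import numpy as np

# Two strongly correlated predictors (like age, height and weight here):
# fitted alone, x1 has a small standard error; fitted jointly with x2,
# its standard error is greatly inflated and neither looks significant.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.1 * rng.normal(size=n)        # corr(x1, x2) is roughly 0.995
y = 1.0 + x1 + x2 + rng.normal(size=n)

def ols(X, y):
    """Return OLS estimates and their standard errors."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    return beta, np.sqrt(sigma2 * np.diag(XtX_inv))

_, se_simple = ols(np.column_stack([np.ones(n), x1]), y)
_, se_multi = ols(np.column_stack([np.ones(n), x1, x2]), y)
print(se_simple[1], se_multi[1])   # SE of x1 is far larger in the joint fit
```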

[Printout, page 1] Linear regression for pemax.

pemax on age: R = .613, R² = .376, adjusted R² = .349, s.e. of estimate = 26.974.
ANOVA: Regression SS 10098.478 (df 1), Residual SS 16734.162 (df 23, MS 727.572), Total 26832.640 (df 24); F = 13.880, Sig. = .001.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)      50.408   16.657           3.026   .006   (15.950, 84.866)
  Age in years     4.055    1.088    .613   3.726   .001   (1.803, 6.306)

pemax on height: R = .599, R² = .359, adjusted R² = .331, s.e. of estimate = 27.345.
ANOVA: Regression SS 9634.636 (df 1), Residual SS 17198.004 (df 23, MS 747.739), Total 26832.640 (df 24); F = 12.885, Sig. = .002.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)     -33.276   40.044           -.831   .415   (-116.114, 49.563)
  Height (cm)       .932     .260    .599   3.590   .002   (.395, 1.469)

[Printout, page 2]

pemax on weight: R = .635, R² = .404, adjusted R² = .378, s.e. of estimate = 26.380.
ANOVA: Regression SS 10827.159 (df 1), Residual SS 16005.481 (df 23, MS 695.890), Total 26832.640 (df 24); F = 15.559, Sig. = .001.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)      63.546   12.702           5.003   .000   (37.270, 89.821)
  Weight (kg)      1.187     .301    .635   3.944   .001   (.564, 1.809)

pemax on bmp: R = .230, R² = .053, adjusted R² = .011, s.e. of estimate = 33.244.
ANOVA: Regression SS 1413.464 (df 1), Residual SS 25419.176 (df 23, MS 1105.182), Total 26832.640 (df 24); F = 1.279, Sig. = .270.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)                            59.080   44.744           1.320   .200   (-33.481, 151.641)
  Body mass percentage (% of normal)      .639     .565    .230   1.131   .270   (-.530, 1.809)

[Printout, page 3]

pemax on fev: R = .453, R² = .206, adjusted R² = .171, s.e. of estimate = 30.444.
ANOVA: Regression SS 5515.438 (df 1), Residual SS 21317.202 (df 23, MS 926.835), Total 26832.640 (df 24); F = 5.951, Sig. = .023.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)                  62.114   20.208           3.074   .005   (20.309, 103.918)
  Forced expiratory volume     1.354     .555    .453   2.439   .023   (.206, 2.502)

pemax on rv: R = .316, R² = .100, adjusted R² = .060, s.e. of estimate = 32.411.
ANOVA: Regression SS 2671.777 (df 1), Residual SS 24160.863 (df 23, MS 1050.472), Total 26832.640 (df 24); F = 2.543, Sig. = .124.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)         140.423   20.671           6.793   .000   (97.662, 183.185)
  Residual volume      -.123     .077   -.316  -1.595   .124   (-.282, .036)

[Printout, page 4]

pemax on frc: R = .417, R² = .174, adjusted R² = .138, s.e. of estimate = 31.041.
ANOVA: Regression SS 4670.553 (df 1), Residual SS 22162.087 (df 23, MS 963.569), Total 26832.640 (df 24); F = 4.847, Sig. = .038.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)                      158.706   23.363           6.793   .000   (110.377, 207.035)
  Functional residual capacity      -.319     .145   -.417  -2.202   .038   (-.619, -.019)

pemax on tlc: R = .182, R² = .033, adjusted R² = -.009, s.e. of estimate = 33.588.
ANOVA: Regression SS 885.055 (df 1), Residual SS 25947.585 (df 23, MS 1128.156), Total 26832.640 (df 24); F = .785, Sig. = .385.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)             149.919   46.550           3.221   .004   (53.623, 246.215)
  Total lung capacity      -.358     .404   -.182   -.886   .385   (-1.194, .478)

[Printout, page 5] Linear regression for log(pemax).

log(pemax) on age: R = .591, R² = .349, adjusted R² = .321, s.e. of estimate = .23457.
ANOVA: Regression SS .680 (df 1), Residual SS 1.266 (df 23, MS .055), Total 1.945 (df 24); F = 12.355, Sig. = .002.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)      4.170    .145          28.788   .000   (3.870, 4.470)
  Age in years     .033    .009   .591    3.515   .002   (.014, .053)

log(pemax) on height: R = .589, R² = .347, adjusted R² = .319, s.e. of estimate = .23497.
ANOVA: Regression SS .675 (df 1), Residual SS 1.270 (df 23, MS .055), Total 1.945 (df 24); F = 12.234, Sig. = .002.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)      3.460    .344          10.054   .000   (2.748, 4.171)
  Height (cm)      .008    .002   .589    3.498   .002   (.003, .012)

[Printout, page 6]

log(pemax) on weight: R = .606, R² = .368, adjusted R² = .340, s.e. of estimate = .23129.
ANOVA: Regression SS .715 (df 1), Residual SS 1.230 (df 23, MS .053), Total 1.945 (df 24); F = 13.364, Sig. = .001.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)      4.281    .111          38.445   .000   (4.051, 4.512)
  Weight (kg)      .010    .003   .606    3.656   .001   (.004, .015)

log(pemax) on bmp: R = .193, R² = .037, adjusted R² = -.005, s.e. of estimate = .28536.
ANOVA: Regression SS .072 (df 1), Residual SS 1.873 (df 23, MS .081), Total 1.945 (df 24); F = .888, Sig. = .356.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)                            4.294    .384          11.179   .000   (3.499, 5.088)
  Body mass percentage (% of normal)     .005    .005   .193     .943   .356   (-.005, .015)

[Printout, page 7]

log(pemax) on fev: R = .418, R² = .175, adjusted R² = .139, s.e. of estimate = .26419.
ANOVA: Regression SS .340 (df 1), Residual SS 1.605 (df 23, MS .070), Total 1.945 (df 24); F = 4.872, Sig. = .038.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)                  4.283    .175          24.422   .000   (3.920, 4.645)
  Forced expiratory volume     .011    .005   .418    2.207   .038   (.001, .021)

log(pemax) on rv: R = .316, R² = .100, adjusted R² = .060, s.e. of estimate = .27596.
ANOVA: Regression SS .194 (df 1), Residual SS 1.752 (df 23, MS .076), Total 1.945 (df 24); F = 2.544, Sig. = .124.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)         4.918    .176          27.945   .000   (4.554, 5.282)
  Residual volume    -.001    .001  -.316   -1.595   .124   (-.002, .000)

[Printout, page 8]

log(pemax) on frc: R = .409, R² = .167, adjusted R² = .131, s.e. of estimate = .26536.
ANOVA: Regression SS .326 (df 1), Residual SS 1.620 (df 23, MS .070), Total 1.945 (df 24); F = 4.625, Sig. = .042.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)                      5.066    .200          25.365   .000   (4.653, 5.479)
  Functional residual capacity    -.003    .001  -.409   -2.151   .042   (-.005, .000)

log(pemax) on tlc: R = .171, R² = .029, adjusted R² = -.013, s.e. of estimate = .28656.
ANOVA: Regression SS .057 (df 1), Residual SS 1.889 (df 23, MS .082), Total 1.945 (df 24); F = .689, Sig. = .415.
Coefficients (B, s.e., Beta, t, Sig., 95% CI):
  (Constant)             4.978    .397          12.534   .000   (4.156, 5.800)
  Total lung capacity    -.003    .003  -.171    -.830   .415   (-.010, .004)

[Printout, pages 9-13] General linear model analysis for pemax, one covariate at a time. Each table reports the covariate (Corrected Model) sum of squares and its F test, the intercept test, the Error SS (df 23), the uncorrected Total SS = 32452.000 (df 25) and the Corrected Total SS = 26832.640 (df 24). The covariate F tests and the parameter estimates coincide with the regression output on pages 1-4.

Model    Covariate SS   F        Sig.   Intercept SS   F         Sig.   Error SS (MS)
age      10098.478      13.880   .001     6663.073       9.158   .006   16734.162 (727.572)
height    9634.636      12.885   .002      516.320        .690   .415   17198.004 (747.739)
weight   10827.159      15.559   .001    17417.831      25.030   .000   16005.481 (695.890)
bmp       1413.464       1.279   .270     1926.820       1.743   .200   25419.176 (1105.182)
fev       5515.438       5.951   .023     8756.219       9.447   .005   21317.202 (926.835)
rv        2671.777       2.543   .124    48477.536      46.148   .000   24160.863 (1050.472)
frc       4670.553       4.847   .038    44466.107      46.147   .000   22162.087 (963.569)
tlc        885.055        .785   .385    11701.531      10.372   .004   25947.585 (1128.156)
sex       2234.413       2.089   .162   287280.103     268.614   .000   24598.227 (1069.488)

Page 13, sex model. Between-subjects factors: Sex of patient, 0 = Male (N = 14), 1 = Female (N = 11). Parameter estimates:
  Intercept   98.455 (s.e. 9.860),  t = 9.985, Sig. .000, 95% CI (78.057, 118.852)
  [sex=0]     19.045 (s.e. 13.176), t = 1.445, Sig. .162, 95% CI (-8.212, 46.303)
  [sex=1]     0 (redundant parameter, set to zero)

[Printout, pages 14-18] General linear model analysis for log(pemax), one covariate at a time; uncorrected Total SS = 542.926 (df 25), Corrected Total SS = 1.945 (df 24). Again the covariate F tests and parameter estimates coincide with the regression output on pages 5-8.

Model    Covariate SS   F        Sig.   Intercept SS   F          Sig.   Error SS (MS)
age      .680           12.355   .002    45.600         828.753   .000   1.266 (.055)
height   .675           12.234   .002     5.581         101.082   .000   1.270 (.055)
weight   .715           13.364   .001    79.069        1478.054   .000   1.230 (.053)
bmp      .072             .888   .356    10.177         124.980   .000   1.873 (.081)
fev      .340            4.872   .038    41.627         596.425   .000   1.605 (.070)
rv       .194            2.544   .124    59.471         780.917   .000   1.752 (.076)
frc      .326            4.625   .042    45.305         643.374   .000   1.620 (.070)
tlc      .057             .689   .415    12.902         157.120   .000   1.889 (.082)
sex      .146            1.865   .185   531.076        6788.253   .000   1.799 (.078)

Page 18, sex model. Between-subjects factors: Sex of patient, 0 = Male (N = 14), 1 = Female (N = 11). Parameter estimates:
  Intercept   4.566 (s.e. .084), t = 54.371, Sig. .000, 95% CI (4.391, 4.740)
  [sex=0]      .154 (s.e. .113), t = 1.366,  Sig. .185, 95% CI (-.079, .387)
  [sex=1]     0 (redundant parameter, set to zero)

[Printout, page 19] Multiple regression for pemax (general linear model, all covariates and the factor sex).

Between-subjects factors: Sex of patient, 0 = Male (N = 14), 1 = Female (N = 11).

Tests of between-subjects effects:
Source            SS           df   MS          F       Sig.
Corrected Model   17101.390     9   1900.154    2.929   .032
Intercept           396.750     1    396.750     .612   .446
age                 181.831     1    181.831     .280   .604
height              158.317     1    158.317     .244   .628
weight             1441.231     1   1441.231    2.222   .157
bmp                1480.213     1   1480.213    2.281   .152
fev                 648.450     1    648.450    1.000   .333
rv                  653.775     1    653.775    1.008   .331
frc                 254.552     1    254.552     .392   .540
tlc                  92.404     1     92.404     .142   .711
sex                  37.902     1     37.902     .058   .812
Error              9731.250    15    648.750
Total             32452.000    25
Corrected Total   26832.640    24

Parameter estimates (B, s.e., t, Sig., 95% CI):
Intercept    172.321   219.821    .784   .445   (-296.215, 640.858)
age           -2.542     4.802   -.529   .604   (-12.777, 7.693)
height         -.446      .903   -.494   .628   (-2.372, 1.479)
weight         2.993     2.008   1.490   .157   (-1.287, 7.273)
bmp           -1.745     1.155  -1.510   .152   (-4.207, .717)
fev            1.081     1.081   1.000   .333   (-1.223, 3.385)
rv              .197      .196   1.004   .331   (-.221, .615)
frc            -.308      .492   -.626   .540   (-1.358, .741)
tlc             .189      .500    .377   .711   (-.877, 1.254)
[sex=0]        3.737    15.460    .242   .812   (-29.215, 36.689)
[sex=1]        0 (redundant parameter, set to zero)

[Printout, page 20] Diagnostic plots for the fitted model (Intercept + age + height + weight + bmp + fev + rv + frc + tlc + sex): standardised residuals against predicted values, and observed against predicted values.

[Printout, page 21] Multiple regression for log(pemax) (general linear model, all covariates and the factor sex).

Tests of between-subjects effects:
Source            SS        df   MS      F       Sig.
Corrected Model    1.142     9   .127    2.370   .067
Intercept           .367     1   .367    6.857   .019
age                 .021     1   .021     .394   .540
height              .005     1   .005     .097   .760
weight              .104     1   .104    1.933   .185
bmp                 .127     1   .127    2.364   .145
fev                 .036     1   .036     .670   .426
rv                  .030     1   .030     .558   .466
frc                 .015     1   .015     .279   .605
tlc                 .006     1   .006     .114   .740
sex                 .003     1   .003     .049   .828
Error               .803    15   .054
Total            542.926    25
Corrected Total    1.945    24

Parameter estimates (B, s.e., t, Sig., 95% CI):
Intercept    5.283   1.997   2.646   .018   (1.027, 9.540)
age          -.027    .044   -.628   .540   (-.120, .066)
height       -.003    .008   -.311   .760   (-.020, .015)
weight        .025    .018   1.390   .185   (-.014, .064)
bmp          -.016    .010  -1.537   .145   (-.039, .006)
fev           .008    .010    .818   .426   (-.013, .029)
rv            .001    .002    .747   .466   (-.002, .005)
frc          -.002    .004   -.528   .605   (-.012, .007)
tlc           .002    .005    .338   .740   (-.008, .011)
[sex=0]       .031    .140    .221   .828   (-.268, .330)
[sex=1]       0 (redundant parameter, set to zero)

[Printout, page 22] Diagnostic plots for the log(pemax) model (Intercept + age + height + weight + bmp + fev + rv + frc + tlc + sex): standardised residuals against predicted values, and observed against predicted values.

[Printout, page 23] Correlation matrix for continuous covariates, response and log response (Pearson correlations; N = 25 throughout; ** significant at the 0.01 level, * at the 0.05 level, both 2-tailed). Lower triangle:

             age      height   weight   bmp      fev      rv       frc      tlc      pemax
height       .926**
weight       .906**   .920**
bmp          .378     .441*    .673**
fev          .294     .317     .449*    .546**
rv          -.552**  -.570**  -.622**  -.582**  -.666**
frc         -.639**  -.624**  -.617**  -.434*   -.665**   .911**
tlc         -.469*   -.457*   -.418*   -.365    -.443*    .589**   .704**
pemax        .613**   .599**   .635**   .230     .453*   -.316    -.417*   -.182
log(pemax)   .591**   .589**   .606**   .193     .418*   -.316    -.409*   -.171    .989**