14.0 RESPONSE SURFACE METHODOLOGY (RSM)


(Updated Spring)

So far, we've focused on experiments that:
- Identify a few important variables from a large set of candidate variables, i.e., a screening experiment.
- Ascertain how a few variables impact the response.

Now we want to answer the question: what specific levels of the important variables produce an optimum response?

[Figure: contour plot of the response surface over (x1, x2), showing a starting point on the side of the hill and the optimum at the top.] How do we get from the starting position (x1, x2) to the optimum position (x1*, x2*), the top of the hill?

Our approach will be based on characterizing the true response surface, y = η + ε, where η is the mean response and ε is the error, with a fitted model ŷ.

RSM: an experimental optimization procedure.
1. Plan and run a two-level factorial (or fractional factorial) design near/at our starting point (x1, x2).
2. Fit a linear model (no interaction or quadratic terms) to the data.
3. Determine the path of steepest ascent (PSA) - a quick, gradient-based way to move toward the optimum.
4. Run tests on the PSA until the response no longer improves.

5. If the curvature of the surface is large, go to step 6; else go back to step 1.
6. In the neighborhood of the optimum: design, run, and fit (using least squares) a 2nd-order model.
7. Based on the 2nd-order model, pick the optimal settings of the independent variables.

Example: Froth Flotation Process - we desire to whiten a wood pulp mixture.
2 independent variables: (1) concentration of environmentally safe bleach, (2) mixture temperature.
Current operating point: %Bleach = 4%, Temp = 80°F. We believe we are well away from the optimum.
Allowable variable ranges: -% Bleach, 6-3 Temp.

An initial 2² factorial design:

Variable       Low Level   Midpoint   High Level
(1) %Bleach    2           4          6
(2) Temp       75          80         85

[Figure: the 2² design square in coded (x1, x2) units, with the observed whiteness ŷ at each corner.]
From the four corner responses: average ȳ; effects E1 = 14, E2 = 11; the interaction effect E12 is small relative to E1 and E2, so the response surface is fairly linear in this region.
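
As a concrete sketch of the effect calculations above (the corner responses here are made-up placeholders, since the original whiteness values are not reproduced; the first-order coefficients are half the effects):

```python
import numpy as np

# Corner responses of a 2^2 factorial, in standard order
# (x1, x2) = (-1,-1), (+1,-1), (-1,+1), (+1,+1). Illustrative values only.
y  = np.array([62.0, 76.0, 73.0, 88.5])
x1 = np.array([-1, +1, -1, +1])
x2 = np.array([-1, -1, +1, +1])

average = y.mean()
E1  = y[x1 == +1].mean() - y[x1 == -1].mean()            # main effect of x1
E2  = y[x2 == +1].mean() - y[x2 == -1].mean()            # main effect of x2
E12 = y[x1 * x2 == +1].mean() - y[x1 * x2 == -1].mean()  # interaction effect

# First-order model yhat = b0 + b1*x1 + b2*x2, with b0 = average, b = effect/2.
b1, b2 = E1 / 2, E2 / 2
print(f"average={average:.2f}  E1={E1:.2f}  E2={E2:.2f}  E12={E12:.2f}  "
      f"b1={b1:.2f}  b2={b2:.2f}")
# A small |E12| relative to |E1| and |E2| supports describing the local
# surface with a plane.
```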

Linear model: y = b0 + b1 x1 + b2 x2 + ε

Fitted model: ŷ = ȳ + 7 x1 + 5.5 x2, where the intercept ȳ is the average of the factorial responses.

A one-unit change in x1 is a change of 2% in Bleach; a one-unit change in x2 is a change of 5 deg. in Temperature:

x1 = (%Bleach - 4)/2,  so  %Bleach = 2 x1 + 4
x2 = (Temp - 80)/5,    so  Temp = 5 x2 + 80

We want to move from our starting point (x1 = 0, x2 = 0, i.e., %Bleach = 4, Temp = 80) in the direction that will increase the response the fastest. Thus, we want to move in a direction normal to the contours.

Path of steepest ascent (PSA): the line passing through the point (x1 = 0, x2 = 0) that is perpendicular to the contours.

Define the contour passing through (0, 0). The relation between x1 and x2 for this contour (ŷ = ȳ) is

7 x1 + 5.5 x2 = 0,  or  x2 = -(7/5.5) x1

The line normal to this one is the PSA; its slope is the negative reciprocal of the contour slope:

x2 = (5.5/7) x1

Interpretation: for a one-unit change in x1, x2 must change by 5.5/7 units to remain on the PSA. A 7-unit change in x1 -> a 5.5-unit change in x2. The gradient of ŷ may also be used to determine the PSA.

Let's define some points (candidate tests) on the path of steepest ascent (PSA).

Candidate Test    x1    x2 = (5.5/7) x1    %Bleach = 2 x1 + 4    Temp = 5 x2 + 80    y

Since it appears that the response surface is fairly linear (E12 is small) where we conducted the tests, there is no reason to examine/conduct tests 1 or 2. Run the 1st test on or beyond the boundary defined by x1 = +1, x2 = +1, and continue running tests on the PSA until the response no longer increases.

The highest response occurs at (Bleach = 13%, Temp = 98°F). Run a new factorial at this point.

Summary so far: we ran the initial factorial design, defined the path of steepest ascent, and moved along the PSA to Bleach = 13%, Temperature = 98°F, Whiteness = 86%. Now conduct another factorial design near this point.
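
The candidate-test bookkeeping above is easy to script. A sketch, assuming the fitted coefficients b1 = 7 and b2 = 5.5 and the coding %Bleach = 2 x1 + 4, Temp = 5 x2 + 80 from the example (the unit steps in x1 are an arbitrary choice):

```python
# Candidate tests along the path of steepest ascent (PSA).
b1, b2 = 7.0, 5.5                 # fitted first-order coefficients (coded units)

for step in range(1, 6):          # take unit steps in x1
    x1 = float(step)
    x2 = (b2 / b1) * x1           # stay on the PSA: x2 = (5.5/7) * x1
    bleach = 2.0 * x1 + 4.0       # back to natural units: %Bleach = 2*x1 + 4
    temp = 5.0 * x2 + 80.0        # Temp = 5*x2 + 80
    print(f"test {step}: x1={x1:.1f}  x2={x2:.2f}  "
          f"%Bleach={bleach:.1f}  Temp={temp:.1f}")
# Run the tests in order and stop when the observed response stops improving.
```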

[Figure: the second 2² factorial design in coded (x1, x2) units, Bleach on one axis and Temp on the other, with the observed whiteness at each corner.]
From this design: average = 79; effects E1 = 6, E2 = -5, E12 = -7 (the interaction is fairly large).

ŷ = 79 + 3 x1 - 2.5 x2 describes the response surface with a plane. At x1 = 0, x2 = 0, ŷ = 79.

The combinations of (x1, x2) that give ŷ = 79 are specified by

3 x1 - 2.5 x2 = 0,  or  x2 = (3/2.5) x1 = 1.2 x1   <- (x1, x2) relationship along the 79 contour

PSA: x2/x1 = -2.5/3,  or  x2 = -(2.5/3) x1 = -0.833 x1

Let's follow the PSA but take steps in the x1 direction, since E1 is bigger (in magnitude) than E2.

Candidate Test    x1    x2 = -0.833 x1    %Bleach = 13 + x1    Temp = 97 + 5 x2    y

The highest response is at Bleach = 14.5%, Temp = 90.8°F -----> roughly 90% whiteness. Note that we weren't able to move as far along the path this time before we "fell off."

We now believe we are fairly close to the optimum, so we must characterize the response surface with more than a plane, perhaps with a model of the form

y = b0 + b1 x1 + b2 x2 + b12 x1 x2 + b11 x1² + b22 x2² + ε

To estimate the b's in this model, we need more than a 2-level factorial design. A Central Composite Design (CCD) is more efficient than a 3-level factorial design. Given the results of a 2nd-order design, we must use least squares (linear regression) to estimate the b's. We will examine the topic of regression after we finish up the RSM procedure for our example.

Fit the following model to the response surface near the point

%Bleach = 14.5, Temp = 90.8, Whiteness ≈ 90%.

Note a small change in nomenclature: we use B's for the model parameters to be estimated and b's for the estimates of those parameters. So, we are assuming the response surface can be characterized by the model

y = B0 + B1 x1 + B2 x2 + B12 x1 x2 + B11 x1² + B22 x2² + ε

The fitted model will be of the form

ŷ = b0 + b1 x1 + b2 x2 + b12 x1 x2 + b11 x1² + b22 x2²

with b = (b0, b1, b2, b12, b11, b22)ᵀ = (XᵀX)⁻¹ Xᵀ ỹ.

Three-Level Factorial Design
[Figure: the 3² design points in (x1, x2) and the 3³ design points in (x1, x2, x3).]
For three-level factorial designs, or 3^k designs, the number of tests in the design is 3^k, where k is the number of variables being examined in the experiment.

For example, a 3^k design requires 9 tests for k = 2, 27 tests for k = 3, and 81 tests for k = 4.

Central Composite Design
[Figure: CCD point patterns for k = 2 and k = 3 - factorial points at ±1, axial (star) points at ±α, and center points.]

In general, a central composite design (CCD) consists of the 2^k points that form the base design, 2k star (axial) points, and several center points.

Central Composite Designs (adapted from Montgomery)
[Table: for each k - the factorial portion F, the axial distance α, the number of center points n for uniform precision and for orthogonality, and the total number of tests N for each case.]

Number of factorial points: F = 2^k. Choosing α = (F)^(1/4) gives the rotatability property: the variance contours of ŷ are concentric circles. Uniform precision (unif. prec.): the variance at the origin is the same as the variance one unit away from the origin.
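
A sketch of how the coded points of a rotatable CCD can be generated; the number of center points below is a placeholder (in practice it would come from a table like the one above):

```python
import itertools
import numpy as np

def central_composite(k, n_center=5):
    """Coded points of a rotatable CCD: 2^k factorial, 2k axial, center runs."""
    F = 2 ** k                                    # size of the factorial portion
    factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    alpha = F ** 0.25                             # axial distance for rotatability
    axial = np.vstack([sign * alpha * np.eye(k)[i]
                       for i in range(k) for sign in (-1.0, 1.0)])
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])

print(central_composite(k=2))   # 4 factorial + 4 axial + 5 center = 13 runs
```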

Variable Levels
[Table: the settings of Bleach (%) and Temp (deg) at the five coded levels -α, -1, 0, +1, +α.]

Test Plan
[Table: for each test of the CCD - the coded levels x1 and x2 and the corresponding %Bleach and Temp.]

Calculating the Parameter Estimates
[The design matrix X (columns 1, x1, x2, x1 x2, x1², x2²) and the response vector ỹ for the CCD tests.]

[The matrices XᵀX, (XᵀX)⁻¹, and Xᵀỹ computed from the design matrix and the response vector.]

b = (b0, b1, b2, b12, b11, b22)ᵀ = (XᵀX)⁻¹ Xᵀ ỹ
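
A sketch of this least-squares computation for the second-order model; the 13 coded runs follow the k = 2 rotatable CCD, and the responses are placeholders rather than the original whiteness data:

```python
import numpy as np

a = 2.0 ** 0.5                   # axial distance alpha = F^(1/4) = sqrt(2) for k = 2
x1 = np.array([-1, 1, -1, 1, -a, a, 0, 0, 0, 0, 0, 0, 0], dtype=float)
x2 = np.array([-1, -1, 1, 1, 0, 0, -a, a, 0, 0, 0, 0, 0], dtype=float)
y  = np.array([88, 90, 87, 89, 86, 91, 85, 90, 92, 91, 92, 91, 92], dtype=float)

# Design matrix columns: 1, x1, x2, x1*x2, x1^2, x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
b = np.linalg.solve(X.T @ X, X.T @ y)            # b0, b1, b2, b12, b11, b22
print(np.round(b, 3))
```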

The fitted model is shown in the graph below.
[Figure: contour plot of the fitted second-order model ŷ over (x1, x2).]

Linear Regression

[Figure: scatter plot of the observations y versus x.]

To describe the data above, propose the model y = B0 + B1 x + ε. The fitted model will then be ŷ = b0 + b1 x.

We want to select the values of b0 and b1 that minimize Σ(yi - ŷi)², i = 1, ..., n (here n = 6).

Define S(b0, b1) = Σ(yi - ŷi)² : the model residual Sum of Squares. Minimize

S(b0, b1) = Σ(yi - ŷi)² = Σ(yi - b0 - b1 xi)²

To find the minimum of S, take the partial derivatives of S with respect to b0 and b1, set these equal to zero, and solve for b0 and b1:

∂S(b0, b1)/∂b0 = Σ(yi - b0 - b1 xi)(-1) = -Σyi + Σb0 + Σb1 xi = 0
∂S(b0, b1)/∂b1 = Σ(yi - b0 - b1 xi)(-xi) = -Σxi yi + Σb0 xi + Σb1 xi² = 0

Simplifying, we obtain

n b0 + b1 Σxi = Σyi
b0 Σxi + b1 Σxi² = Σxi yi

These two equations are known as the Normal Equations. The values of b0 and b1 that satisfy the Normal Equations are the least squares estimates -- they are the values that give a minimum S. In matrix form:

[ n    Σxi  ] [ b0 ]   [ Σyi    ]
[ Σxi  Σxi² ] [ b1 ] = [ Σxi yi ]

Least Squares Estimates:

[ b0 ]   [ n    Σxi  ]⁻¹ [ Σyi    ]          1          [  Σxi²  -Σxi ] [ Σyi    ]
[ b1 ] = [ Σxi  Σxi² ]   [ Σxi yi ]  =  -------------- · [ -Σxi    n   ] [ Σxi yi ]
                                        nΣxi² - (Σxi)²

which gives

b0 = (Σxi² Σyi - Σxi Σxi yi) / (nΣxi² - (Σxi)²)
b1 = (nΣxi yi - Σxi Σyi) / (nΣxi² - (Σxi)²)

b0 and b1 are the values that minimize S, the Residual Sum of Squares:
b0 = B̂0, an estimate of B0;  b1 = B̂1, an estimate of B1.
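
A sketch of this closed-form least-squares solution computed from the summary sums; the six (x, y) pairs are illustrative, not the data set plotted in the notes:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # illustrative data, n = 6
y = np.array([1.8, 2.7, 2.9, 4.2, 4.8, 5.9])
n = len(x)

Sx, Sy   = x.sum(), y.sum()
Sxx, Sxy = (x * x).sum(), (x * y).sum()

den = n * Sxx - Sx**2                           # n*Sum(x^2) - (Sum x)^2
b0 = (Sxx * Sy - Sx * Sxy) / den                # from the Normal Equations
b1 = (n * Sxy - Sx * Sy) / den
print(f"b0 = {b0:.4f}, b1 = {b1:.4f}")
```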

Matrix Approach

[The example data (y versus x for the n = 6 observations) written out as vectors and matrices.]

ỹ = (y1, y2, ..., yn)ᵀ : Vector of Observations
X : Matrix of Independent Variables (Design Matrix), with rows (1, xi)
ŷ = (ŷ1, ŷ2, ..., ŷn)ᵀ = X b,  where b = (b0, b1)ᵀ is the vector of coefficients.

ẽ = (e1, e2, ..., en)ᵀ = ỹ - ŷ : Vector of Prediction Errors (residuals)

We want to minimize ẽᵀẽ, i.e., minimize (ỹ - Xb)ᵀ(ỹ - Xb). Take the derivative with respect to the b's and set it to zero:

-2 Xᵀỹ + 2 (XᵀX) b = 0,  so  (XᵀX) b = Xᵀỹ

Therefore, b = (XᵀX)⁻¹ Xᵀỹ. This is analogous to

[ n    Σx  ] [ b0 ]   [ Σy  ]
[ Σx   Σx² ] [ b1 ] = [ Σxy ]

If we re-run the experiment several times, we obtain a different set of estimates each time (e.g., .4667, .539, .55, ...): the b's are random variables. If the true model is y = B0 + B1 x + ε, then E(b0) = B0 and E(b1) = B1, i.e., E[b] = B.
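
The matrix form gives the same estimates as the Normal Equations. A sketch using the same illustrative data as the previous block:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])    # same illustrative data
y = np.array([1.8, 2.7, 2.9, 4.2, 4.8, 5.9])

X = np.column_stack([np.ones_like(x), x])        # design matrix: columns 1, x
b = np.linalg.solve(X.T @ X, X.T @ y)            # b = (X'X)^-1 X'y
e = y - X @ b                                    # residuals e = y - Xb
print("b =", np.round(b, 4), " e'e =", round(float(e @ e), 4))
```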

The b's are unbiased, and their covariance matrix is

Var(b) = (XᵀX)⁻¹ σy²

where σy² (= σε²) describes the experimental-error variation in the y's. For our two-parameter example,

Var(b) = [ Var(b0)       Cov(b0, b1) ]
         [ Cov(b0, b1)   Var(b1)     ]

If σy² (or σε²) is unknown, we can estimate it with

s² = (ỹ - ŷ)ᵀ(ỹ - ŷ) / (n - # of parameters) = ẽᵀẽ / (n - p)

and for the example,

sy² = Sres/ν = Σ(yi - ŷi)² / (n - 2) = .6769   (n = 6)

Then Var̂(b) = (XᵀX)⁻¹ sy², giving Var̂(b0) = .586 and sb0 = .767 (the standard error of b0), and Var̂(b1) = .039 and sb1 = .197 (the standard error of b1).
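
A sketch of the variance calculation, continuing the illustrative fit above (so the numbers will not match the .6769, .767, .197 values quoted for the notes' data set):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])     # illustrative data again
y = np.array([1.8, 2.7, 2.9, 4.2, 4.8, 5.9])
X = np.column_stack([np.ones_like(x), x])
b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b

n, p = len(y), X.shape[1]
s2 = float(e @ e) / (n - p)                      # s^2 = e'e / (n - p)
cov_b = s2 * np.linalg.inv(X.T @ X)              # Var(b) = (X'X)^-1 s^2
se_b = np.sqrt(np.diag(cov_b))                   # standard errors s_b0, s_b1
print(f"s^2 = {s2:.4f}  s_b0 = {se_b[0]:.4f}  s_b1 = {se_b[1]:.4f}")
```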

Marginal Confidence Intervals for Parameter Values

Given our parameter estimates, we can develop (1-α)100% confidence intervals for the unknown parameter values. The confidence intervals take the form

bi ± t(α/2, ν) sbi

For our example (ν = n - p = 4, t(0.025, 4) = 2.776):

For B0:  .4667 ± (2.776)(.767)
For B1:  .943 ± (2.776)(.197) = .943 ± .546

Note that these confidence intervals do not consider the fact that the parameter estimates may be correlated.

Joint Confidence Interval for Parameter Values

The joint confidence interval is (approximately) bounded by a Sum of Squares contour, S. Considering all p coefficients simultaneously,

S = SR [1 + (p/(n - p)) F(p, n - p, α)]

where SR = S(b) = Σ(yi - ŷi)², summed over the n observations.

Sum of Squares contours are only circles if b0 and b1 are uncorrelated, i.e., if (XᵀX)⁻¹ is diagonal. Sum of Squares contours often appear as ellipses.
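
A sketch of the marginal intervals bi ± t(α/2, ν) sbi, with the t quantile taken from scipy rather than a table (illustrative data as before):

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])     # illustrative data as before
y = np.array([1.8, 2.7, 2.9, 4.2, 4.8, 5.9])
X = np.column_stack([np.ones_like(x), x])
b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b

nu = len(y) - X.shape[1]                          # degrees of freedom n - p = 4
s2 = float(e @ e) / nu
se_b = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))

t = stats.t.ppf(0.975, nu)                        # t(0.025, 4) = 2.776
for i, (bi, si) in enumerate(zip(b, se_b)):
    print(f"B{i}: {bi:.4f} +/- {t * si:.4f}")     # 95% marginal confidence interval
```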

Confidence Interval for the Mean Response

The predicted response at an arbitrary point x0 is ŷ0 = x0 b. Due to the uncertainty in the b's, there is uncertainty in ŷ0 (remember, ŷ0 is the best estimate available of the mean response at x0):

Var(ŷ0) = x0 (XᵀX)⁻¹ x0ᵀ σy²

Of course, if the variance must be estimated from the data, we obtain

Var̂(ŷ0) = x0 (XᵀX)⁻¹ x0ᵀ sy²

We can develop a (1-α)100% C.I. for the true mean response µ(x0):

ŷ0 ± t(α/2, ν) [Var̂(ŷ0)]^(1/2)

[Figure: the fitted regression line with the confidence band for the mean response, y versus x.]

Prediction Interval

Even if we knew ŷ0 for a given x0 with certainty, there would still be variation in the responses due to system noise (ε). The variance of the mean of g future trials at x0 is

Var(ȳg) = sy²/g + Var̂(ŷ0)

We can define limits within which the mean of g future observations will fall. These limits for a future outcome are known as a (1-α)100% prediction interval:

ŷ0 ± t(α/2, ν) [sy²/g + Var̂(ŷ0)]^(1/2)
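
A sketch of the interval for the mean response and the prediction interval at a chosen point x0 (same illustrative fit; x0 = 3.5 and g = 1 are arbitrary choices):

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])      # illustrative data as before
y = np.array([1.8, 2.7, 2.9, 4.2, 4.8, 5.9])
X = np.column_stack([np.ones_like(x), x])
XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
e = y - X @ b
nu = len(y) - X.shape[1]
s2 = float(e @ e) / nu
t = stats.t.ppf(0.975, nu)

x0 = np.array([1.0, 3.5])                          # row (1, x) at x = 3.5
yhat0 = float(x0 @ b)
var_mean = float(x0 @ XtX_inv @ x0) * s2           # Var(yhat0) = x0 (X'X)^-1 x0' s^2
g = 1                                              # one future observation
var_pred = s2 / g + var_mean                       # add the noise variance s^2/g
print(f"mean response:      {yhat0:.3f} +/- {t * np.sqrt(var_mean):.3f}")
print(f"prediction (g = 1): {yhat0:.3f} +/- {t * np.sqrt(var_pred):.3f}")
```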

[Figure: the fitted regression line with the prediction band, y versus x.]

Model Adequacy Checking

1. Residual Analysis
Model adequacy can be checked by looking at the residuals. The normal probability plot should not show violations of the normality assumption. If there are factors not included in the model that are of potential interest, the residuals should also be plotted against these omitted factors; any structure in such a plot would indicate that the model could be improved by adding that factor.

2. F test
If the residual sum of squares of the original model is S1, and the residual sum of squares of a new model with s factors added is S2, then

(S1/ν1) / (S2/ν2) ~ F(ν1, ν2),  with  ν1 = n - p  and  ν2 = n - p - s

where n is the total number of observations and p is the number of parameters in the original model.
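
A sketch of the F comparison above, with scipy supplying the reference distribution; all of the numbers (n, p, s, S1, S2) are placeholders:

```python
from scipy import stats

n, p, s_added = 20, 3, 2        # observations, original parameters, added factors
S1, S2 = 42.0, 25.0             # residual sums of squares (placeholder values)

nu1, nu2 = n - p, n - p - s_added
F = (S1 / nu1) / (S2 / nu2)     # ratio of residual mean squares, as in the notes
p_value = stats.f.sf(F, nu1, nu2)
print(f"F = {F:.3f} on ({nu1}, {nu2}) d.o.f., p = {p_value:.3f}")
# A large F (small p-value) suggests the added factors improve the model.
```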
