J. Response Surface Methodology


Chemical Example (Box, Hunter & Hunter)

Objective: find settings of R (reaction time) and T (temperature) that produce maximum yield, subject to (1) a color index greater than 8 and (2) a viscosity of the product of at least 70 units and no more than 80 units. Current settings: R = 75 minutes and T = 130 °C. Known standard deviation: from past experience, σ = 1.5, and it is unlikely that this value will change in the experiment to be performed. Focus on maximizing the yield for now.

1. First stage of experimentation:

Background: the experimenter believed, but was not certain, that he was some distance away from the maximum (some way down the smooth hillside that represents the true response surface). The predominant local characteristics of the surface were its gradients, so the local surface could be roughly represented by a planar model with slope β1 in the x1 direction and slope β2 in the x2 direction:

y = β0 + β1 x1 + β2 x2 + ε   (first-order model, or planar model),

where x1 and x2 are coded variables for time and temperature, respectively.

Design chosen: a first-order design, a 2^2 factorial with 3 replicates of the center point.

run    R (min)   T (°C)    x1    x2    yield y (grams)
 1       70      127.5     -1    -1        54.3
 2       80      127.5     +1    -1        60.3
 3       70      132.5     -1    +1        64.6
 4       80      132.5     +1    +1        68.0
 5       75      130.0      0     0        60.3
 6       75      130.0      0     0        64.3
 7       75      130.0      0     0        62.3

Range of R: 70 to 80 minutes; range of T: 127.5 to 132.5 °C.

Coded variables:
x1 = (R - 75 minutes)/(5 minutes),   x2 = (T - 130 °C)/(2.5 °C).
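The coded variables are just a linear rescaling of the natural units. A minimal sketch of the coding and decoding, using the centers and half-widths stated in the notes (75 ± 5 minutes, 130 ± 2.5 °C):

```python
# Coding of natural factor levels into the [-1, 1] coded variables used by
# the first design (centers and half-widths as stated in the notes).

def code(value, center, half_width):
    """Map a natural factor level to its coded value."""
    return (value - center) / half_width

def decode(x, center, half_width):
    """Map a coded value back to natural units."""
    return center + x * half_width

# x1 codes reaction time R, x2 codes temperature T.
x1 = code(80.0, center=75.0, half_width=5.0)     # high level of R -> +1.0
x2 = code(127.5, center=130.0, half_width=2.5)   # low level of T -> -1.0
print(x1, x2)
print(decode(0.0, 75.0, 5.0), decode(0.0, 130.0, 2.5))  # center point: 75.0 130.0
```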

Analysis: least squares fit.

β̂1 = ½(main effect of R) = ½(4.70) = 2.35
β̂2 = ½(main effect of T) = ½(9.00) = 4.50
std.err(effect) = 2σ/√nf = 2(1.5)/√4 = 1.5
std.err(coefficient) = ½ std.err(effect) = 1.5/2 = 0.75
β̂0 = ȳ = 62.01,   std.err(β̂0) = σ/√7 = 1.5/√7 = 0.57

Fitted equation: ŷ = 62.01 + 2.35 x1 + 4.50 x2.

Diagnostic checks for the adequacy of the planar model:

Interaction check (insignificant interaction expected). Note: the interaction model is y = β0 + β1 x1 + β2 x2 + β12 x1 x2 + ε.
β̂12 = ½(RT interaction effect) = ½(-1.30) = -0.65,
which is less than 2 std.err(coefficient) = 1.5 in magnitude, and consequently the interaction is insignificant.

Curvature check (insignificant curvature expected). Note: the second-order model is y = β0 + β1 x1 + β2 x2 + β11 x1² + β12 x1 x2 + β22 x2² + ε. The column x1² = x2² provides a contrast for estimating β11 + β22:
(β̂11 + β̂22) = ȳf - ȳc = -0.50,
Var(β̂11 + β̂22) = Var(ȳf) + Var(ȳc) = σ²(1/nf + 1/nc),
std.err(curvature) = σ √(1/nf + 1/nc) = 1.5 √(1/4 + 1/3) = 1.15.
The curvature effect is insignificant since its magnitude is less than twice the standard error of the curvature.

Lack-of-fit check (no lack of fit expected). Pure error SS and MS, computed from the center points:
SSE = (60.3 - 62.3)² + (64.3 - 62.3)² + (62.3 - 62.3)² = 8.0,   MSE = s² = SSE/2 = 4.0.
SSR = (nf/4)(main effect of R)² = (4/4)(4.70)² = 22.09.
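The effect and standard-error calculations can be sketched as follows. The four factorial yields used here are an assumption, reconstructed so as to reproduce the effects 4.70, 9.00, and -1.30 quoted in the notes:

```python
import numpy as np

# Effects for the 2^2 factorial portion of the first design. Yields are in
# standard order (x1, x2) = (-1,-1), (+1,-1), (-1,+1), (+1,+1); the values
# below are reconstructed to match the effects quoted in the notes.
y = np.array([54.3, 60.3, 64.6, 68.0])
x1 = np.array([-1, 1, -1, 1])
x2 = np.array([-1, -1, 1, 1])

def main_effect(contrast, y):
    """Mean response at the +1 level minus mean response at the -1 level."""
    return y[contrast == 1].mean() - y[contrast == -1].mean()

effect_R = main_effect(x1, y)          # 4.70
effect_T = main_effect(x2, y)          # 9.00
effect_RT = main_effect(x1 * x2, y)    # -1.30
b1, b2 = effect_R / 2, effect_T / 2    # regression coefficients 2.35 and 4.50

sigma = 1.5
se_effect = 2 * sigma / np.sqrt(len(y))   # 1.5
se_coeff = se_effect / 2                  # 0.75
print(effect_R, effect_T, effect_RT, se_effect, se_coeff)
```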

SST = (nf/4)(main effect of T)² = (4/4)(9.00)² = 81.00.
Residual SS: SSr = SStotal - SSR - SST = 113.2 - 22.09 - 81.00 = 10.1.
Lack-of-fit SS: SSl = SSr - SSE = 2.1.

Source         SS     DF    MS     F      P
residual       10.1    4
lack-of-fit     2.1    2    1.06   0.27   0.79
pure error      8.0    2    4.00

So there is no evidence of lack of fit.

Estimation of experimental error: s² = MSE = 4.0 and hence s = 2.0, agreeing fairly well with σ = 1.5.

Fitted equation, contour diagram, and the path of steepest ascent. The contour line equation obtained from the fitted planar model is
62.01 + 2.35 x1 + 4.50 x2 - ŷ = 0.
Using the contour equation, calculate straight-line equations at several values of ŷ: 56, 60, 64, 68, say.

[Figure: contour plot of the fitted plane over time (min.) and temperature (°C), showing the lines ŷ = 56, 60, 64, 68 and the path of steepest ascent drawn perpendicular to them.]

Steepest ascent direction: starting at the center of the first design, the path is followed by simultaneously moving β̂2 = 4.50 units in x2 for every β̂1 = 2.35 units moved in x1; or equivalently, 4.50/2.35 = 1.91 units in the x2 direction for every unit in the x1 direction.

2. Probing for better settings along the path of steepest ascent:

[Table: runs along the path of steepest ascent, giving x1, x2, the corresponding R and T, and the observed yield; the center runs 5, 6, 7 average 62.3.]

3. Second stage of experimentation: the second design will be centered at R = 90 min. and T = 145 °C. Background: moving up the surface, the experimental error increases, so we need to broaden
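The path of steepest ascent can be traced numerically: steps in the coded variables proportional to the fitted coefficients, translated back to natural units. A small sketch using the first design's center and half-widths:

```python
# Tracing the path of steepest ascent from the fitted plane
# yhat = 62.01 + 2.35*x1 + 4.50*x2, then converting back to natural units
# (5 minutes per unit x1, 2.5 deg C per unit x2, center at R = 75, T = 130).

b1, b2 = 2.35, 4.50
center_R, center_T = 75.0, 130.0
unit_R, unit_T = 5.0, 2.5

def path_point(step_x1):
    """(R, T) on the steepest-ascent path after step_x1 units moved in x1."""
    step_x2 = step_x1 * (b2 / b1)   # about 1.91 units of x2 per unit of x1
    return (center_R + step_x1 * unit_R, center_T + step_x2 * unit_T)

for s in range(4):
    print(path_point(s))
```

Three x1-steps along the path land near (90, 144.4), close to the center chosen for the second design.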

the factor space of the design by a factor of, say, two, to increase the absolute magnitude of the effects relative to the error.

[Table: run numbers 11–22, run order, actual factors R and T, coded variables x1 and x2, and yield y (grams) for the second design.]

Coded variables:
x1 = (R - 90 minutes)/(10 minutes),   x2 = (T - 145 °C)/(5 °C).

Analysis of the second first-order design (runs 11 to 16): start with a first-order design for the planar model (runs 11 to 16); the results will most likely be inadequate. If the planar model fails, augment to a second-order design and fit the second-order model (runs 17 to 22).

β̂12 = 4.88 ± 0.75 and (β̂11 + β̂22) = 5.28 ± 1.30 are both significant. Consequently the planar model is inadequate.

s2² = (difference between the two center-point yields)²/2 = 4.2,   s2 = 2.05,
so that the pooled estimate of σ² from the two first-order designs is
s² = (2 s1² + s2²)/3 = 4.07,   s = 2.02,
again agreeing reasonably well with σ = 1.5.

The second first-order design is inadequate for fitting the second-order model
y = β0 + β1 x1 + β2 x2 + β11 x1² + β22 x2² + β12 x1 x2 + ε,
which prompts us to augment the second first-order design with four axial points and 2 center points (runs 17 to 22). The combined design is called a central composite design (or a second-order design), as given above.

Analysis of the second-order design: least squares applied to the twelve runs gives the fitted second-order equation ŷ = β̂0 + β̂1 x1 + β̂2 x2 + β̂11 x1² + β̂22 x2² + β̂12 x1 x2. The response surface can be visualized by the following figures:
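A sketch of how a central composite design is assembled and the full second-order model fitted by least squares. The axial distance √2 and the "true" coefficients below are assumptions for illustration only, not the estimates from the example:

```python
import numpy as np

# Two-factor central composite design: 2^2 factorial + four axial points
# + center replicates. Axial distance sqrt(2) is an assumption.
a = np.sqrt(2.0)
design = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],      # factorial points
                   [-a, 0], [a, 0], [0, -a], [0, a],        # axial points
                   [0, 0], [0, 0], [0, 0], [0, 0]], dtype=float)  # centers

def quadratic_model_matrix(X):
    """Columns: 1, x1, x2, x1^2, x2^2, x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

true_beta = np.array([87.4, 1.0, 0.5, -2.0, -3.0, -1.5])  # assumed surface
M = quadratic_model_matrix(design)
y = M @ true_beta                                # noiseless synthetic response
beta_hat, *_ = np.linalg.lstsq(M, y, rcond=None)
print(np.round(beta_hat, 6))                     # recovers true_beta
```

The design supports all six quadratic terms, so with noiseless data the least squares fit returns the assumed coefficients exactly.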

[Figure: contour plot of the fitted second-order surface over time (min.) and temperature (°C), with the corresponding 3-D plot of ŷ against x1 and x2.]

Diagnostic check for model adequacy: exact-quadratic checks in the x1 and x2 directions (no discrepancy from an exact quadratic expected).

[Figure: fitted response plotted against x1 along two lines (line 1 and line 2) used in the exact-quadratic check.]

Exact quadratic in the x1 direction: β111 - β122 (the coefficients of x1³ and of x1 x2², respectively, in a third-order model) measures the discrepancy from an exact quadratic, via
β111 - β122 = slope of line 1 - slope of line 2,
of which a value of zero is expected if we have an exact quadratic in the x1 direction (i.e., a parabola, with the two lines parallel). The estimate is
(y18 - y17)/(2√2) - (y12 + y14 - y11 - y13)/4,
which has a value of 1.28, and
Var = σ²/4 + σ²/4 = σ²/2.
Therefore std.err = 1.06, which indicates that the exact quadratic in the x1 direction is justified. The exact quadratic in the x2 direction can be checked in a similar fashion: 1.93 ± 1.06, a borderline situation.

Estimating experimental error: s3² = (difference between the two new center-point yields)²/2 = 0.5, and the pooled estimate of σ² is
(2 s1² + s2² + s3²)/(2 + 1 + 1) = 3.18,   so that s = 1.78,
which is close to σ = 1.5.

Block effect: estimated by ȳ1 - ȳ2 = 1.70, the difference between the mean responses in the two blocks (runs 11–16 and runs 17–22). The standard error of the block effect is σ √(1/6 + 1/6) = 0.87. There is some evidence that somewhat lower yields were obtained in the six augmented runs.

Nature of the fitted surface: the average variance of the fitted values is
avg Var(ŷ) = p σ²/n = 6(1.5)²/12 = 1.125,
where p is the number of parameters in the model and n is the number of observations. Hence the average standard error of the fitted values is √1.125 = 1.06. The range of predicted values, from about 77 to 90, is approximately 12 times the average standard error of the fitted values. This implies that there is no substantial lack of fit, and it is worthwhile to interpret the fitted surface.

Now, if we fit the color index and the viscosity (actual data not available), their fitted equations can be computed; it turned out that both are planar. The constraints are indicated at the beginning. The response contours of the three responses were superimposed, with the feasible region unshaded:

[Figure: overlaid contours over time (min.) and temperature (°C): yield contours ŷ = 72 to 88, viscosity contours z = 60, 65, 70, and color-index contours w = 8, 9, 10, with the feasible region left unshaded.]

From the plot, we observe that the best settings are about R = 82.6 minutes and T = 147.5 °C, with a yield of at least 88 grams.
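The pooled-variance and average-prediction-variance arithmetic above can be checked directly:

```python
import numpy as np

# Pooling the three variance estimates by their degrees of freedom:
# s1^2 = 4.0 (2 df), s2^2 = 4.2 (1 df), s3^2 = 0.5 (1 df).
s_sq = np.array([4.0, 4.2, 0.5])
df = np.array([2, 1, 1])
pooled = float((df * s_sq).sum() / df.sum())
print(pooled, np.sqrt(pooled))    # 3.175 and about 1.78

# Average variance of the fitted values, p*sigma^2/n, with p = 6 parameters
# and n = 12 observations:
avg_var = 6 * 1.5**2 / 12
print(avg_var, np.sqrt(avg_var))  # 1.125 and about 1.06
```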

Characterization of the second-order response surface:
1. Express the fitted equation in canonical form.
2. Perform the canonical analysis on the canonical form to find the best test condition.

Canonical forms: the second-order fitted equation in x1 and x2,
ŷ = β̂0 + β̂1 x1 + β̂2 x2 + β̂11 x1² + β̂22 x2² + β̂12 x1 x2,
can be written as
ŷ = β̂0 + xᵀb + xᵀBx,   (J.1)
where
x = (x1, x2)ᵀ,   b = (β̂1, β̂2)ᵀ,   B = [ β̂11  β̂12/2 ; β̂12/2  β̂22 ].

Note: if there are three factors with coded variables x1, x2, and x3, then the second-order fitted equation has the form (J.1) with
x = (x1, x2, x3)ᵀ,   b = (β̂1, β̂2, β̂3)ᵀ,
B = [ β̂11  β̂12/2  β̂13/2 ; β̂12/2  β̂22  β̂23/2 ; β̂13/2  β̂23/2  β̂33 ].
This generalizes to any number of factors k.

Stationary point: x_S = -(1/2) B⁻¹ b.
Response at x_S: ŷ_S = β̂0 + (1/2) x_Sᵀ b.

Let the eigenvalues of B be λ1, ..., λk and their corresponding eigenvectors be m1, ..., mk. Denote M = [m1, ..., mk] and Λ = diag(λ1, ..., λk). Then the spectral decomposition of B is given by B = M Λ Mᵀ. Let X = (X1, ..., Xk)ᵀ = Mᵀ(x - x_S). Equation (J.1) can be expressed in the following canonical form (known as canonical form B):
ŷ = ŷ_S + Xᵀ Λ X = ŷ_S + λ1 X1² + ... + λk Xk².   (J.2)

For k = 2, the surface ŷ = ŷ_S + λ1 X1² + λ2 X2² takes one of the following forms:
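The canonical analysis can be carried out numerically: solve for the stationary point x_S = -½ B⁻¹ b and eigendecompose B. The coefficients below are illustrative assumptions, not the estimates from the example:

```python
import numpy as np

# Canonical analysis of a second-order fit yhat = b0 + x'b + x'Bx.
# The numerical coefficients here are assumed for illustration.
b0 = 87.4
b = np.array([1.0, 0.5])
B = np.array([[-2.00, -0.75],
              [-0.75, -3.00]])        # off-diagonal entries are beta12/2

x_s = -0.5 * np.linalg.solve(B, b)    # stationary point x_S = -B^{-1} b / 2
y_s = b0 + 0.5 * x_s @ b              # response at the stationary point
lam, M = np.linalg.eigh(B)            # eigenvalues/eigenvectors: B = M Lam M'

print("stationary point:", x_s)
print("response there:  ", y_s)
print("eigenvalues:     ", lam)       # both negative -> simple maximum
```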

sign of λ1   sign of λ2   other properties                   surface type                  example
-            -            |λ1| ≥ |λ2|                        simple maximum (hill)         Figure (a)
+            +            |λ1| ≥ |λ2|; response              simple minimum (valley)
                          decreases toward S
+            -                                               saddle (minimax)              Figure (b)
-            +                                               saddle
-            λ2 = 0       stationary center not unique;      stationary ridge of maximum   Figure (c)
                          a line of centers on the X2 axis
+            λ2 = 0                                          stationary ridge of minimum
-            λ2 ≈ 0       no stationary center;              rising ridge                  Figure (d)
                          center at infinity
+            λ2 ≈ 0                                          falling ridge

Rising (falling) ridge analysis: suppose λ2 ≈ 0 with |λ2| ≪ |λ1|, so that the stationary point x_S may lie far outside the design region. Let x_S′ be the nearest point on the ridge (the line X1 = 0) to the center of the design; it satisfies
x_S′ - x_S = M (0, Z2)ᵀ
for some Z2. Measure the canonical variables from x_S′, i.e., X = Mᵀ(x - x_S′), and neglect the small term λ2 X2². Then, with
linear coefficient B2 = 2 λ2 Z2   and   ŷ_S′ = ŷ_S + λ2 Z2²,
the fitted equation has the approximate canonical form (known as canonical form A)
ŷ - ŷ_S′ = λ1 X1² + B2 X2,
and the equation for the ridge is X1 = 0.
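A minimal classifier for the table above, reading the surface type off the signs of the eigenvalues of B. Separating a stationary from a rising/falling ridge would additionally require the location of x_S, which this sketch ignores:

```python
import numpy as np

def classify_surface(B, tol=1e-3):
    """Surface type of a two-factor second-order fit from the eigenvalues of B."""
    lam = np.linalg.eigvalsh(B)
    lam = lam[np.argsort(-np.abs(lam))]   # order so that |lam1| >= |lam2|
    l1, l2 = lam
    if abs(l2) < tol:                     # one eigenvalue (near) zero: a ridge
        return "ridge of maxima" if l1 < 0 else "ridge of minima"
    if l1 < 0 and l2 < 0:
        return "simple maximum"
    if l1 > 0 and l2 > 0:
        return "simple minimum"
    return "saddle (minimax)"

print(classify_surface(np.array([[-2.0, 0.0], [0.0, -1.0]])))  # simple maximum
print(classify_surface(np.array([[2.0, 0.0], [0.0, -1.0]])))   # saddle (minimax)
print(classify_surface(np.array([[-2.0, 0.0], [0.0, 0.0]])))   # ridge of maxima
```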

[Figure: four 3-D surface plots illustrating the four types of response surfaces for two-factor problems: simple maximum/minimum, minimax (saddle), stationary ridge, and rising/falling ridge, each shown with its fitted second-order equation and its canonical form.]

Multiple responses case:

1. If there are few responses and only two to three important input variables, the problem can be solved by overlaying and examining the contour plots of the responses over a few pairs of input variables, as seen in the chemical example. Otherwise, a more formal method is required.

2. If, among the responses, one is of primary importance, the problem can be formulated as a constrained optimization problem: optimize this response subject to constraints imposed on the other responses. Typically one can employ standard optimization packages, such as linear or nonlinear programming, to solve the constrained optimization problem.

3. The previous method is not appropriate when the goal of the investigation is to find an optimal balance among several response characteristics. For instance, in the development of a tire tread compound, the following responses are all important: (i) an abrasion index, to be maximized; (ii) elongation at break, to be within specified limits; and (iii) hardness, to be within specified limits. An alternative that provides a trade-off among several responses is to transform each predicted response to a desirability value d, where 0 ≤ d ≤ 1. The value of the desirability function d decreases as the desirability of the corresponding response decreases. Derringer and Suich (1980) proposed a class of desirability functions for three types of problems:

d = ((ŷ - U)/(a - U))^α for a ≤ ŷ ≤ U, and d = 0 for ŷ > U, for STB (smaller-the-better);

d = ((ŷ - L)/(U - L))^α for L ≤ ŷ ≤ U, with d = 0 for ŷ < L and d = 1 for ŷ > U, for LTB (larger-the-better);

d = ((ŷ - L)/(t - L))^α1 for L ≤ ŷ ≤ t, d = ((ŷ - U)/(t - U))^α2 for t ≤ ŷ ≤ U, and d = 0 for ŷ < L or ŷ > U, for NTB (nominal-the-best);

where t is the target value, L is the lower bound below which the product is considered unacceptable, U is the upper bound above which the product is considered unacceptable under NTB and STB and nearly perfect under LTB, and a is the smallest possible value of the response. The choices of α, α1, and α2 are more subjective.

In some practical situations, the L and U values cannot be properly chosen. A different desirability function could then be defined over the whole real line or half line. For this purpose, use of the exponential function appears to be ideal:

d = exp{-c |ŷ - a|^α}, a ≤ ŷ < ∞, for STB;
d = exp{-c ŷ^(-α)} / exp{-c L^(-α)}, L ≤ ŷ < ∞, with d = 0 for ŷ < L, for LTB;
d = exp{-c1 |ŷ - t|^α1} for -∞ < ŷ ≤ t, and d = exp{-c2 |ŷ - t|^α2} for t ≤ ŷ < ∞, for NTB;

where the constants c and α can be used to fine-tune the scale and shape of the desirability function. A smaller α value makes the desirability function drop more slowly from its peak. A smaller c value increases the spread of the desirability function by lowering the whole exponential curve between 0 and 1.
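The Derringer-Suich transforms and the overall geometric-mean desirability can be sketched as follows (linear case α = 1 by default; the bounds in the usage lines are hypothetical):

```python
import numpy as np

# Derringer-Suich desirability transforms (STB / LTB / NTB).
def d_stb(y, a, U, alpha=1.0):
    """Smaller-the-better: 1 at y = a, falling to 0 at y = U."""
    if y > U:
        return 0.0
    return ((y - U) / (a - U)) ** alpha

def d_ltb(y, L, U, alpha=1.0):
    """Larger-the-better: 0 below L, rising to 1 at and above U."""
    if y < L:
        return 0.0
    if y > U:
        return 1.0
    return ((y - L) / (U - L)) ** alpha

def d_ntb(y, L, t, U, a1=1.0, a2=1.0):
    """Nominal-the-best: 1 at the target t, 0 outside [L, U]."""
    if y < L or y > U:
        return 0.0
    if y <= t:
        return ((y - L) / (t - L)) ** a1
    return ((y - U) / (t - U)) ** a2

def overall(ds, w=None):
    """Geometric mean of the individual desirabilities (optionally weighted)."""
    ds = np.asarray(ds, dtype=float)
    w = np.full(len(ds), 1.0 / len(ds)) if w is None else np.asarray(w)
    return float(np.prod(ds ** w))

print(d_ntb(95.0, L=90.0, t=100.0, U=110.0))   # halfway up: 0.5
print(overall([0.5, 0.8, 1.0]))
```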
An overall desirability function d can be defined as the geometric mean of the desirability functions d_i of the individual responses: d = (d1 d2 ⋯ dm)^(1/m). A more general desirability function would be
d = ∏_{i=1}^{m} d_i^{w_i}

to reflect possible differences in the importance of the different responses, where the weights w_i satisfy 0 < w_i < 1 and w1 + ⋯ + wm = 1. Any setting of the input factors that maximizes the d value is chosen as one that achieves an optimal balance over the m responses. Running a confirmation experiment at the identified settings will provide assurance that all responses are in a desirable region.

2. Chemical Example II (Montgomery)

A chemical engineer is interested in determining the operating conditions that maximize the yield of a process. Two process variables influence process yield: R (reaction time) and T (temperature). Three constraints are imposed on the maximization of the yield: (1) the viscosity of the product is required to be at least 62 and at most 68 units; (2) the molecular weight is to be at most 3400 units; (3) the yield is required to be at least 78.5 units.

With a first-order design centered on the current settings of R = 35 min. and T = 155 °F, a planar model was fitted nicely and, after some subsequent exploratory runs along the path of steepest ascent implied by the first-order design, led to the new design center of R = 85 min. and T = 175 °F. The coded variables for the second stage of the experimentation are
x_R = (R - 85 min.)/(5 min.)   and   x_T = (T - 175 °F)/(5 °F).
A first-order design was performed around these new settings and later augmented to the following central composite design (with data):

[Table: actual factors R and T, coded variables x_R and x_T, and the three observed responses for the CCD runs: yield y1, viscosity y2, and molecular weight y3.]

The fitted equations for the three responses are
ŷ1 = 79.94 + 0.99 x_R + 0.52 x_T - 1.38 x_R² - 1.00 x_T² + 0.25 x_R x_T,
ŷ2 = 70.00 - 0.16 x_R + 0.95 x_T - 0.69 x_R² - 6.69 x_T² - 1.25 x_R x_T,
ŷ3 = 3386.2 + 205.1 x_R + 177.4 x_T.

The overlaid contour plot of the three responses with the imposed constraints is given below.

[Figure: overlaid contours of ŷ1 = 78.5, ŷ2 = 62, ŷ2 = 68, and ŷ3 = 3400 over reaction time (min.) and temperature (°F), with the two feasible regions left unshaded.]

The two unshaded areas within the design ranges give the feasible regions for setting the two variables. The experimenter would be most interested in the larger of the two feasible regions. A more formal method, such as nonlinear programming, may be called for to find the solutions. The two solutions are R = 83.5 min., T = 177.1 °F with ŷ1 = 79.5, and R = 86.6 min. with ŷ1 = 80.6. The first solution is in the smaller region, while the second is in the larger one.
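The constrained maximization can be sketched as a simple grid search over the coded region, using the three fitted equations as printed above (some digits of those coefficients are reconstructions); the grid range ±1.5 is an assumption:

```python
import numpy as np

# Grid search for Chemical Example II: maximize yhat1 subject to
# 62 <= yhat2 <= 68 and yhat3 <= 3400, over the coded region.
def y1(x1, x2):  # yield
    return 79.94 + 0.99*x1 + 0.52*x2 - 1.38*x1**2 - 1.00*x2**2 + 0.25*x1*x2

def y2(x1, x2):  # viscosity
    return 70.00 - 0.16*x1 + 0.95*x2 - 0.69*x1**2 - 6.69*x2**2 - 1.25*x1*x2

def y3(x1, x2):  # molecular weight
    return 3386.2 + 205.1*x1 + 177.4*x2

best, best_x = -np.inf, None
for x1 in np.linspace(-1.5, 1.5, 301):
    for x2 in np.linspace(-1.5, 1.5, 301):
        if 62.0 <= y2(x1, x2) <= 68.0 and y3(x1, x2) <= 3400.0:
            if y1(x1, x2) > best:
                best, best_x = y1(x1, x2), (x1, x2)

bx1, bx2 = best_x
print("coded optimum:", best_x, "yield:", round(best, 2))
print("natural units: R =", 85 + 5 * bx1, "min, T =", 175 + 5 * bx2, "F")
```

A nonlinear programming routine would refine this, but on a 0.01 grid the search already locates the feasible optimum to two decimals in the coded units.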


More information

Problems. Suppose both models are fitted to the same data. Show that SS Res, A SS Res, B

Problems. Suppose both models are fitted to the same data. Show that SS Res, A SS Res, B Simple Linear Regression 35 Problems 1 Consider a set of data (x i, y i ), i =1, 2,,n, and the following two regression models: y i = β 0 + β 1 x i + ε, (i =1, 2,,n), Model A y i = γ 0 + γ 1 x i + γ 2

More information

Two-Stage Computing Budget Allocation. Approach for Response Surface Method PENG JI

Two-Stage Computing Budget Allocation. Approach for Response Surface Method PENG JI Two-Stage Computing Budget Allocation Approach for Response Surface Method PENG JI NATIONAL UNIVERSITY OF SINGAPORE 2005 Two-Stage Computing Budget Allocation Approach for Response Surface Method PENG

More information

Chapter 3: Regression Methods for Trends

Chapter 3: Regression Methods for Trends Chapter 3: Regression Methods for Trends Time series exhibiting trends over time have a mean function that is some simple function (not necessarily constant) of time. The example random walk graph from

More information

SMA 6304 / MIT / MIT Manufacturing Systems. Lecture 10: Data and Regression Analysis. Lecturer: Prof. Duane S. Boning

SMA 6304 / MIT / MIT Manufacturing Systems. Lecture 10: Data and Regression Analysis. Lecturer: Prof. Duane S. Boning SMA 6304 / MIT 2.853 / MIT 2.854 Manufacturing Systems Lecture 10: Data and Regression Analysis Lecturer: Prof. Duane S. Boning 1 Agenda 1. Comparison of Treatments (One Variable) Analysis of Variance

More information

4 Multiple Linear Regression

4 Multiple Linear Regression 4 Multiple Linear Regression 4. The Model Definition 4.. random variable Y fits a Multiple Linear Regression Model, iff there exist β, β,..., β k R so that for all (x, x 2,..., x k ) R k where ε N (, σ

More information

3 Time Series Regression

3 Time Series Regression 3 Time Series Regression 3.1 Modelling Trend Using Regression Random Walk 2 0 2 4 6 8 Random Walk 0 2 4 6 8 0 10 20 30 40 50 60 (a) Time 0 10 20 30 40 50 60 (b) Time Random Walk 8 6 4 2 0 Random Walk 0

More information

Iterative Methods for Solving A x = b

Iterative Methods for Solving A x = b Iterative Methods for Solving A x = b A good (free) online source for iterative methods for solving A x = b is given in the description of a set of iterative solvers called templates found at netlib: http

More information

MATH 5720: Unconstrained Optimization Hung Phan, UMass Lowell September 13, 2018

MATH 5720: Unconstrained Optimization Hung Phan, UMass Lowell September 13, 2018 MATH 57: Unconstrained Optimization Hung Phan, UMass Lowell September 13, 18 1 Global and Local Optima Let a function f : S R be defined on a set S R n Definition 1 (minimizers and maximizers) (i) x S

More information

Estadística II Chapter 5. Regression analysis (second part)

Estadística II Chapter 5. Regression analysis (second part) Estadística II Chapter 5. Regression analysis (second part) Chapter 5. Regression analysis (second part) Contents Diagnostic: Residual analysis The ANOVA (ANalysis Of VAriance) decomposition Nonlinear

More information

Written Examination

Written Examination Division of Scientific Computing Department of Information Technology Uppsala University Optimization Written Examination 202-2-20 Time: 4:00-9:00 Allowed Tools: Pocket Calculator, one A4 paper with notes

More information

Estimating σ 2. We can do simple prediction of Y and estimation of the mean of Y at any value of X.

Estimating σ 2. We can do simple prediction of Y and estimation of the mean of Y at any value of X. Estimating σ 2 We can do simple prediction of Y and estimation of the mean of Y at any value of X. To perform inferences about our regression line, we must estimate σ 2, the variance of the error term.

More information

Regression Models - Introduction

Regression Models - Introduction Regression Models - Introduction In regression models there are two types of variables that are studied: A dependent variable, Y, also called response variable. It is modeled as random. An independent

More information

Assignment 2 (Sol.) Introduction to Machine Learning Prof. B. Ravindran

Assignment 2 (Sol.) Introduction to Machine Learning Prof. B. Ravindran Assignment 2 (Sol.) Introduction to Machine Learning Prof. B. Ravindran 1. Let A m n be a matrix of real numbers. The matrix AA T has an eigenvector x with eigenvalue b. Then the eigenvector y of A T A

More information

20g g g Analyze the residuals from this experiment and comment on the model adequacy.

20g g g Analyze the residuals from this experiment and comment on the model adequacy. 3.4. A computer ANOVA output is shown below. Fill in the blanks. You may give bounds on the P-value. One-way ANOVA Source DF SS MS F P Factor 3 36.15??? Error??? Total 19 196.04 3.11. A pharmaceutical

More information

Nonparametric Principal Components Regression

Nonparametric Principal Components Regression Int. Statistical Inst.: Proc. 58th World Statistical Congress, 2011, Dublin (Session CPS031) p.4574 Nonparametric Principal Components Regression Barrios, Erniel University of the Philippines Diliman,

More information

Lagrange Multipliers

Lagrange Multipliers Optimization with Constraints As long as algebra and geometry have been separated, their progress have been slow and their uses limited; but when these two sciences have been united, they have lent each

More information

Lecture 6 Positive Definite Matrices

Lecture 6 Positive Definite Matrices Linear Algebra Lecture 6 Positive Definite Matrices Prof. Chun-Hung Liu Dept. of Electrical and Computer Engineering National Chiao Tung University Spring 2017 2017/6/8 Lecture 6: Positive Definite Matrices

More information

UNIVERSITETET I OSLO

UNIVERSITETET I OSLO UNIVERSITETET I OSLO Det matematisk-naturvitenskapelige fakultet Examination in: STK4030 Modern data analysis - FASIT Day of examination: Friday 13. Desember 2013. Examination hours: 14.30 18.30. This

More information

11 Hypothesis Testing

11 Hypothesis Testing 28 11 Hypothesis Testing 111 Introduction Suppose we want to test the hypothesis: H : A q p β p 1 q 1 In terms of the rows of A this can be written as a 1 a q β, ie a i β for each row of A (here a i denotes

More information

Data Set 1A: Algal Photosynthesis vs. Salinity and Temperature

Data Set 1A: Algal Photosynthesis vs. Salinity and Temperature Data Set A: Algal Photosynthesis vs. Salinity and Temperature Statistical setting These data are from a controlled experiment in which two quantitative variables were manipulated, to determine their effects

More information

Linear model selection and regularization

Linear model selection and regularization Linear model selection and regularization Problems with linear regression with least square 1. Prediction Accuracy: linear regression has low bias but suffer from high variance, especially when n p. It

More information

Statistical View of Least Squares

Statistical View of Least Squares May 23, 2006 Purpose of Regression Some Examples Least Squares Purpose of Regression Purpose of Regression Some Examples Least Squares Suppose we have two variables x and y Purpose of Regression Some Examples

More information

Unsupervised dimensionality reduction

Unsupervised dimensionality reduction Unsupervised dimensionality reduction Guillaume Obozinski Ecole des Ponts - ParisTech SOCN course 2014 Guillaume Obozinski Unsupervised dimensionality reduction 1/30 Outline 1 PCA 2 Kernel PCA 3 Multidimensional

More information

The prediction of house price

The prediction of house price 000 001 002 003 004 005 006 007 008 009 010 011 012 013 014 015 016 017 018 019 020 021 022 023 024 025 026 027 028 029 030 031 032 033 034 035 036 037 038 039 040 041 042 043 044 045 046 047 048 049 050

More information

Probing the covariance matrix

Probing the covariance matrix Probing the covariance matrix Kenneth M. Hanson Los Alamos National Laboratory (ret.) BIE Users Group Meeting, September 24, 2013 This presentation available at http://kmh-lanl.hansonhub.com/ LA-UR-06-5241

More information

Mathematical optimization

Mathematical optimization Optimization Mathematical optimization Determine the best solutions to certain mathematically defined problems that are under constrained determine optimality criteria determine the convergence of the

More information

Algorithms for Constrained Optimization

Algorithms for Constrained Optimization 1 / 42 Algorithms for Constrained Optimization ME598/494 Lecture Max Yi Ren Department of Mechanical Engineering, Arizona State University April 19, 2015 2 / 42 Outline 1. Convergence 2. Sequential quadratic

More information

ECON 4160, Lecture 11 and 12

ECON 4160, Lecture 11 and 12 ECON 4160, 2016. Lecture 11 and 12 Co-integration Ragnar Nymoen Department of Economics 9 November 2017 1 / 43 Introduction I So far we have considered: Stationary VAR ( no unit roots ) Standard inference

More information

Statistical View of Least Squares

Statistical View of Least Squares Basic Ideas Some Examples Least Squares May 22, 2007 Basic Ideas Simple Linear Regression Basic Ideas Some Examples Least Squares Suppose we have two variables x and y Basic Ideas Simple Linear Regression

More information

CE 191: Civil and Environmental Engineering Systems Analysis. LEC 05 : Optimality Conditions

CE 191: Civil and Environmental Engineering Systems Analysis. LEC 05 : Optimality Conditions CE 191: Civil and Environmental Engineering Systems Analysis LEC : Optimality Conditions Professor Scott Moura Civil & Environmental Engineering University of California, Berkeley Fall 214 Prof. Moura

More information

Applied Econometrics (QEM)

Applied Econometrics (QEM) Applied Econometrics (QEM) based on Prinicples of Econometrics Jakub Mućk Department of Quantitative Economics Jakub Mućk Applied Econometrics (QEM) Meeting #3 1 / 42 Outline 1 2 3 t-test P-value Linear

More information

Positive Definite Matrix

Positive Definite Matrix 1/29 Chia-Ping Chen Professor Department of Computer Science and Engineering National Sun Yat-sen University Linear Algebra Positive Definite, Negative Definite, Indefinite 2/29 Pure Quadratic Function

More information

GI07/COMPM012: Mathematical Programming and Research Methods (Part 2) 2. Least Squares and Principal Components Analysis. Massimiliano Pontil

GI07/COMPM012: Mathematical Programming and Research Methods (Part 2) 2. Least Squares and Principal Components Analysis. Massimiliano Pontil GI07/COMPM012: Mathematical Programming and Research Methods (Part 2) 2. Least Squares and Principal Components Analysis Massimiliano Pontil 1 Today s plan SVD and principal component analysis (PCA) Connection

More information

Applied Time Series Topics

Applied Time Series Topics Applied Time Series Topics Ivan Medovikov Brock University April 16, 2013 Ivan Medovikov, Brock University Applied Time Series Topics 1/34 Overview 1. Non-stationary data and consequences 2. Trends and

More information

Administration. Homework 1 on web page, due Feb 11 NSERC summer undergraduate award applications due Feb 5 Some helpful books

Administration. Homework 1 on web page, due Feb 11 NSERC summer undergraduate award applications due Feb 5 Some helpful books STA 44/04 Jan 6, 00 / 5 Administration Homework on web page, due Feb NSERC summer undergraduate award applications due Feb 5 Some helpful books STA 44/04 Jan 6, 00... administration / 5 STA 44/04 Jan 6,

More information

RESPONSE SURFACE MODELLING, RSM

RESPONSE SURFACE MODELLING, RSM CHEM-E3205 BIOPROCESS OPTIMIZATION AND SIMULATION LECTURE 3 RESPONSE SURFACE MODELLING, RSM Tool for process optimization HISTORY Statistical experimental design pioneering work R.A. Fisher in 1925: Statistical

More information

Unconstrained Optimization

Unconstrained Optimization 1 / 36 Unconstrained Optimization ME598/494 Lecture Max Yi Ren Department of Mechanical Engineering, Arizona State University February 2, 2015 2 / 36 3 / 36 4 / 36 5 / 36 1. preliminaries 1.1 local approximation

More information

Weighted Least Squares

Weighted Least Squares Weighted Least Squares The standard linear model assumes that Var(ε i ) = σ 2 for i = 1,..., n. As we have seen, however, there are instances where Var(Y X = x i ) = Var(ε i ) = σ2 w i. Here w 1,..., w

More information

Analysis of Bivariate Data

Analysis of Bivariate Data Analysis of Bivariate Data Data Two Quantitative variables GPA and GAES Interest rates and indices Tax and fund allocation Population size and prison population Bivariate data (x,y) Case corr&reg 2 Independent

More information

Performing response surface analysis using the SAS RSREG procedure

Performing response surface analysis using the SAS RSREG procedure Paper DV02-2012 Performing response surface analysis using the SAS RSREG procedure Zhiwu Li, National Database Nursing Quality Indicator and the Department of Biostatistics, University of Kansas Medical

More information

Lecture 17: Numerical Optimization October 2014

Lecture 17: Numerical Optimization October 2014 Lecture 17: Numerical Optimization 36-350 22 October 2014 Agenda Basics of optimization Gradient descent Newton s method Curve-fitting R: optim, nls Reading: Recipes 13.1 and 13.2 in The R Cookbook Optional

More information

Multivariate Newton Minimanization

Multivariate Newton Minimanization Multivariate Newton Minimanization Optymalizacja syntezy biosurfaktantu Rhamnolipid Rhamnolipids are naturally occuring glycolipid produced commercially by the Pseudomonas aeruginosa species of bacteria.

More information

Weighted Least Squares

Weighted Least Squares Weighted Least Squares ST 430/514 Recall the linear regression equation E(Y ) = β 0 + β 1 x 1 + β 2 x 2 + + β k x k We have estimated the parameters β 0, β 1, β 2,..., β k by minimizing the sum of squared

More information

Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting)

Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting) Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting) (overshort example) White noise H 0 : Let Z t be the stationary

More information