In an optimization problem, the objective is to optimize (maximize or minimize) some function f. This function f is called the objective function.


1 Optimization In an optimization problem, the objective is to optimize (maximize or minimize) some function f. This function f is called the objective function. For example, an objective function f to be maximized may be the revenue in a production of TV sets, the yield per minute in a chemical process, the hourly number of customers served in some office, the hardness of steel, or the tensile strength of a rope. Similarly, we may want to minimize f if f is the cost per unit of producing certain cameras, the operating cost of some power plant, the daily loss of heat in a heating system, the idling time of some lathe, or the time needed to produce a fender. EE202 ~ Page 48 / Part 2

2 Unconstrained Optimization: An Example Courtesy of Professor Wu-Sheng Lu of University of Victoria, Canada

3 Unconstrained Optimization: An Example (cont.)

4 Unconstrained Optimization: An Example (cont.)

5 Unconstrained Optimization: An Example (cont.)

6 Constrained Optimization: An Example Courtesy of Professor Wu-Sheng Lu of University of Victoria, Canada

7 Background In most optimization problems the objective function f depends on several variables x1, ..., xn. These are called control variables because we can control them, e.g., 1) The yield of a chemical process may depend on pressure x1 and temperature x2. 2) The efficiency of a certain air conditioning system may depend on temperature x1, air pressure x2, moisture content x3, cross-sectional area of outlet x4, etc.

8 Background Optimization theory develops methods for optimal choices of x1, x2, ..., xn, which maximize (or minimize) the objective function f, that is, methods for finding optimal values of x1, x2, ..., xn. In many problems the choice of values of x1, x2, ..., xn is not entirely free but is subject to some constraints, that is, additional conditions arising from the nature of the problem and the variables. For example, if x1 is production cost, then x1 >= 0, and there are many other variables (time, weight, distance travelled by a salesman, etc.) that can take nonnegative values only. Constraints can also have the form of equations (instead of inequalities).

9 Unconstrained Optimization Let us first consider unconstrained optimization in the case of a real-valued function f(x1, x2, ..., xn). For convenience we also write x = (x1, x2, ..., xn)' and f(x). f has a minimum at a point x = X0 in a region R if f(x) >= f(X0) for all x in R. Similarly, f has a maximum at X0 if f(x) <= f(X0) for all x in R. Minima and maxima are called extrema. f is said to have a local minimum at X0 if f(x) >= f(X0) for all x in a neighbourhood of X0, that is, for all x satisfying ||x - X0|| < r, where r > 0 is sufficiently small.

10 Example: Minimizing a Function of a Single Variable Golden Section Search
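The golden-section search named on this slide can be sketched in a few lines: the bracket [a, b] shrinks by the factor 1/phi at every step while one interior function evaluation is reused. This is a minimal sketch assuming f is unimodal on [a, b]; the quadratic test function is our illustration, not from the slide.

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden-section search."""
    invphi = (math.sqrt(5) - 1) / 2          # 1/phi, about 0.618
    x1 = b - invphi * (b - a)
    x2 = a + invphi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                          # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = f(x1)
        else:                                # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Illustrative run: f(x) = (x - 2)^2 + 1 on [0, 5], minimizer x = 2.
x_star = golden_section_search(lambda x: (x - 2)**2 + 1, 0.0, 5.0)
```

Each iteration costs only one new function evaluation, which is why this bracketing scheme is preferred over naive trisection.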

11 Unconstrained Optimization Question: How can one reach the valley bottom (the minimum) fastest? Answer: Take the path in the most downhill direction. Mathematically, this is the direction of the negative gradient.

12 Unconstrained Optimization (cont.) If f is differentiable and has an extremum at a point X0, then the partial derivatives df/dx1, ..., df/dxn must be zero at X0. Thus its gradient, grad f(x) = [df/dx1, ..., df/dxn]^T, satisfies grad f(X0) = 0. (1) A point X0 at which grad f(X0) = 0 is called a stationary point (valley bottom) of f. Condition (1) is necessary for an extremum of f at X0 in the interior of R, but it is not sufficient. In practice, solving (1) will often be difficult. For this reason, one generally prefers solution by iteration, i.e., by search processes that start at some point and move stepwise to points at which f is smaller (if a minimum of f is wanted) or bigger (in the case of a maximum).

13 Cauchy's Method Cauchy's method of steepest descent, or the gradient method, is one such popular method. However, convergence can sometimes be slow. Given a multivariable function f(x), we examine its first-order Taylor expansion (obtained from the Taylor series): f(x + t dx) ~ f(x) + t grad f(x)^T dx, where grad f(x) = [df/dx1, ..., df/dxn]^T. Thus, we can expect a decrease in the value of f if we set the step direction to dx = -grad f(x) and the step size t > 0, because then f(x - t grad f(x)) ~ f(x) - t grad f(x)^T grad f(x) <= f(x).

14 Cauchy's Method (cont.) Why is grad f(x)^T grad f(x) >= 0? Noting that grad f(x) = [df/dx1, ..., df/dxn]^T, we have grad f(x)^T grad f(x) = (df/dx1)^2 + ... + (df/dxn)^2 >= 0. Thus, for a step size t > 0, we have f(x - t grad f(x)) ~ f(x) - t grad f(x)^T grad f(x) <= f(x). (2)

15 Basic Idea of Cauchy's Method Now suppose that we start from an initial point, say x0, with an appropriate step size t1 > 0, and let x1 = x0 - t1 grad f(x0). From (2), we have f(x1) = f(x0 - t1 grad f(x0)) ~ f(x0) - t1 grad f(x0)^T grad f(x0) <= f(x0). Let us move on from x1 by defining x2 = x1 - t2 grad f(x1) with an appropriately chosen step size t2. Again from (2), we have f(x2) = f(x1 - t2 grad f(x1)) ~ f(x1) - t2 grad f(x1)^T grad f(x1) <= f(x1) <= f(x0). By repeating this process, we obtain an iterative scheme x_i = x_{i-1} - t_i grad f(x_{i-1}), (3) such that the resulting sequence has the following property: f(x_i) <= f(x_{i-1}) <= ... <= f(x1) <= f(x0). The sequence generated by (3) would converge to X0, with f(x_i) approaching f(X0), as i tends to infinity.
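Scheme (3) can be sketched directly. The test function, starting point, step size and iteration count below are illustrative choices of ours, not values from the slides.

```python
def steepest_descent(grad_f, x0, t=0.1, n_iter=200):
    """Scheme (3) with a fixed step size: x_i = x_{i-1} - t * grad f(x_{i-1})."""
    x = list(x0)
    for _ in range(n_iter):
        g = grad_f(x)
        x = [xi - t * gi for xi, gi in zip(x, g)]
    return x

# Illustrative test function: f(x) = (x1 - 1)^2 + (x2 + 2)^2,
# with gradient (2(x1 - 1), 2(x2 + 2)) and minimizer X0 = (1, -2).
x_min = steepest_descent(lambda x: [2 * (x[0] - 1), 2 * (x[1] + 2)], [0.0, 0.0])
```

For this well-conditioned quadratic the iterates contract toward X0 geometrically; the next slides discuss how the choice of t affects that speed.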

16 Cauchy's Method: Selection of Step Size The issue of how to select an appropriate step size is quite complicated. In principle, we can fix the step size to be some appropriately small positive scalar, say t*, and carry on the iteration x_i = x_{i-1} - t* grad f(x_{i-1}). (4) Such an iteration might not be efficient, but it surely works. Alternatively, we can determine at each step the step size t_i together with the corresponding point x_i = x_{i-1} - t_i grad f(x_{i-1}) at which the function value f(x_i) is the smallest (a bit closer to the minimum) among all possible choices of the step size t. We illustrate this in an example.

17 An Illustrative Example [Figure: contour plot of f(x) = x1^2 + 3 x2^2 in the (x1, x2) plane, showing descent steps toward the minimum; a step that is too large overshoots the minimum ("Oops, overshoot").]

18 Example Determine a minimum of f(x) = x1^2 + 3 x2^2, starting from x0 = (6, 3)^T, using the method of steepest descent. Clearly, inspection shows that f(x) has a minimum at 0. Knowing the solution gives us a better feeling of how the method works. We first obtain the gradient grad f(x) = [df/dx1, df/dx2]^T = [2 x1, 6 x2]^T and the iteration scheme x_{i+1}(t) = x_i - t grad f(x_i) = [(1 - 2t) x1,i, (1 - 6t) x2,i]^T. (5) Note that x_{i+1} depends on the choice of the step size t, and x1,i and x2,i are the values of x1 and x2 corresponding to x_i. We are now to select a step size t_{i+1} such that the resulting f(x_{i+1}) is the smallest among all the choices of t.

19 Optimal Selection of Step Size From (5), we have f(x_{i+1}(t)) = (1 - 2t)^2 x1,i^2 + 3 (1 - 6t)^2 x2,i^2. To find the smallest value of f above with respect to t, we set d/dt f(x_{i+1}(t)) = -4 (1 - 2t) x1,i^2 - 36 (1 - 6t) x2,i^2 = 0, which yields the optimal choice of step size t_{i+1} = (x1,i^2 + 9 x2,i^2) / (2 x1,i^2 + 54 x2,i^2).

20 Starting from x0 = (6, 3)^T, we compute the values of x1,i, x2,i, f(x_i) and the optimal step size t_{i+1} with the iteration x_{i+1} = x_i - t_{i+1} grad f(x_i), as listed in a table (omitted here) and plotted in the figures on the next page; the convergence is fast. (MATLAB script: steepest.m)
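Iteration (5) together with the optimal step size from slide 19 can be checked numerically; starting from x0 = (6, 3), the function value collapses toward 0 within about ten steps.

```python
def f(x1, x2):
    return x1**2 + 3 * x2**2

def optimal_step(x1, x2):
    # t_{i+1} = (x1^2 + 9 x2^2) / (2 x1^2 + 54 x2^2), from slide 19
    return (x1**2 + 9 * x2**2) / (2 * x1**2 + 54 * x2**2)

x1, x2 = 6.0, 3.0                                   # starting point x0 = (6, 3)
history = [f(x1, x2)]                               # f(x0) = 63
for _ in range(10):
    t = optimal_step(x1, x2)
    x1, x2 = (1 - 2 * t) * x1, (1 - 6 * t) * x2     # iteration (5)
    history.append(f(x1, x2))
```

Each step reduces f by roughly a factor of four here, the classical rate ((k-1)/(k+1))^2 for exact line search on a quadratic with condition number k = 3.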

21 [Figures: a 2D view of the iterates in the (x1, x2) plane and a 3D view of f(x) = x1^2 + 3 x2^2, showing the descent path from x0 toward the minimum at the origin.]

22 Example (cont.) Solving it instead with a fixed step size t*, via x_i = x_{i-1} - t* grad f(x_{i-1}), produces a sequence (table omitted here) that approaches the minimum slowly. The drawback of the method with a fixed step size is that it is generally slow in converging to the solution. (MATLAB script: steepest2.m)
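For comparison, iteration (4) with a fixed step size can be run on the same example. The value t* = 0.05 below is our choice, since the slide's numerical table is not preserved; the same minimum is reached, but in many more iterations than with the optimal step.

```python
def fixed_step_descent(x1, x2, t=0.05, tol=1e-6, max_iter=100000):
    """Iteration (4) for f(x) = x1^2 + 3 x2^2 with a fixed step t*.

    Returns the final point and the number of iterations needed to
    bring f below tol."""
    n = 0
    while x1**2 + 3 * x2**2 > tol and n < max_iter:
        x1, x2 = (1 - 2 * t) * x1, (1 - 6 * t) * x2   # x_i = x_{i-1} - t* grad f
        n += 1
    return x1, x2, n

x1, x2, n = fixed_step_descent(6.0, 3.0)   # n is well over fifty iterations here
```

With the optimal step the same tolerance is met in about a dozen iterations, which is exactly the contrast the slide draws between steepest.m and steepest2.m.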

23 Summary of Unconstrained Optimization (Cauchy's Method) Given a real-valued nonlinear function f(x) = f(x1, ..., xn) and an initial point x0, the problem is to find a solution X0 such that f(X0) is optimal (either minimum or maximum). Compute the gradient of f(x), i.e., grad f(x). Design and perform an iterative scheme x_i = x_{i-1} - t_i grad f(x_{i-1}). If the iterative scheme converges and stops at a certain step, say k, then X0 ~ x_k, and the minimum value of f(x) is then approximately given by f(x_k), i.e., f(X0) ~ f(x_k).

24 Linear Programming Linear programming (or linear optimization) consists of methods for solving optimization problems with constraints in which the objective function f is a linear function of the control variables x1, x2, ..., xn. Problems of this type arise in production, distribution of goods, economics, and approximation theory.

25 Example: A Container Production Optimization Problem: Suppose that in producing two types of containers, K and L, one uses two machines, M1 and M2. To produce a container K, one needs M1 for two minutes and M2 for four minutes. Similarly, L occupies M1 for eight minutes and M2 for four minutes. The net profit for a container K is $29 and for L it is $45. Determine the production plan that maximizes the net profit.

26 Example (cont.) Problem Formulation: If we produce x1 containers K and x2 containers L per hour, the profit per hour is f(x1, x2) = 29 x1 + 45 x2. The constraints are 2 x1 + 8 x2 <= 60 (resulting from machine M1), 4 x1 + 4 x2 <= 60 (resulting from machine M2), x1 >= 0, x2 >= 0.

27 Figure 2 shows these constraints: (x1, x2) must lie in the first quadrant, and below or on the straight line 2 x1 + 8 x2 = 60 as well as below or on the line 4 x1 + 4 x2 = 60. Thus, (x1, x2) is restricted to the quadrangle OABC, and we have to find (x1, x2) in OABC such that f(x1, x2) is maximum. Now f(x1, x2) = 0 gives x2 = -(29/45) x1, and the lines f(x1, x2) = constant are parallel to that line. We see that vertex B, that is, x1 = 10 and x2 = 5, gives the optimum f(10, 5) = 515. Hence the answer is that the optimal production plan that maximizes the profit is achieved by producing containers K and L in the ratio 2:1, the maximum profit being f = 515, i.e., $515 per hour. [Figure 2: the quadrangle OABC bounded by the M1 line, the M2 line and the axes, with level lines f = constant drawn through it.]
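Because a linear objective attains its maximum at a corner of the feasible region, the graphical answer can be cross-checked by intersecting every pair of constraint lines and keeping the feasible points. This is a small sketch of ours; the helper names are not from the slides.

```python
from itertools import combinations

# Each constraint as (a, b, c) meaning a*x1 + b*x2 <= c;
# the last two encode x1 >= 0 and x2 >= 0.
constraints = [(2, 8, 60), (4, 4, 60), (-1, 0, 0), (0, -1, 0)]

def profit(x1, x2):
    return 29 * x1 + 45 * x2

def corner_points(cons):
    """Intersect every pair of constraint lines (Cramer's rule), keep feasible points."""
    pts = []
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if det == 0:
            continue                                  # parallel lines
        x1 = (c1 * b2 - c2 * b1) / det
        x2 = (a1 * c2 - a2 * c1) / det
        if all(a * x1 + b * x2 <= c + 1e-9 for a, b, c in cons):
            pts.append((x1, x2))
    return pts

best = max(corner_points(constraints), key=lambda p: profit(*p))  # -> (10.0, 5.0)
```

The feasible corners are O(0, 0), A(15, 0), B(10, 5) and C(0, 7.5), and B gives the maximum profit 515, agreeing with the figure.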

28 Simplex Method In practice, linear programming problems might contain many more variables than the two considered in the previous example. A computational method is then required to solve the problem. One such technique is the Simplex Method. We illustrate this method through examples.

29 Example Maximize f = x1 + 4 x2 subject to the constraints x1 + 2 x2 <= 6, 5 x1 + 4 x2 <= 40, x1, x2 >= 0. Idea: Introduce slack variables w1, w2. The constraints then become x1 + 2 x2 + w1 = 6, 5 x1 + 4 x2 + w2 = 40, x1, x2, w1, w2 >= 0, and the objective function gives f - x1 - 4 x2 = 0. Solution: Simplex table (to be constructed on the next page).

30 Formulation of Simplex Method The problem is equivalent to finding one of the solutions of the following set of linear equations: x1 + 2 x2 + w1 + 0 w2 = 6, 5 x1 + 4 x2 + 0 w1 + w2 = 40, f - x1 - 4 x2 + 0 w1 + 0 w2 = 0, subject to the constraints x1, x2, w1, w2 >= 0, such that f is maximum. From what we have learnt in linear algebra, we know the solutions to the above linear equations remain unchanged under the following 3 basic operations (B.O.): B.O.1: Interchange of two equations. B.O.2: Multiplication of an equation by a nonzero constant. B.O.3: Addition of a multiple of one equation to another equation.

31 Key Idea of Simplex Method The key idea of the Simplex Method is to carry out a series of these basic operations (mainly B.O.2 and B.O.3) on the equations to transform them into a certain form from which the desired solution (the solution corresponding to the maximum f) can be easily observed and deduced. We will illustrate this idea through specific examples.

32 Simplex Table: Similar to the Matrix Form of Linear Equations

Basis | x1  x2 | w1  w2 |  b | check
w1    |  1   2 |  1   0 |  6 | 10
w2    |  5   4 |  0   1 | 40 | 50
f     | -1  -4 |  0   0 |  0 | -5

Here x1, x2 are the problem variables, w1, w2 the slack variables and b the constants. The rows correspond to x1 + 2 x2 + w1 + 0 w2 = 6, 5 x1 + 4 x2 + 0 w1 + w2 = 40 and f - x1 - 4 x2 + 0 w1 + 0 w2 = 0, and the slack-variable columns form a unity matrix.

33 Simplex Method (cont.) The check column on the right-hand side is included to provide a check on the numerical calculations as we develop the simplex table: for each row, total up the entries in that row and enter the sum in the check column. We are to perform a series of basic operations (B.O.2 and B.O.3) such that the coefficients associated with the objective function are all nonnegative. For the given example, it will be seen soon that the objective function can be transformed into f + x1 + 0 x2 + 2 w1 + 0 w2 = 12, i.e., f = 12 - x1 - 2 w1 <= 12 since x1, w1 >= 0. Clearly, the maximum of f is 12 (by setting x1 = w1 = 0).

34-36 [Slides 34 to 36 carry the step-by-step tableau updates as images; the complete table is reproduced on slide 39.]

37 f + x1 + 0 x2 + 2 w1 + 0 w2 = 12, i.e., f = 12 - x1 - 2 w1 <= 12 since x1, w1 >= 0.

38 Simplex Method (re-cap)

39 Complete Simplex Table

Basis |  x1   x2 |  w1   w2 |  b | check
w1    |   1    2 |   1    0 |  6 | 10
w2    |   5    4 |   0    1 | 40 | 50
f     |  -1   -4 |   0    0 |  0 | -5
(x2 enters the basis; B.O.2 scales the w1 row by 1/2, then B.O.3 clears the x2 column)
x2    | 1/2    1 | 1/2    0 |  3 |  5
w2    |   3    0 |  -2    1 | 28 | 30
f     |   1    0 |   2    0 | 12 | 15

All entries in the f row are now nonnegative, so the iteration stops: f_max = 12 with x1 = 0, x2 = 3. (MATLAB script: Simplex_Ex_1)
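The tableau procedure summarized above can be sketched as a small program. This is a teaching sketch assuming a bounded, non-degenerate problem with b >= 0; it is not the course's MATLAB implementation.

```python
def simplex_max(c, A, b):
    """Tableau simplex for: maximize c.x subject to A x <= b, x >= 0 (all b >= 0).

    Uses B.O.2 (scale the pivot row) and B.O.3 (clear the pivot column),
    as in the slides. Returns (f_max, x)."""
    m, n = len(A), len(c)
    # Rows [A | I | b]; the last row is the f row, f - c.x = 0, stored as [-c | 0 | 0].
    T = [[float(v) for v in A[i]] + [float(j == i) for j in range(m)] + [float(b[i])]
         for i in range(m)]
    T.append([-float(v) for v in c] + [0.0] * (m + 1))
    basis = list(range(n, n + m))          # start from the all-slack basis
    while True:
        col = min(range(n + m), key=lambda j: T[-1][j])
        if T[-1][col] >= -1e-9:            # f row nonnegative: optimum reached
            break
        # Ratio test: the leaving row keeps every b entry nonnegative.
        row = min((T[i][-1] / T[i][col], i) for i in range(m) if T[i][col] > 1e-9)[1]
        p = T[row][col]
        T[row] = [v / p for v in T[row]]   # B.O.2: scale the pivot row
        for i in range(m + 1):             # B.O.3: clear the pivot column
            if i != row and T[i][col] != 0.0:
                q = T[i][col]
                T[i] = [u - q * v for u, v in zip(T[i], T[row])]
        basis[row] = col
    x = [0.0] * n
    for i, j in enumerate(basis):
        if j < n:
            x[j] = T[i][-1]
    return T[-1][-1], x

# The example above: maximize x1 + 4 x2 s.t. x1 + 2 x2 <= 6, 5 x1 + 4 x2 <= 40.
f_max, x = simplex_max([1, 4], [[1, 2], [5, 4]], [6, 40])
```

Running it on the container problem, simplex_max([29, 45], [[2, 8], [4, 4]], [60, 60]) returns the profit 515 at (10, 5) found graphically earlier.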

40 Container Production Optimization Problem (re-visit) The Problem: Suppose that in producing two types of containers, K and L, one uses two machines, M1 and M2. To produce a container K, one needs M1 for two minutes and M2 for four minutes. Similarly, L occupies M1 for eight minutes and M2 for four minutes. The net profit for a container K is $29 and for L it is $45. Determine the production plan that maximizes the net profit per hour. Formulation: If we produce x1 containers K and x2 containers L per hour, the profit is f(x1, x2) = 29 x1 + 45 x2, subject to the constraints 2 x1 + 8 x2 <= 60, 4 x1 + 4 x2 <= 60, x1 >= 0, x2 >= 0. Introducing slack variables gives 2 x1 + 8 x2 + w1 = 60, 4 x1 + 4 x2 + w2 = 60, f - 29 x1 - 45 x2 = 0, x1, x2, w1, w2 >= 0.

41 Simplex Method

Basis |   x1    x2 |   w1   w2 |    b  | check
w1    |    2     8 |    1    0 |   60  | 71
w2    |    4     4 |    0    1 |   60  | 69
f     |  -29   -45 |    0    0 |    0  | -74
(x2 enters the basis; B.O.2 scales the w1 row by 1/8, then B.O.3 clears the x2 column; x1, x2, w1, w2 >= 0 throughout)
x2    |  1/4     1 |  1/8    0 |  7.5  | 8.875
w2    |    3     0 | -1/2    1 |   30  | 33.5
f     | -17.75   0 | 5.625   0 | 337.5 | 325.375

42 Simplex Method (cont.) The f row still has a negative entry, so x1 enters the basis; the ratio test (7.5 / (1/4) = 30 versus 30 / 3 = 10) makes w2 leave:

Basis | x1  x2 |   w1     w2  |   b | check
x2    |  0   1 |  1/6  -1/12  |   5 | 6.083
x1    |  1   0 | -1/6    1/3  |  10 | 11.167
f     |  0   0 |  8/3   71/12 | 515 | 523.583

All entries in the f row are now nonnegative, so f = 515 - (8/3) w1 - (71/12) w2 ~ 515 - 2.667 w1 - 5.917 w2 <= 515, and f_max = 515 with x1 = 10, x2 = 5. (MATLAB script: Simplex_Ex_2)

43 Simplex Method for More Variables Many problems in real life involve more than just two variables. However, the method of computation and the key idea remain basically the same. It is an iterative process which is repeated until the index row (the f row) contains no negative entry, at which point the optimal value of the objective function is attained. We illustrate the process for solving optimization problems with more variables through the following examples.

44 Example of 3 Variables [The problem data for this example are given as an image on the original slide.]

45 [Tableau iterations for the 3-variable example, with columns x1, x2, x3, w1, w2, w3, b and check: starting from the all-slack basis {w1, w2, w3}, B.O.2 and B.O.3 are applied and x2 enters the basis first.]

46 [Further tableau iterations bring x3 into the basis.] The f row finally reads f + x1 + w1 + w3 = 60, i.e., f = 60 - x1 - w1 - w3 <= 60 since x1, w1, w3 >= 0; hence f_max = 60 with x1 = 0, x2 = 4, x3 = 9. (MATLAB script: Simplex_Ex_3)

47 Example with the Constraint x1 + x2 >= 80 Maximize f = 7 x1 + 4 x2 subject to 2 x1 + x2 <= 150, 4 x1 + 3 x2 <= 350, x1 + x2 >= 80. Introduction of slack variables w1, w2, w3 gives 2 x1 + x2 + w1 = 150, 4 x1 + 3 x2 + w2 = 350, x1 + x2 - w3 = 80, x1, x2, w1, w2, w3 >= 0. This innovative procedure was suggested by Yin Mingbao, a student taking EE202 in Semester 1 of Academic Year 2009/10.

48 Complete Simplex Table

Basis | x1   x2 |   w1  w2   w3 |   b | check
w1    |  2    1 |    1   0    0 | 150 | 154
w2    |  4    3 |    0   1    0 | 350 | 358
w3    |  1    1 |    0   0   -1 |  80 | 81
f     | -7   -4 |    0   0    0 |   0 | -11
(x1 enters the basis; B.O.2 scales the w1 row by 1/2, then B.O.3 clears the x1 column)
x1    |  1  1/2 |  1/2   0    0 |  75 | 77
w2    |  0    1 |   -2   1    0 |  50 | 50
w3    |  0  1/2 | -1/2   0   -1 |   5 | 4
f     |  0 -1/2 |  7/2   0    0 | 525 | 528

49 Complete Simplex Table (cont.) Next, pivot on the intersection of the w3 row and the x2 column, which removes w3 from the basis:

Basis | x1  x2 |   w1   w2   w3 |   b | check
x1    |  1   0 |    1    0    1 |  70 | 73
w2    |  0   0 |   -1    1    2 |  40 | 42
x2    |  0   1 |   -1    0   -2 |  10 | 8
f     |  0   0 |    3    0   -1 | 530 | 532
(the f row still has a negative entry, so w3 re-enters the basis; the ratio test makes w2 leave)
x1    |  1   0 |  3/2 -1/2    0 |  50 | 52
w3    |  0   0 | -1/2  1/2    1 |  20 | 21
x2    |  0   1 |   -2    1    0 |  50 | 50
f     |  0   0 |  5/2  1/2    0 | 550 | 553

Thus f + 2.5 w1 + 0.5 w2 = 550, i.e., f = 550 - 2.5 w1 - 0.5 w2 <= 550 since w1, w2 >= 0, and f_max = 550 with x1 = 50, x2 = 50, w3 = 20. (MATLAB script: Simplex_Ex_4)
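This answer can again be cross-checked by vertex enumeration, with the >= constraint written as a negated <= row. Note that the objective f = 7 x1 + 4 x2 and the bound 150 are reconstructions inferred from the final tableau rather than values stated explicitly in the transcription, so treat them as assumptions.

```python
from itertools import combinations

# Constraints as a*x1 + b*x2 <= c; the constraint x1 + x2 >= 80 is negated,
# and the last two rows encode x1 >= 0, x2 >= 0. The objective and the
# right-hand side 150 are reconstructed assumptions (see the text above).
cons = [(2, 1, 150), (4, 3, 350), (-1, -1, -80), (-1, 0, 0), (0, -1, 0)]

def f(x1, x2):
    return 7 * x1 + 4 * x2

best = (float("-inf"), (0.0, 0.0))
for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
    det = a1 * b2 - a2 * b1
    if det == 0:
        continue                                   # parallel lines
    x1 = (c1 * b2 - c2 * b1) / det                 # Cramer's rule
    x2 = (a1 * c2 - a2 * c1) / det
    if all(a * x1 + b * x2 <= c + 1e-9 for a, b, c in cons):
        best = max(best, (f(x1, x2), (x1, x2)))

f_max, (x1_opt, x2_opt) = best
```

Under these assumptions the best feasible corner is (50, 50) with f = 550, matching the final tableau, with the runner-up corner (70, 10) giving 530.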

50 Another Example with a >= Constraint: 3 x1 + 5 x2 >= 125 Introduction of slack variables w1, w2, w3 gives 2 x1 + 3 x2 + w1 = 120, x1 + x2 + w2 = 45, 3 x1 + 5 x2 - w3 = 125, x1, x2, w1, w2, w3 >= 0. We are to solve this problem once again using the procedure suggested by Yin Mingbao.

51 Simplex Table [First tableau iterations omitted.] The f row reads f + 4 x1 + 8 w1 = 360, i.e., f = 360 - 4 x1 - 8 w1 <= 360 for x1, w1 >= 0, suggesting f_max = 360? However, the w3 row reads 8 x1 - 3 w1 + w3 = -60, which forces w3 = -60 < 0 at x1 = w1 = 0. This is a contradiction!

52 What to do next? Bring in the remaining control variable: the w3 row is the row with the problem. [Tableau omitted; basis {w1, x1, w3}.] Pivot: choose the intersection of the w3 row and the x2 column, to get rid of w3 in the basis and obtain an explicit solution for x2.

53 Simplex Table (cont.) [Final tableau iterations omitted; the pivots produce entries in eighths such as 9/8 and 5/8.] The f row becomes f + 6.5 w2 + 0.5 w3 = 280, i.e., f = 280 - 6.5 w2 - 0.5 w3 <= 280 since w2, w3 >= 0; hence f_max = 280 with x1 = 25, x2 = 20 and w1 = 10. (MATLAB script: Simplex_Ex_5)

54 Summary of Constrained Optimization (Simplex Method) Given a set of linear equations and/or inequalities (constraints), the problem is to determine a solution such that a certain objective function is maximized. Construct a simplex table with slack variables. Perform iteratively a series of basic operations (B.O.2 & B.O.3). The maximum value of the objective function is determined when all of its resulting coefficients are nonnegative.


More information

Fundamental Theorems of Optimization

Fundamental Theorems of Optimization Fundamental Theorems of Optimization 1 Fundamental Theorems of Math Prog. Maximizing a concave function over a convex set. Maximizing a convex function over a closed bounded convex set. 2 Maximizing Concave

More information

Systems Analysis in Construction

Systems Analysis in Construction Systems Analysis in Construction CB312 Construction & Building Engineering Department- AASTMT by A h m e d E l h a k e e m & M o h a m e d S a i e d 3. Linear Programming Optimization Simplex Method 135

More information

Chapter 2: Unconstrained Extrema

Chapter 2: Unconstrained Extrema Chapter 2: Unconstrained Extrema Math 368 c Copyright 2012, 2013 R Clark Robinson May 22, 2013 Chapter 2: Unconstrained Extrema 1 Types of Sets Definition For p R n and r > 0, the open ball about p of

More information

Deep Learning. Authors: I. Goodfellow, Y. Bengio, A. Courville. Chapter 4: Numerical Computation. Lecture slides edited by C. Yim. C.

Deep Learning. Authors: I. Goodfellow, Y. Bengio, A. Courville. Chapter 4: Numerical Computation. Lecture slides edited by C. Yim. C. Chapter 4: Numerical Computation Deep Learning Authors: I. Goodfellow, Y. Bengio, A. Courville Lecture slides edited by 1 Chapter 4: Numerical Computation 4.1 Overflow and Underflow 4.2 Poor Conditioning

More information

Course Summary Math 211

Course Summary Math 211 Course Summary Math 211 table of contents I. Functions of several variables. II. R n. III. Derivatives. IV. Taylor s Theorem. V. Differential Geometry. VI. Applications. 1. Best affine approximations.

More information

We saw in the previous lectures that continuity and differentiability help to understand some aspects of a

We saw in the previous lectures that continuity and differentiability help to understand some aspects of a Module 3 : Differentiation and Mean Value Theorems Lecture 9 : Roll's theorem and Mean Value Theorem Objectives In this section you will learn the following : Roll's theorem Mean Value Theorem Applications

More information

Introduction to the Simplex Algorithm Active Learning Module 3

Introduction to the Simplex Algorithm Active Learning Module 3 Introduction to the Simplex Algorithm Active Learning Module 3 J. René Villalobos and Gary L. Hogg Arizona State University Paul M. Griffin Georgia Institute of Technology Background Material Almost any

More information

CSC373: Algorithm Design, Analysis and Complexity Fall 2017 DENIS PANKRATOV NOVEMBER 1, 2017

CSC373: Algorithm Design, Analysis and Complexity Fall 2017 DENIS PANKRATOV NOVEMBER 1, 2017 CSC373: Algorithm Design, Analysis and Complexity Fall 2017 DENIS PANKRATOV NOVEMBER 1, 2017 Linear Function f: R n R is linear if it can be written as f x = a T x for some a R n Example: f x 1, x 2 =

More information

Constrained optimization. Unconstrained optimization. One-dimensional. Multi-dimensional. Newton with equality constraints. Active-set method.

Constrained optimization. Unconstrained optimization. One-dimensional. Multi-dimensional. Newton with equality constraints. Active-set method. Optimization Unconstrained optimization One-dimensional Multi-dimensional Newton s method Basic Newton Gauss- Newton Quasi- Newton Descent methods Gradient descent Conjugate gradient Constrained optimization

More information

MATH 4211/6211 Optimization Linear Programming

MATH 4211/6211 Optimization Linear Programming MATH 4211/6211 Optimization Linear Programming Xiaojing Ye Department of Mathematics & Statistics Georgia State University Xiaojing Ye, Math & Stat, Georgia State University 0 The standard form of a Linear

More information

Chapter 1: Linear Programming

Chapter 1: Linear Programming Chapter 1: Linear Programming Math 368 c Copyright 2013 R Clark Robinson May 22, 2013 Chapter 1: Linear Programming 1 Max and Min For f : D R n R, f (D) = {f (x) : x D } is set of attainable values of

More information

Econ Slides from Lecture 14

Econ Slides from Lecture 14 Econ 205 Sobel Econ 205 - Slides from Lecture 14 Joel Sobel September 10, 2010 Theorem ( Lagrange Multipliers ) Theorem If x solves max f (x) subject to G(x) = 0 then there exists λ such that Df (x ) =

More information

CPS 616 ITERATIVE IMPROVEMENTS 10-1

CPS 616 ITERATIVE IMPROVEMENTS 10-1 CPS 66 ITERATIVE IMPROVEMENTS 0 - APPROACH Algorithm design technique for solving optimization problems Start with a feasible solution Repeat the following step until no improvement can be found: change

More information

Unconstrained Multivariate Optimization

Unconstrained Multivariate Optimization Unconstrained Multivariate Optimization Multivariate optimization means optimization of a scalar function of a several variables: and has the general form: y = () min ( ) where () is a nonlinear scalar-valued

More information

CS Algorithms and Complexity

CS Algorithms and Complexity CS 50 - Algorithms and Complexity Linear Programming, the Simplex Method, and Hard Problems Sean Anderson 2/15/18 Portland State University Table of contents 1. The Simplex Method 2. The Graph Problem

More information

Week 3: Simplex Method I

Week 3: Simplex Method I Week 3: Simplex Method I 1 1. Introduction The simplex method computations are particularly tedious and repetitive. It attempts to move from one corner point of the solution space to a better corner point

More information

MAT016: Optimization

MAT016: Optimization MAT016: Optimization M.El Ghami e-mail: melghami@ii.uib.no URL: http://www.ii.uib.no/ melghami/ March 29, 2011 Outline for today The Simplex method in matrix notation Managing a production facility The

More information

MATH2070 Optimisation

MATH2070 Optimisation MATH2070 Optimisation Nonlinear optimisation with constraints Semester 2, 2012 Lecturer: I.W. Guo Lecture slides courtesy of J.R. Wishart Review The full nonlinear optimisation problem with equality constraints

More information

Numerical Optimization

Numerical Optimization Linear Programming - Interior Point Methods Computer Science and Automation Indian Institute of Science Bangalore 560 012, India. NPTEL Course on Example 1 Computational Complexity of Simplex Algorithm

More information

8. Diagonalization.

8. Diagonalization. 8. Diagonalization 8.1. Matrix Representations of Linear Transformations Matrix of A Linear Operator with Respect to A Basis We know that every linear transformation T: R n R m has an associated standard

More information

Functions of Several Variables

Functions of Several Variables Functions of Several Variables Extreme Values Philippe B Laval KSU April 9, 2012 Philippe B Laval (KSU) Functions of Several Variables April 9, 2012 1 / 13 Introduction In Calculus I (differential calculus

More information

The Simplex Method of Linear Programming

The Simplex Method of Linear Programming The Simplex Method of Linear Programming Online Tutorial 3 Tutorial Outline CONVERTING THE CONSTRAINTS TO EQUATIONS SETTING UP THE FIRST SIMPLEX TABLEAU SIMPLEX SOLUTION PROCEDURES SUMMARY OF SIMPLEX STEPS

More information

IE 5531: Engineering Optimization I

IE 5531: Engineering Optimization I IE 5531: Engineering Optimization I Lecture 15: Nonlinear optimization Prof. John Gunnar Carlsson November 1, 2010 Prof. John Gunnar Carlsson IE 5531: Engineering Optimization I November 1, 2010 1 / 24

More information

1. Introduce slack variables for each inequaility to make them equations and rewrite the objective function in the form ax by cz... + P = 0.

1. Introduce slack variables for each inequaility to make them equations and rewrite the objective function in the form ax by cz... + P = 0. 3.4 Simplex Method If a linear programming problem has more than 2 variables, solving graphically is not the way to go. Instead, we ll use a more methodical, numeric process called the Simplex Method.

More information

Introduction to gradient descent

Introduction to gradient descent 6-1: Introduction to gradient descent Prof. J.C. Kao, UCLA Introduction to gradient descent Derivation and intuitions Hessian 6-2: Introduction to gradient descent Prof. J.C. Kao, UCLA Introduction Our

More information

School of Business. Blank Page

School of Business. Blank Page Maxima and Minima 9 This unit is designed to introduce the learners to the basic concepts associated with Optimization. The readers will learn about different types of functions that are closely related

More information

MVE165/MMG631 Linear and integer optimization with applications Lecture 13 Overview of nonlinear programming. Ann-Brith Strömberg

MVE165/MMG631 Linear and integer optimization with applications Lecture 13 Overview of nonlinear programming. Ann-Brith Strömberg MVE165/MMG631 Overview of nonlinear programming Ann-Brith Strömberg 2015 05 21 Areas of applications, examples (Ch. 9.1) Structural optimization Design of aircraft, ships, bridges, etc Decide on the material

More information

September Math Course: First Order Derivative

September Math Course: First Order Derivative September Math Course: First Order Derivative Arina Nikandrova Functions Function y = f (x), where x is either be a scalar or a vector of several variables (x,..., x n ), can be thought of as a rule which

More information

Professor Alan H. Stein October 31, 2007

Professor Alan H. Stein October 31, 2007 Mathematics 05 Professor Alan H. Stein October 3, 2007 SOLUTIONS. For most maximum problems, the contraints are in the form l(x) k, where l(x) is a linear polynomial and k is a positive constant. Explain

More information

The Steepest Descent Algorithm for Unconstrained Optimization

The Steepest Descent Algorithm for Unconstrained Optimization The Steepest Descent Algorithm for Unconstrained Optimization Robert M. Freund February, 2014 c 2014 Massachusetts Institute of Technology. All rights reserved. 1 1 Steepest Descent Algorithm The problem

More information

Constrained and Unconstrained Optimization Prof. Adrijit Goswami Department of Mathematics Indian Institute of Technology, Kharagpur

Constrained and Unconstrained Optimization Prof. Adrijit Goswami Department of Mathematics Indian Institute of Technology, Kharagpur Constrained and Unconstrained Optimization Prof. Adrijit Goswami Department of Mathematics Indian Institute of Technology, Kharagpur Lecture - 01 Introduction to Optimization Today, we will start the constrained

More information

2.098/6.255/ Optimization Methods Practice True/False Questions

2.098/6.255/ Optimization Methods Practice True/False Questions 2.098/6.255/15.093 Optimization Methods Practice True/False Questions December 11, 2009 Part I For each one of the statements below, state whether it is true or false. Include a 1-3 line supporting sentence

More information

Introduction to Operations Research Prof. G. Srinivasan Department of Management Studies Indian Institute of Technology, Madras

Introduction to Operations Research Prof. G. Srinivasan Department of Management Studies Indian Institute of Technology, Madras Introduction to Operations Research Prof. G. Srinivasan Department of Management Studies Indian Institute of Technology, Madras Module - 03 Simplex Algorithm Lecture 15 Infeasibility In this class, we

More information

Lecture Notes for Chapter 9

Lecture Notes for Chapter 9 Lecture Notes for Chapter 9 Kevin Wainwright April 26, 2014 1 Optimization of One Variable 1.1 Critical Points A critical point occurs whenever the firest derivative of a function is equal to zero. ie.

More information

Week 4: Calculus and Optimization (Jehle and Reny, Chapter A2)

Week 4: Calculus and Optimization (Jehle and Reny, Chapter A2) Week 4: Calculus and Optimization (Jehle and Reny, Chapter A2) Tsun-Feng Chiang *School of Economics, Henan University, Kaifeng, China September 27, 2015 Microeconomic Theory Week 4: Calculus and Optimization

More information

A = Chapter 6. Linear Programming: The Simplex Method. + 21x 3 x x 2. C = 16x 1. + x x x 1. + x 3. 16,x 2.

A = Chapter 6. Linear Programming: The Simplex Method. + 21x 3 x x 2. C = 16x 1. + x x x 1. + x 3. 16,x 2. Chapter 6 Linear rogramming: The Simple Method Section The Dual roblem: Minimization with roblem Constraints of the Form Learning Objectives for Section 6. Dual roblem: Minimization with roblem Constraints

More information

MATH 445/545 Test 1 Spring 2016

MATH 445/545 Test 1 Spring 2016 MATH 445/545 Test Spring 06 Note the problems are separated into two sections a set for all students and an additional set for those taking the course at the 545 level. Please read and follow all of these

More information

1 Newton s Method. Suppose we want to solve: x R. At x = x, f (x) can be approximated by:

1 Newton s Method. Suppose we want to solve: x R. At x = x, f (x) can be approximated by: Newton s Method Suppose we want to solve: (P:) min f (x) At x = x, f (x) can be approximated by: n x R. f (x) h(x) := f ( x)+ f ( x) T (x x)+ (x x) t H ( x)(x x), 2 which is the quadratic Taylor expansion

More information

Ω R n is called the constraint set or feasible set. x 1

Ω R n is called the constraint set or feasible set. x 1 1 Chapter 5 Linear Programming (LP) General constrained optimization problem: minimize subject to f(x) x Ω Ω R n is called the constraint set or feasible set. any point x Ω is called a feasible point We

More information

EAD 115. Numerical Solution of Engineering and Scientific Problems. David M. Rocke Department of Applied Science

EAD 115. Numerical Solution of Engineering and Scientific Problems. David M. Rocke Department of Applied Science EAD 115 Numerical Solution of Engineering and Scientific Problems David M. Rocke Department of Applied Science Taylor s Theorem Can often approximate a function by a polynomial The error in the approximation

More information

MATH2070 Optimisation

MATH2070 Optimisation MATH2070 Optimisation Linear Programming Semester 2, 2012 Lecturer: I.W. Guo Lecture slides courtesy of J.R. Wishart Review The standard Linear Programming (LP) Problem Graphical method of solving LP problem

More information

g(x,y) = c. For instance (see Figure 1 on the right), consider the optimization problem maximize subject to

g(x,y) = c. For instance (see Figure 1 on the right), consider the optimization problem maximize subject to 1 of 11 11/29/2010 10:39 AM From Wikipedia, the free encyclopedia In mathematical optimization, the method of Lagrange multipliers (named after Joseph Louis Lagrange) provides a strategy for finding the

More information

Business Calculus

Business Calculus Business Calculus 978-1-63545-025-5 To learn more about all our offerings Visit Knewtonalta.com Source Author(s) (Text or Video) Title(s) Link (where applicable) OpenStax Senior Contributing Authors: Gilbert

More information

Suppose that the approximate solutions of Eq. (1) satisfy the condition (3). Then (1) if η = 0 in the algorithm Trust Region, then lim inf.

Suppose that the approximate solutions of Eq. (1) satisfy the condition (3). Then (1) if η = 0 in the algorithm Trust Region, then lim inf. Maria Cameron 1. Trust Region Methods At every iteration the trust region methods generate a model m k (p), choose a trust region, and solve the constraint optimization problem of finding the minimum of

More information

Scientific Computing: An Introductory Survey

Scientific Computing: An Introductory Survey Scientific Computing: An Introductory Survey Chapter 6 Optimization Prof. Michael T. Heath Department of Computer Science University of Illinois at Urbana-Champaign Copyright c 2002. Reproduction permitted

More information

Scientific Computing: An Introductory Survey

Scientific Computing: An Introductory Survey Scientific Computing: An Introductory Survey Chapter 6 Optimization Prof. Michael T. Heath Department of Computer Science University of Illinois at Urbana-Champaign Copyright c 2002. Reproduction permitted

More information

The Simplex Method. Formulate Constrained Maximization or Minimization Problem. Convert to Standard Form. Convert to Canonical Form

The Simplex Method. Formulate Constrained Maximization or Minimization Problem. Convert to Standard Form. Convert to Canonical Form The Simplex Method 1 The Simplex Method Formulate Constrained Maximization or Minimization Problem Convert to Standard Form Convert to Canonical Form Set Up the Tableau and the Initial Basic Feasible Solution

More information

NATIONAL OPEN UNIVERSITY OF NIGERIA SCHOOL OF SCIENCE AND TECHNOLOGY COURSE CODE: MTH 309 COURSE TITLE: OPTIMIZATION THEORY

NATIONAL OPEN UNIVERSITY OF NIGERIA SCHOOL OF SCIENCE AND TECHNOLOGY COURSE CODE: MTH 309 COURSE TITLE: OPTIMIZATION THEORY NATIONAL OPEN UNIVERSITY OF NIGERIA SCHOOL OF SCIENCE AND TECHNOLOGY COURSE CODE: MTH 39 COURSE TITLE: OPTIMIZATION THEORY MTH 39 OPTIMIZATION THEORY Prof. U. A. Osisiogu August 28, 22 CONTENTS I 7 Linear

More information

Linear programming: algebra

Linear programming: algebra : algebra CE 377K March 26, 2015 ANNOUNCEMENTS Groups and project topics due soon Announcements Groups and project topics due soon Did everyone get my test email? Announcements REVIEW geometry Review geometry

More information

Contents. 4.5 The(Primal)SimplexMethod NumericalExamplesoftheSimplexMethod

Contents. 4.5 The(Primal)SimplexMethod NumericalExamplesoftheSimplexMethod Contents 4 The Simplex Method for Solving LPs 149 4.1 Transformations to be Carried Out On an LP Model Before Applying the Simplex Method On It... 151 4.2 Definitions of Various Types of Basic Vectors

More information