GENETIC ALGORITHMS WITH AN APPLICATION TO NONLINEAR TRANSPORTATION PROBLEMS

MATTHEW HOBBS
ISOR, VICTORIA UNIVERSITY WELLINGTON

Abstract

In this paper I will describe the theory and implementation of Genetic Algorithms. This is followed by a description of an attempt to solve the nonlinear transportation problem using a genetic algorithm approach (Michalewicz, Vignaux & Hobbs [1990]). This is a continuation of earlier research in which a system was built for the standard linear transportation problem (Vignaux & Michalewicz [1990]). Results of this technique are promising.

1 Genetic Programming

Much of problem solving in Operations Research involves optimization using one of a variety of techniques. The most extensively used algorithm in practice is that of Linear Programming. Like LP, most algorithms apply only to relatively small problem domains. Some algorithms guarantee an optimal solution while others are heuristics that aim to find an approximate or good solution. This still leaves a wide range of problems for which there are no useful algorithms.

In general, algorithms take advantage of particular structural properties of the problem in order to solve it. This approach fails for most complex problems. In many cases the available information may be little more than the payoff of any particular solution (e.g. cost) and the expression of the solution (e.g. a production schedule). In this case the procedure for payoff optimization is unclear. If we have a given solution, with a determined payoff, how do we know if it is optimal? How do we determine what part of the solution expression should be changed if it is not? With only the one solution we can usually answer none of these questions. The key is to incorporate comparisons with other solutions into the algorithm. Methods of the type described above are usually probabilistic search techniques.
In the simplest case (with the fewest assumptions made), completely random search operates by randomly selecting another solution (independently of the history of the search) and remembering only the best solution so far. The obvious

disadvantage is the time cost involved (when the feasible set is of any size). A more sensible approach is to somehow bias the random search to home in on the best solutions quickly. An area where random search can be improved is in using the history of the search to select new (trial) solutions. The question becomes one of asking: how do we most efficiently use the history to produce the trial solutions?

Simulated Annealing is an algorithm whereby the history at any stage in the algorithm is the current solution. A trial solution is chosen by modifying the current solution in some way (mutation) and the new current solution is chosen probabilistically between the trial and current solutions, according to their relative payoff. In Simulated Annealing the search history used at any step is only the current solution; previous solutions are discarded.

Genetic Algorithms use the idea that previous solutions should not be discarded immediately and that useful information can be obtained by maintaining groups of solutions as the search history. The primary problem is how to combine the pool of solutions to create new (better) solutions. The concept of maintaining a pool of solutions in order to generate new trial solutions is analogous to that of natural evolution, with genetic code being the expression of an individual solution. New solutions are created by combining code from existing solutions (crossover). The optimizational tendency occurs when current solutions are discarded from the pool (i.e. die) according to lack of payoff. Algorithms that include ideas such as these are called Genetic Algorithms (Holland [1975], Davis [1987], Goldberg [1989]).

A typical Genetic Algorithm starts with a population of randomly generated solutions (the initial population) and repeatedly applies genetic operators modeled on natural genetic processes (e.g. crossover, mutation) to the population. Consider this extract from Davis [1987]: "...
the metaphor underlying genetic algorithms is that of natural evolution. In evolution, the problem each species faces is one of searching for beneficial adaptations to a complicated and changing environment. The knowledge that each species has gained is embodied in the makeup of the chromosomes of its members. The operations that alter this chromosomal makeup are applied when parents reproduce; among them are random mutation, inversion of chromosomal material, and crossover - exchange of chromosomal material between two parents' chromosomes."

Theory

Any run of a genetic algorithm is essentially a simulation of the evolution of a set of solutions exposed to constraints and performance (the environment). Those individuals (solutions) better suited to the environment will, in general, have a greater chance of survival and therefore a greater chance of producing offspring. Although this may initially seem a hill-climbing process, the algorithm at the same time maintains population diversity within the search space. The ability to maintain diversity enables the algorithm to escape local optima and leap tall buildings without too much ado. Conversely, near a global optimum, the algorithm has difficulty homing in on the precise solutions, due to the blindness of the random mutations.
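The fitness-biased survival described above can be sketched as roulette-wheel (fitness-proportional) selection. This is a minimal illustration under my own naming, not the system described later in the paper:

```python
import random

def roulette_select(population, fitness, k):
    """Select k parents with probability proportional to fitness:
    better-suited solutions have a greater chance of producing offspring."""
    total = sum(fitness)
    weights = [f / total for f in fitness]
    return random.choices(population, weights=weights, k=k)

pop = ["A", "B", "C", "D"]
fit = [1.0, 4.0, 2.0, 3.0]  # "B" is fittest, so it is selected most often on average
parents = roulette_select(pop, fit, k=100)
```

Note that weaker solutions still have a nonzero chance of reproducing, which is one source of the population diversity discussed above.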

The expression of a solution (chromosome) for various problems will vary considerably but can be considered as a collection of elements (genes). Ideally we would like to be able to treat each gene separately for optimization purposes. This is not usually feasible, due to the often highly nonlinear or constrained nature of the problem. The only information we have may be the payoff of the chromosome as a whole. A feature of the genetic algorithm is that although it appears to involve only competition between chromosomes, due to the nature of the genetic operators (e.g. crossover) - which manipulate genes directly - competition in fact occurs at the gene level. This makes sense when we consider that new solutions are created with only a subset of genes differing from their parent(s).

A useful term to use is that of a schema - see Holland [1975] or Goldberg [1989]. A schema (in this context) can be considered as a similarity template for chromosomes, defining a set of fixed genes. For example, fixing gene number 3 at 'yes' and gene number 7 at 'no' (and leaving all other genes free) defines a schema for the chromosomes. To be more accurate than the previous paragraph, competition is at the schema level. Although a genetic algorithm processes only n structures per generation, it also processes of the order of n^3 schemata. Holland has named this important result implicit parallelism.

It is generally accepted that for any problem to be attempted by a genetic algorithm there must be five components, as follows.

1. A genetic representation of solutions to the problem (chromosome),
2. A way to create random solutions (for the initialization),
3. An evaluation function (the environment), rating solutions by fitness,
4. Genetic operators to alter child composition during reproduction, and
5. Values for the parameters that the genetic algorithm uses (population size, probabilities of applying genetic operators, etc.)
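The schema idea can be made concrete with a small sketch (my own illustration): a schema fixes some gene positions, leaves the rest free, and a chromosome instantiates every schema it matches.

```python
FREE = "*"  # wildcard: this gene position is left free

def matches(schema, chromosome):
    """A chromosome instantiates a schema if it agrees on every fixed gene."""
    return all(s == FREE or s == g for s, g in zip(schema, chromosome))

# Gene 3 fixed at 'yes' (1) and gene 7 fixed at 'no' (0), all others free
# (genes numbered from 1, as in the text):
schema = [FREE, FREE, 1, FREE, FREE, FREE, 0, FREE]

assert matches(schema, [0, 1, 1, 0, 1, 1, 0, 0])      # gene 3 = 1, gene 7 = 0
assert not matches(schema, [0, 1, 0, 0, 1, 1, 0, 0])  # gene 3 wrong
```

An 8-gene binary chromosome instantiates 2^8 = 256 schemata (each position either fixed at its value or free), which gives a feel for how a population of n chromosomes implicitly processes many more than n schemata.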
For a variety of reasons most applications of the genetic algorithm have used binary (bit string) representations for the solutions (chromosomes). But the implicit parallelism result is not restricted to bit strings (see Antonisse [1989]). The application in this paper uses a matrix of real numbers. Although it is true that any structure that can be put on a computer can be represented as a binary vector (string), the difference is that the schemata for the richer structure are much more relevant, and there are many fewer of them.

Implementation

A genetic algorithm will typically be of the following form.

1. create an initial set of random solutions (the initial population);
2. evaluate the solutions (see how good they are);
3. put the better solutions into a set;

4. try to combine these solutions to produce child solutions;
5. discard poorer solutions to maintain population size;
6. repeat the above three steps until we get a good solution or resources run out.

There are two main genetic operators, mutation and crossover. The mutation operator arbitrarily alters one or more components of a selected structure; this increases the variability of the population (introducing new - or lost - schemata). In general, each gene of each chromosome in the population undergoes a random change with a probability equal to the mutation rate. The crossover operator combines the features of two parent structures to form two similar offspring. It operates by swapping corresponding segments of the chromosomes representing the parent solutions (with the objective of combining good schemata to produce better schemata - with more fixed genes).

When problems are found to be too specialized or complex for the standard algorithms it is necessary to take a heuristic approach to the problem. Genetic algorithms can be a powerful tool in such cases due to the fact that they can combine the information from various solutions (and at the same time maintain the diversity required to try to span the search space). Currently there is no programming language specific to this problem domain, so systems have had to be set up in other languages as available. It should be noted that when the parameters to the problem are changed (e.g. a different problem of the same type) the programs set up will require very little change. For example, if the objective function (goal) of the problem changes it is only necessary to change the evaluation procedure and no other. Such systems are more flexible to changes than most. For example, a problem being solved using Linear Programming is fine until nonlinearity is introduced, in which case another algorithm is required.
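The two operators can be sketched on bit-string chromosomes as follows. This is a minimal illustration in my own notation (one-point crossover); the application in this paper uses a matrix representation instead:

```python
import random

def crossover(parent1, parent2):
    """One-point crossover: swap corresponding tail segments of two parents
    to form two similar offspring."""
    point = random.randrange(1, len(parent1))
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

def mutate(chromosome, rate):
    """Each gene undergoes a random change (here, a bit flip) with
    probability equal to the mutation rate."""
    return [1 - g if random.random() < rate else g for g in chromosome]

p1, p2 = [0] * 8, [1] * 8
c1, c2 = crossover(p1, p2)
child = mutate(c1, rate=0.2)
```

Crossover conserves the combined gene pool of the two parents; mutation is what reintroduces schemata that the population has lost.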
Genetic algorithms can compete very effectively against standard nonlinear algorithms on nonlinear problems. They may be best used in combination with nonlinear algorithms for such problems, for example using a nonlinear algorithm to home in on the local optima.

There are a number of applications to which genetic algorithms have been applied, including: physical design of circuit boards; the travelling salesman (node-covering) problem; combinatorial problems in general; game strategy; nonlinear problems; optical design; keyboard configuration; machine learning; simulation of evolution; and gas pipeline optimization. In fact any problem that meets the five properties listed previously can be attempted by a genetic algorithm.

2 A Genetic Algorithm for the Nonlinear Transportation Problem

In testing the use of the genetic algorithm on the linear transportation problem (see Vignaux & Michalewicz [1989], [1990]) it is possible to compare its solution with the known optimum found using the standard algorithm and therefore to determine how efficient or otherwise the genetic algorithm is in absolute terms. Once we move to nonlinear objective functions, the optimum may not be known. Testing is reduced to comparing the results with those of other nonlinear solution methods that may themselves have converged to a local optimum.

The genetic algorithm will be compared with the GAMS nonlinear system (with the MINOS optimizer) as a typical example of an industry-standard efficient method of solution. This system, being essentially a gradient-driven method, found some of the problems set up difficult or impossible to solve. In these cases modifications to the objective functions were made so that the method could at least find an approximate solution.

The genetic system itself was written by Z. Michalewicz in the C programming language. The parameters required include (as well as the problem description): number of iterations, population size, mutation and crossover rates, and random number starting seed.

The behaviour of nonlinear optimization algorithms depends markedly on the form of the objective function. It is clear that different solution techniques may behave differently. In the experiments the overall objective function is the sum of the arc objective functions; thus there were no cross terms. Six different arc objective functions were used, including: a step function (with 5 equal steps); a discretely changing slope (3 zones of a particular gradient); square; square-root; a function with a peak; and a linearly increasing sine function. They were used in conjunction with five problem structures in which the supply and demand vectors and the corresponding parameter matrix were randomly generated. The solutions to these problems are unknown. The objective function for the transportation problem was thus of the form

    F(x) = sum over all arcs (i, j) of f(x_ij)

where f is one of the six arc objective functions and x_ij is the flow on the arc from source i to destination j.

Experiments and results

The population size was fixed at 40. The mutation rate was 20% with the proportion of mutation-1 being 50%, and the crossover rate was 5%. Problems were run for 10,000 generations. The systems were tested on the six functions for five randomly generated cost matrices. See the second graph page for a graphical display of the results.
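The separable objective above can be sketched directly in code. This is my own illustration (the names are assumptions), using the square-root case as the arc objective function:

```python
import math

def transport_cost(x, arc_cost):
    """Total cost of a shipment matrix x: the arc objective function
    applied to each arc flow x[i][j] and summed (no cross terms)."""
    return sum(arc_cost(flow) for row in x for flow in row)

sqrt_cost = math.sqrt  # the square-root arc objective function (case D)

# A 2x2 shipment matrix (rows: sources, columns: destinations)
x = [[4.0, 9.0],
     [0.0, 16.0]]
print(transport_cost(x, sqrt_cost))  # 2 + 3 + 0 + 4 = 9.0
```

Because the objective is a sum over arcs, swapping in a different arc function (step, peaked, sine, etc.) changes only `arc_cost`, which matches the earlier point that only the evaluation procedure needs to change.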
The genetic system was run on SPARCstation/SUN terminals while GAMS was run on an Olivetti 386 with maths co-processor. Although time comparisons between the two machines are difficult to make, it should be noted that in general GAMS finished each run well before the genetic system. An exception is case A (in which GAMS evaluates numerous arc-tangent functions), where the genetic algorithm took no more than 15 minutes to complete while GAMS averaged about twice that. For cases A, B, and D, where the GAMS parameters meant that multiple runs had to be performed to find the best GAMS solution, the genetic system overall was much faster.

Conclusions

The transportation problem was chosen as it provided a relatively simple convex feasible set. This means that it is easier to ensure feasibility in the solutions. The

procedure was then to look at the effects that the objective function alone has on the solving of the problem.

For the class of practical problems the genetic system is, on average, better than GAMS by ...% in case A and by 11.6% in case B. For the reasonable functions the results were different. In case C (the square function) the genetic system performed worse by 7.6%, while in case D (the square-root function) the genetic system was worse by only ... For the other class of functions the genetic system is dominant. The genetic system resulted in improvements of 32.9% and 55.1% over GAMS, averaging over the five problems. This demonstrates the superiority of the genetic method over other systems, which are very often limited to certain classes of problem functions.

GAMS did well on the smooth/monotonic (reasonable) functions; it is in these cases that gradient measurement techniques are most apt. In case C GAMS bettered the genetic system with much less cost in solving time. For the practical problems, the gradient techniques have difficulty seeing around the corner to new zones of better costs. The genetic algorithm, taking a more structural approach, is able to jump between zones readily, resulting in much better solutions. The other problems, although they are both smooth, have significant structural features. Like the practical problems, but even more so, the genetic system did much better than GAMS.

REFERENCES

Antonisse, J., A New Interpretation of Schema Notation that Overturns the Binary Encoding Constraint, Proceedings of the Third International Conference on Genetic Algorithms (J. David Schaffer, editor), George Mason University, June 4-7, 1989, pp. 86-91.

Davis, L. (editor), Genetic Algorithms and Simulated Annealing, Pitman, London, 1987.

Goldberg, D.E., Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, 1989.
Holland, J., Adaptation in Natural and Artificial Systems, Ann Arbor: University of Michigan Press, 1975.

Michalewicz, Z., Vignaux, G.A., Hobbs, M., A Genetic Algorithm for the Nonlinear Transportation Problem, submitted to the Operations Research Society of America (ORSA) Journal on Computing.

Vignaux, G.A., Michalewicz, Z., Genetic Algorithms for the Transportation Problem, Proceedings of the 4th International Symposium on Methodologies for Intelligent Systems, Charlotte, October 12-14, 1989.

Vignaux, G.A., Michalewicz, Z., A Genetic Algorithm for the Linear Transportation Problem, submitted to IEEE Transactions on Systems, Man, and Cybernetics.