Chapter 8: Introduction to Evolutionary Computation


Computational Intelligence: Second Edition

Contents

Some Theories about Evolution Evolution can be viewed as an optimization process: the aim is to improve the ability of an organism to survive in dynamically changing and competitive environments. Two main theories of biological evolution are the Lamarckian view (Lamarck, 1744-1829) and the Darwinian view (Darwin, 1809-1882).

Lamarckian Evolution Heredity, i.e. the inheritance of acquired traits. Individuals adapt during their lifetimes and transmit their traits to their offspring; the offspring continue to adapt. The method of adaptation rests on the concept of use and disuse: over time, individuals lose characteristics they do not require and develop those which are useful by exercising them.

Darwinian Evolution Theory of natural selection and survival of the fittest. In a world with limited resources and stable populations, each individual competes with others for survival. Those individuals with the best characteristics (traits) are more likely to survive and to reproduce, and those characteristics will be passed on to their offspring. These desirable characteristics are inherited by the following generations and, over time, become dominant among the population. During production of a child organism, random events cause random changes to the child organism's characteristics. What about Alfred Wallace, who arrived at the theory of natural selection independently of Darwin?

Evolutionary Computation Evolutionary computation (EC) refers to computer-based problem-solving systems that use computational models of evolutionary processes, such as natural selection, survival of the fittest, and reproduction, as their fundamental components.

Main Components of an Evolutionary Algorithm Evolution via natural selection of a randomly chosen population of individuals can be thought of as a search through the space of possible chromosome values. In this sense, an evolutionary algorithm (EA) is a stochastic search for an optimal solution to a given problem. Main components of an EA: an encoding of solutions to the problem as a chromosome; a function to evaluate the fitness, or survival strength, of individuals; initialization of the initial population; selection operators; and reproduction operators.

Generic Evolutionary Algorithm (Algorithm 8.1):

Let t = 0 be the generation counter;
Create and initialize an n_x-dimensional population, C(0), to consist of n_s individuals;
while stopping condition(s) not true do
    Evaluate the fitness, f(x_i(t)), of each individual, x_i(t);
    Perform reproduction to create offspring;
    Select the new population, C(t + 1);
    Advance to the new generation, i.e. t = t + 1;
end
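As a minimal sketch, Algorithm 8.1 could look as follows in Python. The five callables (init_pop, fitness, reproduce, select, stop) are hypothetical problem-specific components, not part of the text; the OneMax demo below them is purely illustrative.

```python
import random

def evolve(init_pop, fitness, reproduce, select, stop, max_gens=100):
    """Sketch of Algorithm 8.1: a generic evolutionary algorithm."""
    t = 0
    population = init_pop()                              # C(0): n_s individuals
    while t < max_gens and not stop(population, t):
        offspring = reproduce(population)                # crossover/mutation
        population = select(population + offspring)      # C(t + 1)
        t += 1                                           # t = t + 1
    return max(population, key=fitness)

# Tiny OneMax demo: maximize the number of 1-bits in an 8-bit chromosome.
random.seed(0)
n_s, n_x = 20, 8
fitness = sum
init = lambda: [[random.randint(0, 1) for _ in range(n_x)] for _ in range(n_s)]

def reproduce(pop):
    kids = []
    for _ in range(n_s):
        kid = random.choice(pop)[:]
        kid[random.randrange(n_x)] ^= 1      # single bit-flip mutation
        kids.append(kid)
    return kids

select = lambda pool: sorted(pool, key=fitness, reverse=True)[:n_s]
best = evolve(init, fitness, reproduce, select,
              stop=lambda pop, t: max(map(fitness, pop)) == n_x)
```

Here selection is elitist truncation, so the best fitness never decreases from one generation to the next.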

Evolutionary Computation Paradigms
Genetic algorithms, which model genetic evolution.
Genetic programming, based on genetic algorithms, but where individuals are programs.
Evolutionary programming, derived from the simulation of adaptive behavior in evolution (i.e. phenotypic evolution).
Evolution strategies, geared toward modeling the strategic parameters that control variation in evolution, i.e. the evolution of evolution.
Differential evolution, similar to genetic algorithms but differing in the reproduction mechanism used.
Cultural evolution, which models the evolution of culture.
Co-evolution, where individuals evolve in competition or cooperation with one another.

The Chromosome Characteristics of individuals are represented by long strings of information contained in the chromosomes of the organism. Chromosomes are structures of compact, intertwined molecules of DNA found in the nucleus of organic cells. Each chromosome contains a large number of genes, where a gene is the unit of heredity. An alternative form of a gene is referred to as an allele.

The Artificial Chromosome In the context of EC, the following notation is used:
Chromosome (genome) - candidate solution
Gene - a single variable to be optimized
Allele - assignment of a value to a variable
Two classes of evolutionary information: A genotype describes the genetic composition of an individual, as inherited from its parents. A phenotype is the expressed behavioral traits of an individual in a specific environment.

Binary Representations Binary-valued variables: x_i ∈ {0, 1}^{n_x}, i.e. each x_ij ∈ {0, 1}. Nominal-valued variables: each nominal value can be encoded as an n_d-dimensional bit vector, where 2^{n_d} is the total number of discrete nominal values for that variable. Continuous-valued variables: map the continuous search space to a binary-valued search space, i.e. each continuous-valued variable is mapped to an n_d-dimensional bit vector: φ : R → {0, 1}^{n_d}

Binary Representations (cont) Transform each individual x = (x_1, ..., x_j, ..., x_{n_x}) with x_j ∈ R to the binary-valued individual b = (b_1, ..., b_j, ..., b_{n_x}), where b_j = (b_{(j-1)n_d + 1}, ..., b_{j n_d}) with b_l ∈ {0, 1}, and the total number of bits is n_b = n_x n_d

Binary Representations (cont) Decoding each b_j back to a floating-point representation uses the mapping Φ_j : {0, 1}^{n_d} → [x_{min,j}, x_{max,j}], defined as

Φ_j(b) = x_{min,j} + ((x_{max,j} - x_{min,j}) / (2^{n_d} - 1)) Σ_{l=0}^{n_d - 1} b_{j n_d - l} 2^l
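A sketch of this decoding, plus the inverse quantization, for a single gene (the function names and the bit-list convention, most significant bit first, are ours):

```python
def decode(bits, x_min, x_max):
    """Phi_j: map an n_d-bit gene back to a float in [x_min, x_max]."""
    n_d = len(bits)
    value = sum(b << l for l, b in enumerate(reversed(bits)))  # integer value
    return x_min + (x_max - x_min) * value / (2 ** n_d - 1)

def encode(x, x_min, x_max, n_d):
    """Quantize x in [x_min, x_max] onto n_d bits (the inverse mapping)."""
    value = round((x - x_min) / (x_max - x_min) * (2 ** n_d - 1))
    return [(value >> l) & 1 for l in range(n_d - 1, -1, -1)]
```

Note that a round trip through encode/decode loses at most half a quantization step, which is exactly the accuracy limit discussed below.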

Binary Representations (cont) Problems with using a binary representation: Conversion from a floating-point value to a bitstring of n_d bits has a maximum attainable accuracy of (x_{max,j} - x_{min,j}) / (2^{n_d} - 1) for each vector component, j = 1, ..., n_x. Binary coding introduces Hamming cliffs: a Hamming cliff is formed when two numerically adjacent values have bit representations that are far apart, e.g. 7_10 = 0111_2 vs 8_10 = 1000_2. A large change in the bit representation is then needed to achieve a small change in the solution, and hence in the fitness.

Gray Coding In Gray coding, the Hamming distance between the representations of successive numerical values is one. Converting a binary string b_1 b_2 ... b_{n_b} to a Gray bit string, where b_1 is the most significant bit:

g_1 = b_1
g_l = b_{l-1} XOR b_l

For 3-bit representations:

Value  Binary  Gray
0      000     000
1      001     001
2      010     011
3      011     010
4      100     110
5      101     111
6      110     101
7      111     100
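The conversion formulas above, and their inverse (each binary bit is the XOR of all Gray bits up to that position), can be sketched as follows; bits are lists with the most significant bit first:

```python
def binary_to_gray(b):
    """g_1 = b_1; g_l = b_{l-1} XOR b_l."""
    return [b[0]] + [b[l - 1] ^ b[l] for l in range(1, len(b))]

def gray_to_binary(g):
    """Inverse: binary bit l is the XOR (running parity) of g_1, ..., g_l."""
    b, acc = [], 0
    for bit in g:
        acc ^= bit
        b.append(acc)
    return b
```

Walking through the 3-bit table confirms that successive Gray codes differ in exactly one bit, which is the point of the encoding.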

Gray Coding (cont) A Gray code representation b_j can be converted to a floating-point representation using

Φ_j(b) = x_{min,j} + ((x_{max,j} - x_{min,j}) / (2^{n_d} - 1)) Σ_{l=1}^{n_d} [ ( Σ_{q=1}^{l} b_{(j-1)n_d + q} ) mod 2 ] 2^{n_d - l}

Other Representations Other representations include: Real-valued representations, where x_ij ∈ R. Integer-valued representations, where x_ij ∈ Z. Discrete-valued representations, where x_ij ∈ dom(x_j). Tree-based representations, as used in genetic programming. Mixed representations.

Initialization A population of candidate solutions is maintained. Values are randomly assigned to each gene from the domain of the corresponding variable. Uniform random initialization is used to ensure that the initial population is a uniform representation of the entire search space. What are the consequences of the size of the initial population? Computational complexity per generation; number of generations to converge; quality of solutions obtained; exploration ability.
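For continuous-valued genes, uniform random initialization can be sketched as follows (the function name and the [lo, hi] bounds-per-variable setup are illustrative assumptions):

```python
import random

def init_population(n_s, n_x, domains):
    """Create n_s individuals of n_x genes, each gene drawn uniformly
    from its variable's domain, given as (lo, hi) bounds."""
    assert len(domains) == n_x
    return [[random.uniform(lo, hi) for (lo, hi) in domains]
            for _ in range(n_s)]
```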

Survival of the Fittest Based on the Darwinian model, individuals with the best characteristics have the best chance to survive and to reproduce. A fitness function is used to determine the ability of an individual of an EA to survive. The fitness function, f, maps a chromosome representation into a scalar value: f : Γ^{n_x} → R, where Γ represents the data type of the elements of an n_x-dimensional chromosome.

Objective Function The fitness function represents the objective function, Ψ, which defines the optimization problem. Sometimes the chromosome representation does not correspond to the representation expected by the objective function, in which case f is a composition of mappings:

f : S_C -Φ→ S_X -Ψ→ R -Υ→ R+

S_C represents the search space of the chromosome; Φ represents the chromosome decoding function; S_X represents the search space of the objective function; Ψ represents the objective function; Υ represents a scaling function. For example:

f : {0, 1}^{n_b} -Φ→ R^{n_x} -Ψ→ R -Υ→ R+

Objective Function Types Types of objective functions, resulting in different problem types: Unconstrained objective functions, but still subject to boundary constraints Constrained objective functions Multi-objective functions, where more than one objective has to be optimized Dynamic or noisy objective functions

Selection Operators Selection relates directly to the concept of survival of the fittest. The main objective of selection operators is to emphasize better solutions. Selection occurs at two steps in an EA: selection of the new population, and selection of parents during reproduction. Selective pressure (takeover time): relates to the time required to produce a uniform population. It is defined as the speed at which the best solution will occupy the entire population by repeated application of the selection operator alone. High selective pressure reduces population diversity.

Random Selection Each individual has a probability of 1/n_s of being selected, where n_s is the size of the population. No fitness information is used. Random selection has the lowest selective pressure.

Proportional Selection Biases selection towards the most-fit individuals. A probability distribution proportional to the fitness is created, and individuals are selected by sampling the distribution:

ϕ_s(x_i(t)) = f_Υ(x_i(t)) / Σ_{l=1}^{n_s} f_Υ(x_l(t))

n_s is the total number of individuals in the population; ϕ_s(x_i) is the probability that x_i will be selected; f_Υ(x_i) is the scaled fitness of x_i, scaled to produce a positive floating-point value.

Proportional Selection: Sampling Methods Roulette wheel selection, assuming maximization and normalized fitness values (Algorithm 8.2):

Let i = 1, where i denotes the chromosome index;
Calculate ϕ_s(x_i);
sum = ϕ_s(x_i);
Choose r ~ U(0, 1);
while sum < r do
    i = i + 1, i.e. advance to the next chromosome;
    sum = sum + ϕ_s(x_i);
end
Return x_i as the selected individual;

Roulette wheel selection has high selective pressure.
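Algorithm 8.2 can be sketched as follows, normalizing the (assumed positive) fitness values internally rather than requiring them pre-normalized:

```python
import random

def roulette_wheel(population, fitnesses):
    """Sample one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    r = random.uniform(0, 1)
    cum = 0.0
    for individual, f in zip(population, fitnesses):
        cum += f / total           # running sum of phi_s(x_i)
        if cum >= r:
            return individual
    return population[-1]          # guard against floating-point round-off
```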

Proportional Selection: Stochastic Universal Sampling Uses a single random value to sample all of the solutions, by choosing them at evenly spaced intervals. Like roulette wheel sampling, construct a wheel with sections for each of the n_s chromosomes. Instead of one hand that is spun once for each sample, a multi-armed hand is spun just once. The hand has n_s arms, equally spaced. The number of times chromosome i is sampled is the number of arms that fall into the chromosome's section of the wheel.
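A sketch of stochastic universal sampling: one random offset positions all n equally spaced arms at once (here n is a parameter rather than fixed at n_s, an illustrative choice):

```python
import random

def stochastic_universal_sampling(population, fitnesses, n):
    """One spin of an n-armed wheel picks all n samples at evenly
    spaced intervals of the cumulative fitness."""
    total = sum(fitnesses)
    step = total / n
    start = random.uniform(0, step)              # the single random value
    pointers = [start + i * step for i in range(n)]
    chosen, cum, i = [], 0.0, 0
    for p in pointers:
        while cum + fitnesses[i] < p:            # advance to the section
            cum += fitnesses[i]                  # containing this arm
            i += 1
        chosen.append(population[i])
    return chosen
```

Because the arms are evenly spaced, an individual holding a fraction q of the total fitness is sampled either floor(q*n) or ceil(q*n) times, which is the low-variance property that distinguishes SUS from repeated roulette spins.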

Tournament Selection Selects a group of n_ts individuals randomly from the population, where n_ts < n_s, and selects the best individual from this tournament. For crossover with two parents, tournament selection is done twice. Tournament selection limits the chance of the best individual to dominate. Large n_ts versus small n_ts: the selective pressure depends on the value of n_ts.
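Tournament selection is short enough to sketch in full; the tournament size n_ts directly tunes selective pressure (n_ts = 1 degenerates to random selection, n_ts = n_s to picking the best):

```python
import random

def tournament_select(population, fitness, n_ts):
    """Draw n_ts distinct individuals at random; return the fittest."""
    contestants = random.sample(population, n_ts)
    return max(contestants, key=fitness)
```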

Rank-Based Selection Uses the rank ordering of fitness values to determine the probability of selection; selection is thus independent of absolute fitness values. Non-deterministic linear sampling: selects an individual, x_i, such that i ~ U(0, U(0, n_s - 1)). The individuals are sorted in decreasing order of fitness value: rank 0 is the best individual, and rank n_s - 1 is the worst individual.

Rank-Based Selection (cont) Linear ranking: Assumes that the best individual produces λ⁺ offspring and the worst individual λ⁻ offspring, where 1 ≤ λ⁺ ≤ 2 and λ⁻ = 2 - λ⁺. The selection probability of each individual is calculated as

ϕ_s(x_i(t)) = (λ⁻ + (f_r(x_i(t)) / (n_s - 1)) (λ⁺ - λ⁻)) / n_s

where f_r(x_i(t)) is the rank of x_i(t).
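The linear-ranking probabilities can be sketched as follows. Note the rank convention assumed here: f_r runs from 0 (worst) to n_s - 1 (best), so that the best individual receives probability λ⁺/n_s, consistent with the formula above; the function name and default λ⁺ = 1.5 are illustrative.

```python
def linear_rank_probs(n_s, lam_best=1.5):
    """Selection probabilities under linear ranking.

    Rank f_r runs from 0 (worst) to n_s - 1 (best); lam_worst = 2 - lam_best.
    """
    lam_worst = 2.0 - lam_best
    return [(lam_worst + (f_r / (n_s - 1)) * (lam_best - lam_worst)) / n_s
            for f_r in range(n_s)]
```

The probabilities sum to 1 because λ⁻ + λ⁺ = 2 and the ranks average to (n_s - 1)/2.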

Rank-Based Selection (cont) Nonlinear ranking, for example:

ϕ_s(x_i(t)) = (1 - e^{-f_r(x_i(t))}) / β
ϕ_s(x_i) = ν (1 - ν)^{n_s - 1 - f_r(x_i)}

f_r(x_i) is the rank of x_i; β is a normalization constant; ν indicates the probability of selecting the next individual. Rank-based selection operators may use any sampling method to select individuals.

Boltzmann Selection Based on the thermodynamical principles of simulated annealing. Selection probability:

ϕ(x_i(t)) = 1 / (1 + e^{f(x_i(t)) / T(t)})

T(t) is the temperature parameter. An initial large value ensures that all individuals have an equal probability of being selected. As T(t) becomes smaller, selection focuses more on the good individuals. Boltzmann selection can use any sampling method.

Boltzmann Selection (cont) Boltzmann selection can also be used to select between two individuals. For example, if

U(0, 1) > 1 / (1 + e^{(f(x_i(t)) - f(x'_i(t))) / T(t)})

then x_i(t) is selected; otherwise, x'_i(t) is selected.
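A sketch of this pairwise form, assuming maximization (the function name is ours). At large T the threshold approaches 1/2 and the choice is near-random; at small T the fitter individual wins almost surely:

```python
import math
import random

def boltzmann_pick(x_i, x_j, f, T):
    """Pick x_i if a uniform draw exceeds 1/(1 + exp((f(x_i)-f(x_j))/T)),
    otherwise pick x_j."""
    p = 1.0 / (1.0 + math.exp((f(x_i) - f(x_j)) / T))
    return x_i if random.random() > p else x_j
```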

(µ +, λ)-Selection Deterministic rank-based selection methods used in evolution strategies. µ indicates the number of parents, and λ is the number of offspring produced. (µ, λ)-selection selects the best µ of the λ offspring for the next population. (µ + λ)-selection selects the best µ individuals from both the parents and the offspring.

Elitism Ensures that the best individuals of the current population survive to the next generation. The best individuals are copied to the new population without being mutated. The more individuals that survive to the next generation, the less the diversity of the new population.

Hall of Fame Similar to the list of best players of an arcade game. For each generation, the best individual is selected and inserted into the hall of fame. The hall of fame thus contains an archive of the best individuals found since the first generation. It can be used as a parent pool for the crossover operator.

Reproduction Operators Reproduction is the process of producing offspring from selected parents by applying crossover and/or mutation operators. Crossover is the process of creating one or more new individuals through the combination of genetic material randomly selected from two or more parents. Mutation is the process of randomly changing the values of genes in a chromosome; its main objective is to introduce new genetic material into the population, thereby increasing genetic diversity. Mutation probability and step sizes should be small. Should they be proportional to the fitness of the individual? Should the algorithm start with a large mutation probability that is decreased over time?
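For bitstring chromosomes, the two operators can be sketched as one-point crossover and bit-flip mutation (a common choice, assumed here rather than prescribed by the text; names and default p_m are illustrative):

```python
import random

def one_point_crossover(p1, p2):
    """Recombine two parent bitstrings at a random cut point."""
    cut = random.randrange(1, len(p1))   # at least one gene from each parent
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def bit_flip_mutation(chrom, p_m=0.01):
    """Flip each gene independently with small probability p_m."""
    return [g ^ 1 if random.random() < p_m else g for g in chrom]
```

Crossover only recombines genetic material already present in the parents; mutation is what introduces genuinely new material, as the slide notes.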

When to Stop Limit the number of generations, or fitness evaluations. Stop when the population has converged: terminate when no improvement is observed over a number of consecutive generations; terminate when there is no change in the population; terminate when an acceptable solution has been found; terminate when the objective function slope is approximately zero.

EC vs CO How does evolutionary computation (EC) differ from classical optimization (CO)? The search process: CO uses deterministic rules to move from one point in the search space to the next, while EC uses probabilistic transition rules. EC applies a parallel search of the search space, while CO uses a sequential search. Search surface information: CO uses derivative information, usually first-order or second-order, of the search space to guide the path to the optimum; EC uses no derivative information, only fitness information.