Local Search & Optimization

Local Search & Optimization. CE417: Introduction to Artificial Intelligence, Sharif University of Technology, Spring 2017 (Soleymani). Based on Artificial Intelligence: A Modern Approach, 3rd Edition, Chapter 4.

Outline
- Local search & optimization algorithms
- Hill-climbing search
- Simulated annealing search
- Local beam search
- Genetic algorithms
- Searching in continuous spaces

Sample problems for local & systematic search
- Path to the goal is important: theorem proving, route finding, 8-puzzle, chess.
- Goal state itself is important: 8-queens, TSP, VLSI layout, job-shop scheduling, automatic program generation.

State-space landscape
Local search algorithms explore the state-space landscape. A solution is a state with the optimal value of the objective function. (Figure: objective value plotted over a 2-D state space.)

Example: n-queens
Put n queens on an n × n board with no two queens on the same row, column, or diagonal. What is the state space? What is the objective function?

Local search: 8-queens problem
- States: 8 queens on the board, one per column (8^8 ≈ 17 million states)
- Successors(s): all states resulting from s by moving a single queen to another square in the same column (8 × 7 = 56 successors)
- Cost function h(s): number of queen pairs that are attacking each other, directly or indirectly
- Global minimum: h(s) = 0
(Figure: a board with h(s) = 17 and the objective values of its 56 successors; the best successors are shown in red.)
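With the one-queen-per-column representation above, the cost function h(s) is easy to compute. A minimal Python sketch (the helper name is mine; state[c] holds the row of the queen in column c, as on the slides):

```python
def attacking_pairs(state):
    """h(s): number of queen pairs attacking each other.

    state[c] is the row (1..n) of the queen in column c; since there is
    one queen per column, only row and diagonal conflicts are possible.
    """
    n = len(state)
    return sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if state[i] == state[j]                # same row
        or abs(state[i] - state[j]) == j - i   # same diagonal
    )

# A valid 8-queens solution has h(s) = 0; eight queens in one row give
# all C(8,2) = 28 pairs attacking.
```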

Hill-climbing search
Each node only contains the state and the value of the objective function in that state (not the path). Search strategy: steepest ascent among the immediate neighbors until reaching a peak. The current node is replaced by the best successor (if it is better than the current node).
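The steepest-ascent loop just described can be sketched generically; the tiny integer objective in the example is an illustrative assumption, not from the slides:

```python
def hill_climb(start, successors, value):
    """Steepest-ascent hill climbing: move to the best successor
    until no successor improves on the current state."""
    current = start
    while True:
        neighbors = successors(current)
        if not neighbors:
            return current
        best = max(neighbors, key=value)
        if value(best) <= value(current):   # reached a peak (or plateau)
            return current
        current = best

# Example: maximize f(x) = -(x - 3)^2 over the integers.
peak = hill_climb(0, lambda x: [x - 1, x + 1], lambda x: -(x - 3) ** 2)
```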

Hill-climbing search is greedy
Greedy local search considers only one step ahead and selects the best successor state (steepest ascent). It makes rapid progress toward a solution, and it is usually quite easy to improve a bad solution. (Figure: the search is optimal only when starting from certain initial states.)

Hill-climbing search problems
- Local maxima: a peak that is not the global maximum
- Plateaus: a flat area (a flat local maximum or a shoulder)
- Ridges: a sequence of local maxima that is very difficult for a greedy algorithm to navigate

Hill-climbing search problem: 8-queens
From a random initial state, hill climbing gets stuck 86% of the time. On average it takes 4 steps when succeeding and 3 steps when getting stuck. (Figure: a five-step descent from h(s) = 17 to h(s) = 1.)

Hill-climbing search problem: TSP
Start with any complete tour and perform pairwise exchanges. Variants of this approach get within 1% of optimal very quickly, even with thousands of cities.

Variants of hill-climbing
These variants try to address the problems of hill-climbing search:
- Sideways moves
- Stochastic hill climbing
- First-choice hill climbing
- Random-restart hill climbing

Sideways move
A plateau may be a shoulder, so keep making sideways moves when there is no uphill move.
Problem: an infinite loop on a flat local maximum.
Solution: an upper bound on the number of consecutive sideways moves.
Result on 8-queens (limit of 100 consecutive sideways moves): 94% success instead of 14%; on average, 21 steps when succeeding and 64 steps when failing.

Stochastic hill climbing
Randomly chooses among the available uphill moves, with probability depending on the steepness of these moves: P(s') is an increasing function of h(s') − h(s).
First-choice hill climbing: generate successors randomly until one better than the current state is found. Good when the number of successors is high.

Random-restart hill climbing
All previous versions are incomplete: they can get stuck on a local maximum.
While no goal state is found, run hill-climbing search from a new random initial state.
If p is the probability of success of each hill-climbing run:
- Expected number of restarts = 1/p
- Expected number of steps = (1/p − 1) × n_f + n_s, where n_f is the average number of steps in a failure and n_s the average number of steps in a success.
Result on 8-queens: p ≈ 0.14, so about 1/p ≈ 7 restarts and (7 − 1) × 3 + 4 = 22 steps; even 3 × 10^6 queens can be solved in under a minute. Also using sideways moves: p ≈ 0.94, so about 1.06 iterations and (1.06 − 1) × 64 + 21 ≈ 25 steps.
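The expected-cost formula above is simple arithmetic and can be checked directly (the function name is mine):

```python
def expected_restart_cost(p, n_f, n_s):
    """Expected total steps of random-restart hill climbing:
    (1/p - 1) failed runs of n_f steps each, plus one success of n_s steps."""
    return (1 / p - 1) * n_f + n_s

# 8-queens without sideways moves: p ~= 1/7, 3 steps per failure, 4 per success.
cost = expected_restart_cost(1 / 7, 3, 4)   # (7 - 1) * 3 + 4 = 22 steps
```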

Effect of landscape shape on hill climbing
The shape of the state-space landscape is important: with few local maxima and plateaus, random-restart hill climbing is quick. The landscape of a real problem is usually unknown a priori. NP-hard problems typically have an exponential number of local maxima; still, a reasonable solution can often be obtained after a small number of restarts.

Simulated Annealing (SA) Search
Hill climbing: move to a better state. Efficient, but incomplete (can get stuck in local maxima).
Random walk: move to a random successor. Asymptotically complete, but extremely inefficient.
Idea: escape local maxima by allowing some "bad" moves but gradually decreasing their frequency. More exploration at the start; hill-climbing moves are gradually selected more frequently.

SA relation to annealing in metallurgy
In the SA method, each state s of the search space is analogous to a state of some physical system, and the objective E(s) to be minimized is analogous to the internal energy of the system. The goal is to bring the system, from an arbitrary initial state, to an equilibrium state with the minimum possible energy.

Pick a random successor of the current state. If it is better than the current state, go to it. Otherwise, accept the transition with a probability that depends on the temperature T(t) = schedule[t] (a decreasing series) and the objective function E(s).

Probability of state transition
For a successor s' of s:
P(s, s', t) = 1 if E(s') > E(s), and e^((E(s') − E(s))/T(t)) otherwise.
The probability of un-optimizing (ΔE = E(s') − E(s) < 0) random movements depends on the badness of the move and on the temperature:
- Badness of the move: worse movements get lower probability.
- Temperature: a high temperature at the start gives bad random moves a higher probability; gradually reducing the temperature makes bad random movements more unlikely, so hill-climbing moves increase.
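Putting the acceptance rule together with a cooling schedule gives the standard SA loop. This sketch maximizes the objective E, matching the transition rule above; the geometric schedule, the stopping threshold, and the toy objective are illustrative assumptions:

```python
import math
import random

def simulated_annealing(start, random_successor, E, schedule, eps=1e-3):
    """Always accept uphill moves; accept downhill moves with
    probability e^(dE / T), where T = schedule(t) decreases over time."""
    current, t = start, 0
    while True:
        T = schedule(t)
        if T < eps:                      # "frozen": stop and return
            return current
        nxt = random_successor(current)
        dE = E(nxt) - E(current)
        if dE > 0 or random.random() < math.exp(dE / T):
            current = nxt
        t += 1

# Example: maximize E(x) = -(x - 3)^2 over the integers.
random.seed(0)
result = simulated_annealing(
    start=12,
    random_successor=lambda x: x + random.choice([-1, 1]),
    E=lambda x: -(x - 3) ** 2,
    schedule=lambda t: 10 * 0.95 ** t,   # geometric cooling
)
```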

SA as a global optimization method
Theoretical guarantee: if T decreases slowly enough, simulated annealing search will converge to a global optimum with probability approaching 1.
Practical? The time required to ensure a significant probability of success will usually exceed the time of a complete search.

Local beam search
Keep track of k states instead of just one (as in hill-climbing and simulated annealing).
Start with k randomly generated states.
Loop: generate all the successors of all k states. If any one is a goal state, stop; else select the k best successors from the complete list and repeat.
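The loop above can be sketched for optimization (returning the best state seen rather than testing an explicit goal); the integer toy problem and the fixed iteration count are illustrative assumptions:

```python
def local_beam_search(starts, successors, value, iters=10):
    """Each round, keep the k best of ALL successors of the current
    k states, tracking the best state seen overall."""
    states = list(starts)
    k = len(states)
    best = max(states, key=value)
    for _ in range(iters):
        pool = [s for st in states for s in successors(st)]
        pool.sort(key=value, reverse=True)
        states = pool[:k]                 # k best of the complete list
        if value(states[0]) > value(best):
            best = states[0]
        # states[0] is the best successor since pool is sorted
    return best

# Example: maximize f(x) = -(x - 3)^2 with a beam of k = 3 states.
top = local_beam_search([0, 10, -5], lambda x: [x - 1, x + 1],
                        lambda x: -(x - 3) ** 2)
```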

Local beam search
Is it different from running hill-climbing with k random restarts in parallel instead of in sequence? Yes: information is passed among the parallel search threads.
Problem: the k states can concentrate in a small region after some iterations.
Solution: stochastic beam search chooses k successors at random, with probability an increasing function of their objective value.

Genetic Algorithms
A variant of stochastic beam search: successors can be generated by combining two parent states rather than by modifying a single state.

Natural Selection
There is competition among living things: more are born than survive and reproduce. Selection determines which individuals enter the adult breeding population, and this selection is done by the environment: those which are best suited, or fitted, reproduce, and they pass these well-suited characteristics on to their young. Reproduction occurs with variation, and this variation is heritable. "Survival of the fittest" means those who have the most offspring that reproduce.

Natural Selection
Variations occur in reproduction and will be preserved in successive generations, approximately in proportion to their effect on reproductive fitness.

Genetic Algorithms: inspiration from natural selection
- State: organism
- Objective value: fitness (the next generation is populated according to its value)
- Successors: offspring

Genetic Algorithm (GA)
A state (solution) is represented as a string over a finite alphabet, like a chromosome containing genes.
Start with k randomly generated states (the population).
An evaluation function (fitness function) evaluates states: higher values for better states.
Combining two parent states produces offspring (crossover); the crossover point can be selected randomly.
Reproduced states can be slightly modified (mutation).
The next generation of states is produced by selection (based on the fitness function), crossover, and mutation.
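The generation loop just described (fitness-proportional selection, one-point crossover, pointwise mutation, plus simple elitism) can be sketched on a toy bit-string problem; the OneMax objective and all parameter values are illustrative assumptions, not from the slides:

```python
import random

def genetic_algorithm(pop_size=20, length=10, generations=50, p_mut=0.05):
    """Maximize the number of 1s in a bit string (OneMax)."""
    fitness = lambda ind: sum(ind)
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = [max(pop, key=fitness)]            # elitism: keep the best
        weights = [fitness(ind) + 1 for ind in pop]  # +1 avoids zero weights
        while len(new_pop) < pop_size:
            # fitness-proportional selection of two parents
            p1, p2 = random.choices(pop, weights=weights, k=2)
            cut = random.randrange(1, length)        # random crossover point
            child = p1[:cut] + p2[cut:]
            # flip each gene independently with small probability
            child = [1 - g if random.random() < p_mut else g for g in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

random.seed(1)
best = genetic_algorithm()
```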

Chromosome & fitness: 8-queens
Describe the individual (or state) as a string, e.g., 2 4 7 4 8 5 5 2.
Fitness function: number of non-attacking pairs of queens (24 for the board shown).

Genetic operators: 8-queens
Crossover: select some part of the state from one parent and the rest from the other.
Parents 6 7 2 4 7 5 8 8 and 7 5 2 5 1 4 4 7 produce the child 6 7 2 5 1 4 4 7.

Genetic operators: 8-queens
Mutation: change a small part of one state with a small probability.
For example, 6 7 2 5 1 4 4 7 becomes 6 7 2 5 1 3 4 7.

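The two operators, applied to the example individuals on the slides, can be written directly (function names are mine; in a real GA the cut point, mutation position, and new gene would be chosen randomly):

```python
def crossover(p1, p2, cut):
    """One-point crossover: head of p1 joined to the tail of p2."""
    return p1[:cut] + p2[cut:]

def mutate(ind, pos, new_gene):
    """Replace the gene at one position with a new value."""
    out = list(ind)
    out[pos] = new_gene
    return out

parent1 = [6, 7, 2, 4, 7, 5, 8, 8]
parent2 = [7, 5, 2, 5, 1, 4, 4, 7]
child = crossover(parent1, parent2, 3)   # [6, 7, 2, 5, 1, 4, 4, 7]
mutant = mutate(child, 5, 3)             # [6, 7, 2, 5, 1, 3, 4, 7]
```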
A genetic algorithm diagram
(Flowchart: Start → Generate initial population → Individual evaluation → Stop criteria? If yes, return the solution; otherwise Selection → Crossover → Mutation, then back to evaluation. Example individuals: 6 7 2 4 7 5 8 8, 3 1 2 8 2 5 6 6, 8 1 4 2 5 3 7 1.)

A variant of genetic algorithm: 8-queens
Fitness function: number of non-attacking pairs of queens (min = 0, max = 8 × 7 / 2 = 28).
Reproduction rate(i) = fitness(i) / Σ_{k=1}^{n} fitness(k), e.g., 24 / (24 + 23 + 20 + 11) ≈ 31%.
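The reproduction rates above are a direct normalization of the fitness values; a sketch using the four values shown (the function name is mine):

```python
def reproduction_rates(fitnesses):
    """Fitness-proportional selection probabilities."""
    total = sum(fitnesses)
    return [f / total for f in fitnesses]

rates = reproduction_rates([24, 23, 20, 11])
# rates[0] = 24 / (24 + 23 + 20 + 11) = 24/78, i.e. about 31%
```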

Genetic algorithm properties
Why does a genetic algorithm usually take large steps in earlier generations and smaller steps later? Initially, population individuals are diverse, and the crossover operation on very different parent states can produce a state a long way from both parents; gradually, more similar individuals appear in the population.
Crossover is a distinctive property of GAs: the ability to combine large blocks of genes that evolved independently. The representation plays an important role in the benefit of incorporating the crossover operator in a GA.

Local search in continuous spaces
There are an infinite number of successor states. E.g., select locations (x_1, y_1), (x_2, y_2), (x_3, y_3) for 3 airports such that the sum of squared distances from each city to its nearest airport is minimized:
F(x_1, y_1, x_2, y_2, x_3, y_3) = Σ_{i=1}^{3} Σ_{c ∈ C_i} [(x_i − x_c)^2 + (y_i − y_c)^2]
where C_i is the set of cities whose nearest airport is airport i.
Approach 1: discretization. Just change each variable by ±δ, giving 6 × 2 = 12 actions for the airport example.
Approach 2: continuous optimization. Solve ∇F = 0 (only for simple cases), or use gradient ascent:
x^{t+1} ← x^t + α ∇f(x^t), where ∇f = (∂f/∂x_1, ∂f/∂x_2, ..., ∂f/∂x_d).

Gradient ascent
x^{t+1} ← x^t + α ∇f(x^t)
Local search problems also arise in continuous spaces; random restarts and simulated annealing can be useful. Higher dimensions raise the rate of getting lost.
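The update rule x^{t+1} ← x^t + α ∇f(x^t) in two dimensions; the quadratic objective, step size, and iteration count are illustrative assumptions:

```python
def gradient_ascent(x, grad, alpha=0.1, steps=200):
    """Repeatedly step in the direction of the gradient."""
    for _ in range(steps):
        g = grad(x)
        x = [xi + alpha * gi for xi, gi in zip(x, g)]
    return x

# Maximize f(x, y) = -(x - 1)^2 - (y + 2)^2;
# its gradient is (-2(x - 1), -2(y + 2)), so the maximum is at (1, -2).
grad_f = lambda p: [-2 * (p[0] - 1), -2 * (p[1] + 2)]
opt = gradient_ascent([0.0, 0.0], grad_f)
```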

Gradient ascent (step size)
Adjusting α in gradient ascent: line search, or the Newton-Raphson update
x^{t+1} ← x^t − H^{-1}(x^t) ∇f(x^t), where H is the Hessian, H_ij = ∂²f/∂x_i ∂x_j.
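For the Newton-Raphson update, the one-dimensional case already shows the point: with f'(x) and f''(x) in place of ∇f and H, a quadratic objective is maximized in a single step. The toy function below is an illustrative assumption:

```python
def newton_step(x, f_prime, f_double_prime):
    """One Newton-Raphson update: x - f'(x) / f''(x),
    the 1-D analogue of x - H^{-1}(x) grad f(x)."""
    return x - f_prime(x) / f_double_prime(x)

# Maximize f(x) = -(x - 3)^2: f'(x) = -2(x - 3), f''(x) = -2.
# From x = 10: 10 - (-14)/(-2) = 3, the maximizer, in one step.
x_star = newton_step(10.0, lambda x: -2 * (x - 3), lambda x: -2.0)
```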

Local search vs. systematic search
- Solution: systematic search finds a path from the initial state to the goal; local search finds the solution state itself.
- Method: systematic search systematically tries different paths from an initial state; local search keeps one or more "current" states and tries to improve them.
- State space: usually incremental (systematic) vs. complete configurations (local).
- Memory: usually very high (systematic) vs. usually very little, constant (local).
- Time: finding optimal solutions in small state spaces (systematic) vs. finding reasonable solutions in large or infinite, e.g. continuous, state spaces (local).
- Scope: search problems (systematic) vs. search & optimization problems (local).