TDTS 01 Lecture 8: Optimization Heuristics for Synthesis
Zebo Peng, Embedded Systems Laboratory, IDA, Linköping University


Lecture 8 outline:
- Optimization problems
- Heuristic techniques
- Simulated annealing
- Genetic algorithms

Mathematical Optimization

Many design and synthesis tasks can be formulated as:

    Minimize f(x)
    subject to g_i(x) <= b_i, i = 1, 2, ..., m,

where x is a vector of decision variables (x >= 0), f is the cost (objective) function, and the g_i are a set of constraints. If f and all g_i are linear functions, we have a linear programming (LP) problem. LP can be solved by the simplex algorithm, which is an exact method: it will always identify the optimal solution if one exists.

Combinatorial Optimization (CO)

In many design problems, the decision variables are discrete: the solution is a set, or a sequence, of integers or other discrete objects. For example, a feasible solution for the graph coloring problem (with n nodes and m colors) is represented as x_i = j, where j ∈ {1, 2, ..., m} and i = 1, 2, ..., n. Such problems are called combinatorial optimization problems.
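As a concrete illustration of the LP case, the sketch below solves a small linear program with SciPy; the library choice and the toy coefficients are assumptions made for illustration, not part of the lecture.

    from scipy.optimize import linprog

    # Toy LP: maximize 2*x0 + 3*x1 (written as minimizing the negated objective)
    # subject to x0 + x1 <= 10, x0 + 2*x1 <= 14, x >= 0.
    c = [-2, -3]                      # negated objective coefficients
    A_ub = [[1, 1], [1, 2]]           # constraint coefficients (the g_i)
    b_ub = [10, 14]                   # right-hand sides (the b_i)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)            # optimal point (6, 4) with objective value 24

An exact LP solver like this always returns the true optimum of the continuous problem; the difficulty discussed next arises when the variables must be discrete.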

ILP Formulation of Scheduling

Resource-constrained scheduling is to decide on x_{i,j} such that x_{i,j} equals 1 if operation i is scheduled at step j, and 0 otherwise (i = 0, 1, ..., n-1; j = 1, 2, ..., m). A set of constraints can be formulated:

C1: The start time of each operation should be unique:

    \sum_{j=1}^{m} x_{i,j} = 1

The start time of an operation is consequently given as (i = 0, 1, ..., n-1):

    t_i = \sum_{j=1}^{m} j \cdot x_{i,j}

Constraints for Scheduling

C2: The precedence relations must be preserved:

    \sum_{j=1}^{m} j \cdot x_{p,j} - \sum_{j=1}^{m} j \cdot x_{q,j} \geq d_q,
    \forall p, q \in \{0, 1, ..., n-1\} : (v_q, v_p) \in E

C3: The resource bounds must be met at every schedule step, i.e., the number of operations of type k active at step j must not be larger than a_k:

    \sum_{i : \mathcal{T}(v_i) = k} \; \sum_{p = j - d_i + 1}^{j} x_{i,p} \leq a_k,
    k = 1, 2, ..., n_{res}; j = 1, 2, ..., m

where \mathcal{T}(v_i) gives the resource type of operation v_i and d_i its execution delay.

ILP Formulation

The resource-constrained scheduling problem can then be stated as: minimize c^T t such that the three constraints C1, C2, and C3 are satisfied, where t is the vector whose entries are the start times of the operations, i.e., t = [t_0, t_1, ..., t_n]. If c = [0, 0, ..., 0, 1]^T, this corresponds to minimizing the latency of the schedule, since c^T t = t_n (t_n is the start time of the last operation). If c = [1, 1, ..., 1]^T, it corresponds to finding the earliest start times of all operations.

Features of CO Problems

Most CO problems for digital system design, e.g., system partitioning with constraints, are NP-complete. The time needed to solve an NP-complete problem grows exponentially with the problem size n. For example, to enumerate all feasible solutions of a scheduling problem (all possible permutations), suppose 20 tasks can be handled in 1 hour; then:
- 21 tasks take 20 hours;
- 22 tasks take 17.5 days;
- ...
- 25 tasks take 6 centuries.
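The ILP above can be handed to an off-the-shelf solver. The sketch below does this with PuLP; the solver choice and the tiny 4-operation instance (unit delays, one resource type with capacity 1) are assumptions made up for illustration.

    import pulp

    n, m = 4, 4                               # 4 operations, 4 schedule steps
    d = {0: 1, 1: 1, 2: 1, 3: 1}              # execution delays d_i
    edges = [(0, 2), (1, 2), (2, 3)]          # precedence edges (v_q, v_p) in E

    prob = pulp.LpProblem("rc_scheduling", pulp.LpMinimize)
    x = pulp.LpVariable.dicts("x", (range(n), range(1, m + 1)), cat="Binary")

    # Start times t_i = sum_j j * x_{i,j}
    t = {i: pulp.lpSum(j * x[i][j] for j in range(1, m + 1)) for i in range(n)}

    for i in range(n):                        # C1: unique start time
        prob += pulp.lpSum(x[i][j] for j in range(1, m + 1)) == 1
    for q, p in edges:                        # C2: precedence, t_p - t_q >= d_q
        prob += t[p] - t[q] >= d[q]
    for j in range(1, m + 1):                 # C3: at most 1 operation active per step
        prob += pulp.lpSum(x[i][p] for i in range(n)
                           for p in range(max(1, j - d[i] + 1), j + 1)) <= 1

    prob += t[n - 1]                          # objective: latency (start of last op)
    prob.solve()
    print({i: int(pulp.value(t[i])) for i in range(n)})

For instances of realistic size, however, the solver's implicit enumeration can take exponential time, which motivates the heuristics discussed below.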

NP-Complete Problems

Approaches for solving such problems exactly are usually based on implicit enumeration of the feasible solutions. They require, in the worst case, an exponential number of steps. When the dimensionality n of the problem instance becomes large, as in most real-world applications, no exponential algorithm can be of practical use. Such problems are typically called intractable.

Heuristics

A heuristic seeks near-optimal solutions at a reasonable computational cost, without being able to guarantee either feasibility or optimality. Motivations:
- Many exact algorithms involve a huge amount of computational effort.
- The decision variables frequently have complicated interdependencies.
- We often have nonlinear cost functions and constraints, sometimes not even expressible as mathematical functions. The cost function f can, for example, be defined by a computer program (e.g., for power estimation).

Heuristic Approaches to CO

- Constructive:
  - Problem specific: list scheduling, the left-edge algorithm.
  - Generic methods: divide and conquer, branch and bound.
- Transformational (iterative improvement):
  - The Kernighan-Lin algorithm for graph partitioning.
  - Neighborhood search: simulated annealing, tabu search, genetic algorithms.

Branch-and-Bound

Traverse an implicit tree to find the best leaf (solution).

Example: a 4-city TSP with the symmetric distance matrix

         0   1   2   3
    0    0   3   6  41
    1    3   0  40   5
    2    6  40   0   4
    3   41   5   4   0

The tour {0,1,2,3} (returning to city 0) has total cost 3 + 40 + 4 + 41 = 88.

A lower bound L on the cost function guides the search strategy. Expanding partial tours from city 0:

    {0} L=0
    {0,1} L=3;  {0,2} L=6;  {0,3} L=41
    {0,1,2} L=43;  {0,1,3} L=8;  {0,2,1} L=46;  {0,2,3} L=10;  {0,3,1} L=46;  {0,3,2} L=45
    {0,1,2,3} L=88;  {0,1,3,2} L=18;  {0,2,1,3} L=92;  {0,2,3,1} L=18;  {0,3,1,2} L=92;  {0,3,2,1} L=88

A branch whose lower bound already exceeds the cost of the best complete tour found so far can be pruned without being expanded.
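The following is a minimal branch-and-bound sketch for the 4-city instance above. It uses the accumulated cost of the partial tour as the lower bound, matching the L values in the tree; a tighter bound would prune more.

    import math

    D = [[0, 3, 6, 41],
         [3, 0, 40, 5],
         [6, 40, 0, 4],
         [41, 5, 4, 0]]
    N = len(D)
    best_cost, best_tour = math.inf, None

    def branch(tour, cost):
        """Depth-first expansion of partial tours starting at city 0."""
        global best_cost, best_tour
        if cost >= best_cost:              # bound: prune dominated branches
            return
        if len(tour) == N:                 # leaf: close the tour back to city 0
            total = cost + D[tour[-1]][0]
            if total < best_cost:
                best_cost, best_tour = total, tour
            return
        for city in range(1, N):           # branch: try each unvisited city next
            if city not in tour:
                branch(tour + [city], cost + D[tour[-1]][city])

    branch([0], 0)
    print(best_tour, best_cost)            # [0, 1, 3, 2] with cost 18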

Neighborhood Search Method

Step 1 (Initialization):
  (A) Select a starting solution x_now ∈ X.
  (B) Set x_best = x_now and best_cost = c(x_best).
Step 2 (Choice and termination):
  Choose a solution x_next ∈ N(x_now), the neighborhood of x_now. If no solution can be selected, or the termination criteria apply, the algorithm terminates.
Step 3 (Update):
  Set x_now = x_next. If c(x_now) < best_cost, perform Step 1(B). Go to Step 2.

The Descent Method

The descent method instantiates Step 2 as follows: choose x_next ∈ N(x_now) such that c(x_next) < c(x_now), and terminate if no such x_next can be found. The descent process can easily get stuck at a local optimum.

[Figure: cost plotted over the solution space, showing descent into a local optimum.]
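A minimal sketch of the descent method; how the neighborhood is enumerated (here, a generator supplied by the caller) is an assumption.

    def descent(initial, neighbors, cost):
        """Greedy descent: move to a cheaper neighbor until none exists.

        neighbors(x) should yield the solutions in N(x).
        """
        x_now = initial
        while True:
            improved = False
            for x_next in neighbors(x_now):
                if cost(x_next) < cost(x_now):   # Step 2: accept only downhill moves
                    x_now = x_next               # Step 3: update
                    improved = True
                    break
            if not improved:                     # no downhill neighbor: local optimum
                return x_now

The returned solution is only guaranteed to be locally optimal with respect to N, which is exactly the weakness the next slide addresses.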

Dealing with Local Optimality

- Enlarge the neighborhood.
- Start with different initial solutions.
- Allow "uphill" moves: simulated annealing, tabu search.

[Figure: cost landscape showing how an uphill move can escape the local optimum X.]

Annealing

Annealing is the slow cooling of metallic material after heating:
- A solid material is heated past its melting point.
- Cooling is then done slowly.
- When cooling stops, the material usually settles into a low-energy state (e.g., a stable structure).
In this way, the properties of the material are improved.

The annealing process can be viewed intuitively as follows:
- At high temperature, the atoms are randomly oriented due to their high energy caused by heat.
- When the temperature is reduced, the atoms tend to line up with their neighbors, but different regions may have different orientations.
- If cooling is done slowly, the final frozen state will have a near-minimal energy.

Simulated Annealing

Annealing can be simulated on a computer:
- Generate a random perturbation of the atom orientations and calculate the resulting energy change.
- If the energy has decreased, the system moves to the new state.
- If the energy has increased, the new state is accepted according to the laws of thermodynamics: at temperature t, the probability of an increase in energy of magnitude \Delta E is given by

    p(\Delta E) = e^{-\Delta E / (k \cdot t)}

where k is Boltzmann's constant.

[Figure: the probability p(\Delta E) of accepting a higher-energy state falls off exponentially as \Delta E grows.]
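A minimal numeric check of the acceptance rule; in combinatorial optimization k is usually folded into the temperature, so it is set to 1 here.

    import math

    def accept_probability(delta_e, t, k=1.0):
        """Boltzmann probability of accepting an energy increase delta_e at temperature t."""
        return math.exp(-delta_e / (k * t))

    # The same uphill move becomes ever less likely as the temperature drops:
    for t in (10.0, 1.0, 0.1):
        print(f"t = {t:5.1f}: p(delta_e = 1) = {accept_probability(1.0, t):.4f}")
    # t = 10.0: p = 0.9048;  t = 1.0: p = 0.3679;  t = 0.1: p ~ 4.5e-5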

Simulated Annealing for CO

The SA algorithm can be applied to combinatorial optimization via the following analogy:

    Thermodynamic simulation      Combinatorial optimization
    ------------------------      --------------------------
    System states                 Feasible solutions
    Energy                        Cost
    Change of state               Moving to a neighboring solution
    Temperature                   Control parameter
    Frozen state                  Final solution

SA is thus a modified descent method for neighborhood search: it allows uphill moves in a controlled way. The frequency of accepting uphill moves is governed by a probability function that changes as the algorithm progresses.

The SA Algorithm

    Select an initial solution x_now ∈ X;
    Select an initial temperature t > 0;
    Select a temperature reduction function α;
    Repeat
        Repeat
            Randomly select x_next ∈ N(x_now);
            δ = cost(x_next) - cost(x_now);
            If δ < 0 then
                x_now = x_next
            else
                generate p uniformly at random in the range (0, 1);
                If p < exp(-δ/t) then x_now = x_next;
        Until iteration_count = nrep;
        Set t = α(t);
    Until stopping condition = true.
    Return x_now as the approximation to the optimal solution.

A HW/SW Partitioning Example

[Figure: cost function value versus number of iterations for a HW/SW partitioning run; the cost drops from about 75,000 to below 40,000 over some 1,400 iterations, with the optimum found at iteration 1006.]
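A direct Python rendering of the pseudocode above, with a geometric reduction function α(t) = a·t; the parameter defaults are assumptions, and the best solution seen so far is tracked and returned (the pseudocode returns the current one).

    import math
    import random

    def simulated_annealing(initial, neighbor, cost,
                            t0=100.0, a=0.95, nrep=100, t_min=1e-3):
        """SA skeleton: neighbor(x) returns a random element of N(x)."""
        x_now = initial
        x_best, best_cost = x_now, cost(x_now)
        t = t0
        while t > t_min:                      # stopping condition: near-zero temperature
            for _ in range(nrep):             # nrep iterations at each temperature
                x_next = neighbor(x_now)
                delta = cost(x_next) - cost(x_now)
                # Downhill moves are always taken; uphill ones with probability exp(-delta/t).
                if delta < 0 or random.random() < math.exp(-delta / t):
                    x_now = x_next
                    if cost(x_now) < best_cost:
                        x_best, best_cost = x_now, cost(x_now)
            t = a * t                         # temperature reduction: α(t) = a·t
        return x_best, best_cost

For a HW/SW partitioning run like the one plotted above, neighbor could, for instance, move one randomly chosen task across the HW/SW boundary.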

The Cooling Schedule I

Initial temperature (IT): IT must be "hot" enough to allow an almost free exchange of neighborhood solutions, if the final solution is to be independent of the starting one. The heating process can itself be simulated: the system is first heated rapidly until the proportion of accepted to rejected moves reaches a given value, e.g., until 80% of the moves leading to higher costs are accepted; cooling then starts.

The Cooling Schedule II

Temperature reduction scheme: either a large number of iterations at few temperatures, or a small number of iterations at many temperatures. Typically α(t) = a · t, where a < 1; a should be large, usually between 0.8 and 0.99. For better results, the reduction rate should be slower in the middle temperature ranges.

Stopping conditions:
- Zero temperature (the theoretical requirement).
- A number of iterations or temperatures has passed without any move being accepted.
- A given total number of iterations has been completed (or a fixed amount of execution time has elapsed).
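One simple way to realize the 80%-acceptance heating rule is sketched below; sampling uphill moves around the starting solution and scaling t by a fixed factor are assumptions, not part of the lecture.

    import math

    def calibrate_t0(x0, neighbor, cost, target=0.8, factor=1.5, trials=200):
        """Raise t until uphill moves from x0 would be accepted with probability >= target."""
        c0 = cost(x0)
        deltas = [cost(neighbor(x0)) - c0 for _ in range(trials)]
        uphill = [d for d in deltas if d > 0]
        if not uphill:                 # no uphill moves sampled: any small t will do
            return 1.0
        t = 1.0
        while sum(math.exp(-d / t) for d in uphill) / len(uphill) < target:
            t *= factor                # "heat" the system and test again
        return t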

Problem-Specific Decisions

The neighborhood structure should be defined such that:
- All solutions are reachable from each other.
- It is easy to randomly generate a neighboring feasible solution.
- Infeasible solutions are penalized, if the solution space is strongly constrained.
- The cost difference between s and s' can be calculated efficiently.
- The size of the neighborhood is kept reasonably small.

Many decision parameters must be fine-tuned based on experimentation with typical data.

Genetic Algorithms

Evolution of a population: the current generation is transformed into the next generation by selection and alteration (crossover and mutation).

The Simple Genetic Algorithm

    begin
        t = 0;
        Initialize P(t);
        while termination criterion not reached do
        begin
            Evaluate P(t);
            Select P(t + 1) from P(t);
            Alter P(t + 1);
            t = t + 1;
        end
    end

Evaluation is based on the cost function. For selection, proportionate selection is usually used.
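A sketch of the simple GA for bit-string chromosomes, assuming a fitness function to maximize and an even population size; it relies on the roulette_select, crossover, and mutate helpers sketched after the next two slides.

    import random

    def genetic_algorithm(pop_size, n_bits, fitness, generations=100,
                          cx_rate=0.8, mut_rate=0.01):
        """Simple GA: proportionate selection, then crossover and mutation."""
        pop = [[random.randint(0, 1) for _ in range(n_bits)]
               for _ in range(pop_size)]                     # Initialize P(t)
        for _ in range(generations):
            scores = [fitness(ind) for ind in pop]           # Evaluate P(t)
            mating = [roulette_select(pop, scores)
                      for _ in range(pop_size)]              # Select P(t+1)
            pop = []
            for a, b in zip(mating[::2], mating[1::2]):      # Alter P(t+1)
                if random.random() < cx_rate:
                    a, b = crossover(a, b)
                pop += [mutate(a, mut_rate), mutate(b, mut_rate)]
        return max(pop, key=fitness)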

The Roulette Wheel Selection Scheme

[Figure: a roulette wheel whose sector sizes are proportional to the individuals' fitness values.]

Genetic Operator I

Crossover of chromosomes, controlled by a crossover rate. For example, with the crossover point after the fourth bit:

    Parent 1:  0110 1000101010101111
    Parent 2:  1100 1110100010001011

    Child 1:   1100 1000101010101111
    Child 2:   0110 1110100010001011
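The following sketches roulette wheel selection and single-point crossover for list-of-bits chromosomes; fitness values are assumed non-negative (a minimization cost would first have to be transformed into a fitness).

    import random

    def roulette_select(pop, scores):
        """Pick one individual with probability proportional to its fitness share."""
        r = random.uniform(0, sum(scores))
        acc = 0.0
        for ind, s in zip(pop, scores):
            acc += s
            if acc >= r:
                return ind
        return pop[-1]                 # guard against floating-point round-off

    def crossover(a, b):
        """Single-point crossover; point = 4 reproduces the example above."""
        point = random.randint(1, len(a) - 1)
        return a[:point] + b[point:], b[:point] + a[point:]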

Genetic Operator II

Mutation, controlled by a mutation rate, flips individual bits, for example:

    Before:  01101000101010101111
    After:   01101100101010101011

Survival: a fixed number of the best individuals copy themselves into the next generation.

Control Parameters

- Population size: typically 50-200 (a trade-off between small and large populations).
- Crossover rate: 0.6-0.9.
- Mutation rate: 0.001-0.01.
- Stopping conditions: number of generations, progress, diversity of the population.
- Selection mechanisms.
Parameter settings are chosen through experiments and analysis, and can also be adapted during the run.
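A matching mutation operator, flipping each bit independently with the given rate (a common convention; the lecture does not fix the exact scheme):

    import random

    def mutate(ind, mut_rate=0.01):
        """Flip each bit of the chromosome independently with probability mut_rate."""
        return [bit ^ 1 if random.random() < mut_rate else bit for bit in ind]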

Summary

- Design and synthesis tasks are often optimization problems.
- Due to the complexity of these optimization problems, heuristic algorithms are widely used.
- Many general heuristics are based on neighborhood search principles.
- SA is applicable to almost any combinatorial optimization problem and is very simple to implement.
- Genetic algorithms simulate nature's evolutionary process and are widely applicable.