A Brief Introduction to Multiobjective Optimization Techniques

Università di Catania, Dipartimento di Ingegneria Informatica e delle Telecomunicazioni. A Brief Introduction to Multiobjective Optimization Techniques. Maurizio Palesi [mpalesi@diit.unict.it]

Introduction. Most real-world engineering optimization problems are multiobjective in nature, and the objectives are often conflicting: performance vs. silicon area, quality vs. cost, efficiency vs. portability. The notion of optimum therefore has to be redefined.

Statement of the Problem. Multiobjective optimization (also called multicriteria, multiperformance, or vector optimization): find a vector of decision variables which satisfies the constraints and optimizes a vector function whose elements represent the objective functions. The objectives are usually in conflict with each other, so here "optimize" means finding solutions whose values of all the objective functions are acceptable to the designer.

Mathematical Formulation. Find the vector x* = [x_1*, x_2*, ..., x_n*]^T which satisfies the m inequality constraints g_i(x) ≤ 0, i = 1, 2, ..., m, and the p equality constraints h_i(x) = 0, i = 1, 2, ..., p, and which optimizes the vector function f(x) = [f_1(x), f_2(x), ..., f_k(x)]^T.
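A minimal sketch of how such a problem might be encoded in Python; the two objectives and the single inequality constraint below are made-up examples, not taken from the slides.

```python
import numpy as np

# Hypothetical bi-objective problem (n = 2 variables, k = 2 objectives).
def objectives(x):
    """Vector function f(x) = [f1(x), f2(x)]^T (both to be minimized)."""
    f1 = x[0] ** 2 + x[1] ** 2            # e.g. "cost"
    f2 = (x[0] - 2.0) ** 2 + x[1] ** 2    # e.g. "quality loss"
    return np.array([f1, f2])

def inequality_constraints(x):
    """g_i(x) <= 0 defines feasibility (a single made-up constraint here)."""
    return np.array([x[0] + x[1] - 3.0])

def is_feasible(x):
    return bool(np.all(inequality_constraints(x) <= 0.0))
```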

Feasible Region. The inequality constraints g_i(x) ≤ 0, i = 1, 2, ..., m, and the equality constraints h_i(x) = 0, i = 1, 2, ..., p, define the feasible region F. (Figure: examples of convex and non-convex feasible sets.)

Meaning of Optimum. We rarely have an x* such that f_i(x*) ≤ f_i(x) for all x ∈ F and all i = 1, 2, ..., k. (Figure: a single point (f_1(x*), f_2(x*)) in objective space.) We therefore have to establish a criterion for what is to be considered an optimal solution.

Pareto Optimum. Formulated by Vilfredo Pareto (1848-1923) in the 19th century. A point x* ∈ F is Pareto optimal if, for every x ∈ F, either f_i(x) = f_i(x*) for all i = 1, 2, ..., k, or there is at least one i ∈ {1, 2, ..., k} such that f_i(x) > f_i(x*).

Pareto Optimum. In words, this definition says that x* is Pareto optimal if there exists no feasible vector of decision variables x ∈ F which would decrease some criterion without causing a simultaneous increase in at least one other criterion.

Pareto Optimum. A solution x ∈ F is said to dominate y ∈ F if x is better than or equal to y in all attributes, and strictly better than y in at least one attribute. Formally, x dominates y (written x ≺ y) if f_i(x) ≤ f_i(y) for all i = 1, 2, ..., k, and there exists j ∈ {1, 2, ..., k} such that f_j(x) < f_j(y). The Pareto set consists of the solutions that are not dominated by any other solution.
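As an illustration (assuming minimization of all objectives, not from the slides), a short Python sketch of the dominance test and of extracting the non-dominated solutions from a finite set of objective vectors:

```python
import numpy as np

def dominates(fx, fy):
    """True if objective vector fx dominates fy (minimization assumed)."""
    fx, fy = np.asarray(fx), np.asarray(fy)
    return bool(np.all(fx <= fy) and np.any(fx < fy))

def pareto_set(points):
    """Indices of the non-dominated objective vectors in `points`."""
    points = np.asarray(points, dtype=float)
    return [i for i, fi in enumerate(points)
            if not any(dominates(fj, fi)
                       for j, fj in enumerate(points) if j != i)]

# Example: the vector [2, 3] is dominated by [1, 2] and is filtered out.
print(pareto_set([[1.0, 2.0], [2.0, 3.0], [0.5, 4.0]]))  # -> [0, 2]
```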

Pareto Front. (Figure: the feasible region F in decision space (x_1, x_2) is mapped by f into objective space (f_1, f_2); the lower-left boundary of the image, containing points such as f* = (f_1*, f_2*), is the Pareto front.)

Current State of the Area. Currently there are over 30 mathematical programming techniques for multiobjective optimization. However, these techniques tend to generate elements of the Pareto optimal set one at a time. Additionally, most of them are very sensitive to the shape of the Pareto front (e.g., they do not work when the Pareto front is concave or disconnected).

General Optimization Methods. Optimization methods can be divided into derivative and non-derivative methods. Non-derivative methods are more suitable for general engineering design problems: they do not require any derivatives of the objective function, they are black-box methods, and they are more likely to find a global optimum. Examples: Simulated Annealing, Genetic Algorithms, Random Search, Tabu Search, Complex/Simplex.

Why Evolutionary Algorithms? They are well suited to handling multimodal function landscapes and to identifying multiple optima in a robust manner. They are an adaptive approach, of general application, and do not require detailed knowledge of the problem: a GA can be said to build itself a representation of the problem and solve it. They learn by experience, solving a problem by successive refinement; starting from a set of random solutions, the process of evolution generates a learning process. They are efficient at solving complex problems, and there is growing interest from researchers with various backgrounds in using them for problems of all kinds and levels of complexity. Parallelization: the complexity of the approach often lies in the evaluation of the fitness functions of the individuals in each generation. This procedure can be parallelised quite simply, as individual fitness values can be assigned independently, so concurrent execution of this operation does not cause conflicts; it is easy to run EAs in parallel even on a loosely coupled NOW (Network Of Workstations) without much communication overhead.

Multiobjective Optimization Based on GAs. Naïve approaches: Weighted Sum Approach, Goal Programming, Goal Attainment, the ε-Constraint Method. Non-aggregating approaches that are not Pareto-based: VEGA, Lexicographic Ordering, Game Theory. Pareto-based approaches: SPEA2.

Naïve Approaches. A GA needs scalar fitness information to work, so all the objectives are combined into a single one. This requires accurate scalar information about the range of the objectives (to avoid having one of them dominate the others) and about the behaviour of each objective function, which is a very expensive process. On the other hand, no further interaction with the decision maker is required.

Weighted Sum Approach. The MOO problem is transformed into a scalar optimization problem: min Σ_{i=1..k} w_i f_i(x), where the w_i ≥ 0 are weighting coefficients representing the relative importance of the objectives. It is usually assumed that Σ_{i=1..k} w_i = 1.

Weighted Sum Approach (cont'd). min Σ_{i=1..k} w_i f_i(x). Results can vary significantly as the weighting coefficients change. How to choose these coefficients? Solve the same problem for many different values of w_i; interaction with the decision maker is required. (Figure: different weight vectors pick out different points of the front in the (f_1, f_2) plane.)

Weighted Sum Approach (cont'd). The w_i alone do not reflect the importance of the objectives: all functions should be expressed in units of approximately the same numerical values. The problem then becomes min Σ_{i=1..k} w_i c_i f_i(x); the best results are usually obtained with c_i = 1/f_i^0, where f_i^0 is a reference value of the i-th objective.
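A minimal sketch of this scalarization in Python (not from the slides); the reference values f_ref play the role of the f_i^0 above, and any single-objective optimizer can then be applied to the returned function:

```python
import numpy as np

def weighted_sum(objective_fns, weights, f_ref):
    """Scalarize objectives with normalized weights and c_i = 1 / f_i^0."""
    weights = np.asarray(weights, dtype=float)
    c = 1.0 / np.asarray(f_ref, dtype=float)
    assert np.all(weights >= 0) and np.isclose(weights.sum(), 1.0)

    def scalarized(x):
        f = np.array([fn(x) for fn in objective_fns])
        return float(np.sum(weights * c * f))
    return scalarized

# Usage (hypothetical objectives f1, f2):
# phi = weighted_sum([f1, f2], weights=[0.3, 0.7], f_ref=[10.0, 2.0])
# result = scipy.optimize.minimize(phi, x0)
```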

Weighted Sum Approach (cont'd). It is very efficient (computationally speaking) and generates a strongly non-dominated solution that can be used as an initial solution for other techniques. Problems: solutions quite often appear only in some parts of the Pareto front, while no solutions are obtained in other parts, and the method cannot find solutions on non-convex parts of the Pareto front, since it is a convex combination of objectives in which the sum of all weights is constant and negative weights are not allowed.

Goal Programming. The DM has to assign targets or goals T_i that he/she wishes to achieve for each objective. The method minimizes the absolute deviations of the objectives from these targets: min Σ_{i=1..k} |f_i(x) - T_i|.

Goal Programming (cont'd). min Σ_{i=1..k} |f_i(x) - T_i|. It is very efficient (computationally speaking) because it does not require any non-dominance comparison. Problems: the definition of the goals; it will yield a dominated solution if the goal point is chosen inside the feasible domain; it is not able to deal with non-convex search spaces.
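A minimal sketch of this scalarization (not from the slides); `targets` are the T_i chosen by the DM, and the returned function can be fed to any scalar optimizer:

```python
import numpy as np

def goal_programming(objective_fns, targets):
    """Sum of absolute deviations of the objectives from the DM's targets."""
    targets = np.asarray(targets, dtype=float)

    def scalarized(x):
        f = np.array([fn(x) for fn in objective_fns])
        return float(np.sum(np.abs(f - targets)))
    return scalarized

# phi = goal_programming([f1, f2], targets=[5.0, 1.0])  # hypothetical f1, f2
# result = scipy.optimize.minimize(phi, x0)
```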

Goal Attainment. In addition to the goal vector b, a vector of weights w has to be elicited from the DM. The problem is: min α subject to g_j(x) ≤ 0, j = 1, 2, ..., m, and f_i(x) ≤ b_i + α w_i, i = 1, 2, ..., k.

Goal Attainment (cont'd). (Figure: the search moves along the direction w from the goal point b; the solution (f_1*, f_2*) lies where the ray b + αw meets the Pareto front.) α is not limited in sign: α > 0 means the goal is unattainable, while α < 0 means the goal is attainable and an improved solution will be obtained.

Goal Attainment (cont'd). Misleading selection pressure: x and y may have the same goal attainment value, so for the GA neither of them will be better than the other. (Figure: two objective vectors f(x) and f(y) that satisfy f_i(·) ≤ b_i + α w_i, i = 1, 2, ..., k, with the same α along the ray b + αw.)
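A sketch of the goal attainment formulation using scipy's SLSQP solver on the augmented variable z = [x, α] (not from the slides; the objectives, goal vector b and weights w below are made-up examples):

```python
import numpy as np
from scipy.optimize import minimize

f = [lambda x: x[0] ** 2 + x[1] ** 2,            # hypothetical objectives
     lambda x: (x[0] - 2.0) ** 2 + x[1] ** 2]
b = np.array([0.5, 1.0])                          # goal vector from the DM
w = np.array([1.0, 1.0])                          # weight vector from the DM

def alpha(z):                                     # minimize alpha = z[-1]
    return z[-1]

constraints = [
    # f_i(x) <= b_i + alpha * w_i  <=>  b_i + alpha * w_i - f_i(x) >= 0
    {"type": "ineq", "fun": lambda z, i=i: b[i] + z[-1] * w[i] - f[i](z[:-1])}
    for i in range(len(f))
]

res = minimize(alpha, x0=np.array([0.0, 0.0, 10.0]),
               method="SLSQP", constraints=constraints)
print(res.x[:-1], res.x[-1])   # decision variables and the attained alpha
```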

The ε-Constraint Method. Minimize one (primary) objective function and treat the other objectives as constraints bounded by some allowable levels ε_i: f_r(x*) = min_{x ∈ F} f_r(x), subject to the additional constraints f_i(x) ≤ ε_i for i = 1, 2, ..., k and i ≠ r. Repeat with different values of ε_i.

The ε-Constraint Method (cont'd). Several heuristics exist to set the ε_i, e.g. using lower and upper bounds of the form f_i(x_i*) ≤ ε_i ≤ f_i(x_r*) for i = 1, 2, ..., r-1, r+1, ..., k. Problem: it is a very time-consuming approach.
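A small sketch (not from the slides) of the ε-constraint method for two made-up objectives, where f2 is turned into the constraint f2(x) ≤ ε and the ε value is swept to trace the front one point at a time:

```python
import numpy as np
from scipy.optimize import minimize

f1 = lambda x: x[0] ** 2 + x[1] ** 2              # primary objective
f2 = lambda x: (x[0] - 2.0) ** 2 + x[1] ** 2      # becomes a constraint

def solve_for_eps(eps, x0=np.zeros(2)):
    cons = [{"type": "ineq", "fun": lambda x: eps - f2(x)}]   # f2(x) <= eps
    res = minimize(f1, x0, method="SLSQP", constraints=cons)
    return res.x, (f1(res.x), f2(res.x))

# Each eps value yields (at most) one Pareto-optimal point.
for eps in np.linspace(0.5, 4.0, 8):
    _, objs = solve_for_eps(eps)
    print(f"eps={eps:.2f}  f1={objs[0]:.3f}  f2={objs[1]:.3f}")
```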

Non-Aggregating Approaches that are not Pareto-Based. Born to overcome the difficulties involved in the aggregating approaches. Based on population policies or on a special handling of the objectives.

VEGA (Vector Evaluated Genetic Algorithm). It extended Grefenstette's GENESIS. Selection: for a problem with K objectives, K sub-populations of size N/K are generated; the sub-populations are then shuffled together to obtain a new population of size N. Crossover and mutation operators are applied in the usual way.

VEGA (cont'd). Speciation problem: this technique selects individuals who excel in one dimension (without looking at the other dimensions), which causes the loss of many compromise solutions. It has been proved that shuffling the sub-populations corresponds to averaging the fitness components associated with each objective, i.e. to a linear combination of the objectives whose weights depend on the distribution of the population at each generation.

Lexicographic Ordering. The objectives are ranked in order of importance, and the objective functions are minimized starting from the most important and proceeding according to the assigned order of importance.

Lexicographic Ordering (cont'd). Let f_1(x) and f_k(x) denote the most and the least important objective functions, respectively. The first problem is min f_1(x) subject to g_j(x) ≤ 0, j = 1, 2, ..., m, and its solution x_1* with f_1* = f_1(x_1*) is obtained. Then the second problem is formulated as min f_2(x) subject to g_j(x) ≤ 0, j = 1, 2, ..., m, and f_1(x) = f_1*, and its solution is obtained as x_2* with f_2* = f_2(x_2*). This procedure is repeated until all k objectives have been considered.
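A sketch of this procedure (not from the slides), using scipy's SLSQP and relaxing each equality f_i(x) = f_i* to f_i(x) ≤ f_i* + tol for numerical robustness; the objectives are assumed to be given in decreasing order of importance:

```python
import numpy as np
from scipy.optimize import minimize

def lexicographic(objectives, x0, tol=1e-6):
    """Minimize objectives one at a time, freezing earlier optima."""
    cons, x = [], np.asarray(x0, dtype=float)
    for f in objectives:
        res = minimize(f, x, method="SLSQP", constraints=cons)
        x, f_star = res.x, res.fun
        # later stages must keep f(x) <= f* + tol
        cons = cons + [{"type": "ineq",
                        "fun": lambda x, f=f, fs=f_star: fs + tol - f(x)}]
    return x

# x_best = lexicographic([f1, f2], x0=np.zeros(2))   # hypothetical f1, f2
```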

Lexicographic Ordering (cont'd). Often the population converges to a particular part of the Pareto front.

Use of Game Theory. (Figure: constant f_1 and constant f_2 contours in the (x_1, x_2) plane; the equilibrium point N = (x_1*, x_2*) satisfies f_1(x_1*, x_2*) ≤ f_1(x_1, x_2*) and f_2(x_1*, x_2*) ≤ f_2(x_1*, x_2).) No player can deviate unilaterally from this point to further improve his own criterion.

Pareto-Based Approaches. An individual's fitness is calculated on the basis of Pareto dominance. Dominance rank: the number of individuals by which an individual is dominated. Dominance depth: the population is divided into several fronts, and the depth reflects the front an individual belongs to. Dominance count: the number of individuals dominated by a certain individual. The fitness is thus related to the whole population, in contrast to aggregation-based methods, which calculate an individual's raw fitness value independently of the other individuals.
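As an illustration (minimization assumed, not from the slides), dominance rank and dominance count can be computed from the pairwise dominance matrix of a set of objective vectors:

```python
import numpy as np

def dominance_rank_and_count(F):
    """Rank = # individuals dominating me; count = # individuals I dominate."""
    F = np.asarray(F, dtype=float)
    n = len(F)
    dom = np.array([[np.all(F[i] <= F[j]) and np.any(F[i] < F[j])
                     for j in range(n)] for i in range(n)])
    return dom.sum(axis=0), dom.sum(axis=1)

print(dominance_rank_and_count([[1, 2], [2, 3], [0.5, 4]]))
# -> (array([0, 1, 0]), array([1, 0, 0]))
```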

SPEA2: Strength Pareto Evolutionary Algorithm 2. P_t: population at generation t. A_t: archive (external set) at generation t. Its two key components are fitness assignment and environmental selection.

SPEA2 Fitness Assignment. Each individual i in A_t and P_t is assigned a strength value S(i), the number of solutions it dominates: S(i) = |{ j : j ∈ P_t ∪ A_t ∧ i ≺ j }|.

SPEA2 Fitness Assignment (cont'd). On the basis of the S values, the raw fitness R(i) of an individual i is calculated as R(i) = Σ_{j ∈ P_t ∪ A_t, j ≺ i} S(j). That is, the raw fitness is determined by the strengths of the dominators of i in both archive and population.

SPEA2 Fitness Assignment (cont'd). (Figure: example raw-fitness values in the (f_1, f_2) plane; non-dominated individuals get R(i) = 0, while dominated individuals get positive values such as 2, 5, 9, 12, 14 and 19.)

SPEA2 Fitness Assignment (cont'd). Problems arise when most individuals do not dominate each other, so density information is added to discriminate between individuals having the same raw fitness. The density at any point is a decreasing function of the distance to the k-th nearest data point: D(i) = 1 / (σ_i^k + 2), i ∈ P_t ∪ A_t, where σ_i^k denotes the distance of i to its k-th nearest neighbor. The final fitness is F(i) = R(i) + D(i), i ∈ P_t ∪ A_t.
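A compact sketch of this fitness assignment (not from the slides) for a matrix of objective vectors representing P_t ∪ A_t; the choice k = sqrt(N) is an assumption used here for illustration:

```python
import numpy as np

def spea2_fitness(F, k=None):
    """Return F(i) = R(i) + D(i) for each row (individual) of F."""
    F = np.asarray(F, dtype=float)
    n = len(F)
    k = k if k is not None else int(np.sqrt(n))
    dom = np.array([[np.all(F[i] <= F[j]) and np.any(F[i] < F[j])
                     for j in range(n)] for i in range(n)])
    S = dom.sum(axis=1)                                   # strength S(i)
    R = np.array([S[dom[:, i]].sum() for i in range(n)])  # raw fitness R(i)
    dist = np.linalg.norm(F[:, None, :] - F[None, :, :], axis=-1)
    sigma_k = np.sort(dist, axis=1)[:, k]                 # k-th nearest neighbor
    D = 1.0 / (sigma_k + 2.0)                             # density D(i)
    return R + D

print(spea2_fitness([[1, 2], [2, 3], [0.5, 4], [3, 1]]))
```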

SPEA2 Environmental Selection. Start with A_{t+1} = { i : i ∈ P_t ∪ A_t ∧ F(i) < 1 }, i.e. all non-dominated individuals. If |A_{t+1}| < N, copy the best N - |A_{t+1}| dominated individuals from P_t and A_t into A_{t+1}. If |A_{t+1}| > N, repeatedly remove an individual which has the minimum distance to another individual (truncation).

Truncation Technique. At each step the individual i ∈ A_{t+1} with i ≤_d j for all j ∈ A_{t+1} is removed, where i ≤_d j iff either σ_i^k = σ_j^k for all 0 < k < |A_{t+1}|, or there exists 0 < k < |A_{t+1}| such that σ_i^l = σ_j^l for all 0 < l < k and σ_i^k < σ_j^k. In other words, the individual with the minimum distance to its nearest neighbour is removed, with ties broken by the distance to the second-nearest neighbour, and so on. (Figure: successive removals, numbered 1, 2, 3, from a crowded region of the front.)
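A minimal sketch of this truncation (not from the slides): the sorted vector of distances to all other archive members is compared lexicographically, and the smallest one is dropped until the archive fits:

```python
import numpy as np

def truncate(F, n_keep):
    """Drop objective vectors with the lexicographically smallest
    sorted neighbour-distance profile until n_keep remain."""
    F = [np.asarray(p, dtype=float) for p in F]
    while len(F) > n_keep:
        dist = np.array([[np.linalg.norm(a - b) for b in F] for a in F])
        profiles = [tuple(sorted(row)[1:]) for row in dist]  # drop self-distance
        worst = min(range(len(F)), key=lambda i: profiles[i])
        F.pop(worst)
    return np.array(F)

print(truncate([[0, 1], [0.05, 0.95], [1, 0], [0.5, 0.5]], n_keep=3))
```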

SPEA2 Main Loop. 1. Initialization: generate an initial population P_0 and create the empty archive A_0 = Ø; set t = 0. 2. Fitness assignment: calculate the fitness values of the individuals in P_t and A_t. 3. Environmental selection: copy all nondominated individuals in P_t and A_t to A_{t+1}; if the size of A_{t+1} exceeds N, reduce A_{t+1} by means of the truncation operator, otherwise if the size of A_{t+1} is less than N, fill A_{t+1} with dominated individuals from P_t and A_t. 4. Termination: if t ≥ T or another stopping criterion is satisfied, set A* to the set of decision vectors represented by the nondominated individuals in A_{t+1} and stop. 5. Mating selection: perform binary tournament selection with replacement on A_{t+1} in order to fill the mating pool. 6. Variation: apply recombination and mutation operators to the mating pool and set P_{t+1} to the resulting population; increment the generation counter (t = t + 1) and go to Step 2.

Quality Assessment. Notion of performance: the computational resources needed (number of fitness evaluations, overall run-time), where there is no difference between single- and multi-objective optimization, and the quality of the solutions. Single-objective: the smaller (or larger) the value of the objective function, the better the solution. Multi-objective: ?

Quality Assessment (cont'd). Comparing two solutions x_1 and x_2 is easy: x_1 is better than x_2 if x_1 ≺ x_2. Comparing two sets of solutions is harder: closeness to the optimal Pareto surface? Coverage of a wide range of diverse solutions? Other properties? (Figure: two fronts in the (f_1, f_2) plane.)

Quality Assessment (cont'd). How to best summarize Pareto set approximations by means of a few characteristic numbers? Crucial point: not losing the information one is interested in. Some quality measures: average distance from the Pareto-optimal front, hypervolume measure, diversity, spread, cardinality.

Quality Assessment (cont'd). The quality of a Pareto set approximation cannot be completely described by a (finite) set of distinct criteria such as average distance, diversity and cardinality. (Figure: two pairs of fronts S and T with similar summary values; in one case S is better than T, in the other S is worse than T.)

Binary ε-Quality Measure. Let S and T be two Pareto-set approximations. The binary ε-quality measure is the minimum ε ∈ R such that any b ∈ T is ε-dominated by at least one a ∈ S: I_ε(S, T) = min { ε ∈ R : ∀ b ∈ T, ∃ a ∈ S : a ≺_ε b }. I_ε(S, T) < 1: all solutions in T are dominated by a solution in S. I_ε(S, T) = 1 ∧ I_ε(T, S) = 1: T and S represent the same Pareto front approximation. I_ε(S, T) > 1 ∧ I_ε(T, S) > 1: T and S are incomparable.
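As an illustration (not from the slides), the multiplicative version of this indicator can be computed directly for two finite sets of strictly positive objective vectors (minimization assumed):

```python
import numpy as np

def eps_indicator(S, T):
    """I_eps(S, T): smallest factor by which S must be inflated so that
    every b in T is weakly dominated by some scaled a in S."""
    S, T = np.asarray(S, dtype=float), np.asarray(T, dtype=float)
    return float(max(min(np.max(a / b) for a in S) for b in T))

S = [[1.0, 4.0], [2.0, 2.0], [4.0, 1.0]]
T = [[1.5, 4.5], [3.0, 3.0]]
print(eps_indicator(S, T), eps_indicator(T, S))   # here I(S,T) < 1 < I(T,S)
```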

Reference Architecture (HPL-PD). (Block diagram: L2 unified cache, prefetch cache, prefetch unit, fetch unit, instruction queue, decode and control logic, branch unit, integer unit, floating point unit, load/store unit, predicate/branch/general purpose/floating point/control registers, L1 data cache, L1 instruction cache.)

Pareto Set (G721 encode). (Figure: Pareto set obtained for the G721 encode benchmark, with a 13% annotation.)

Resources. Programming libraries: GALibs (http://lancet.mit.edu/ga/), MOMHLib++ (http://wwwidss.cs.put.poznan.pl/~jaszkiewicz/momhlib/), PISA (http://www.tik.ee.ethz.ch/pisa/). References: EMOO (http://www.lania.mx/~ccoello/emoo/).