Computationally Expensive Multi-objective Optimization. Juliane Müller
1 Computationally Expensive Multi-objective Optimization. Juliane Müller, Lawrence Berkeley National Lab. IMA Research Collaboration Workshop, University of Minnesota, February 23, 2016. J. Müller (julianemueller@lbl.gov), Surrogate Models and MOP, 1/30
2 Outline: Multi-objective optimization problem
4 Examples of multi-objective optimization. Design optimization: airfoil design (maximize lift, minimize drag). [Figure: airfoil shape parameterized by control points (x1, y1) and (x2, y2) between fixed endpoints s1 = (0, 0) and s2 = (1, 0).] Flight choice: minimize ticket price; minimize number/time of layovers.
6 Multi-objective optimization problem. Minimize f(x) = [f_1(x), f_2(x), ..., f_k(x)]^T (1a), subject to -inf < x_j^l <= x_j <= x_j^u < inf, j = 1, ..., d (1b), where f_i : R^d -> R, i = 1, ..., k, with k >= 2. The f_i are generally in conflict (improving one objective impairs another), so there is no single optimal solution. Goal: find the best trade-off (Pareto-optimal) solutions.
7 Pareto front (objective space), biobjective problem. The points x_A and x_B that correspond to A and B, respectively, are non-dominated. The point x_C that corresponds to C is dominated.
8 Decision vector domination and Pareto optimality. Decision vector x_1 dominates x_2 (written x_1 <= x_2) iff f_i(x_1) <= f_i(x_2) for all i = 1, ..., k and there exists m in {1, ..., k} with f_m(x_1) < f_m(x_2). Pareto-optimal set: P* = {x* : there is no x such that x dominates x*}. Pareto-optimal front: PF* = {f* = [f_1(x*), f_2(x*), ..., f_k(x*)]^T : x* in P*}.
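The domination relation above translates directly into code. A minimal sketch for minimization problems (the function names `dominates` and `nondominated` are my own, not from the talk):

```python
import numpy as np

def dominates(fa, fb):
    """True if objective vector fa dominates fb: fa <= fb in every
    component and fa < fb in at least one component (minimization)."""
    fa, fb = np.asarray(fa), np.asarray(fb)
    return bool(np.all(fa <= fb) and np.any(fa < fb))

def nondominated(F):
    """Indices of the non-dominated rows of the n x k objective matrix F."""
    F = np.asarray(F)
    return [i for i in range(len(F))
            if not any(dominates(F[j], F[i]) for j in range(len(F)) if j != i)]
```

For example, in F = [[1, 4], [2, 2], [3, 3], [4, 1]] the point (3, 3) is dominated by (2, 2), so only indices 0, 1, and 3 remain.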
9 Difficulties of solving problem (1). Evaluating the objective functions is computationally extremely expensive (one evaluation may take several hours). The f_i are computed by black-box simulations: we have no analytic description and no derivative information. Traditional methods for solving (1) generally require thousands of function evaluations.
10 Traditional solution methods using reformulation (for computationally cheap problems). Linear scalarization of (1a): min_x sum_{i=1}^k w_i f_i(x), w_i > 0 (2). Epsilon-constraint method: min_x f_m(x) (3a) s.t. f_i(x) <= eps_i, i != m (3b). Solve to optimality and obtain one Pareto-optimal point. Not efficient for computationally expensive problems.
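The weighted-sum scalarization (2) turns the vector problem into a scalar one; each positive weight vector yields one Pareto-optimal point. A sketch, using a brute-force grid search in place of a real optimizer (the toy objectives f1 = x^2 and f2 = (x - 2)^2 are my own example, not from the talk):

```python
import numpy as np

def weighted_sum_point(objectives, weights, grid):
    """Minimize sum_i w_i f_i(x) over a candidate grid (a crude stand-in
    for a real optimizer); with all w_i > 0 the minimizer is Pareto-optimal."""
    vals = [sum(w * f(x) for w, f in zip(weights, objectives)) for x in grid]
    return grid[int(np.argmin(vals))]

# Toy biobjective problem: f1 = x^2 (min at 0), f2 = (x - 2)^2 (min at 2).
f1 = lambda x: x ** 2
f2 = lambda x: (x - 2.0) ** 2
grid = np.linspace(-5.0, 5.0, 1001)
x_star = weighted_sum_point([f1, f2], [0.5, 0.5], grid)  # near x = 1
```

With equal weights the scalarized objective 0.5 x^2 + 0.5 (x - 2)^2 is minimized at x = 1, a compromise between the two individual minimizers.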
11 Goal of this research. Devise an efficient algorithm that finds a good approximation of the Pareto front using only a few hundred function evaluations.
13 Surrogate models (single-objective idea applied per objective). Use surrogate models to approximate the expensive objective functions f_1, ..., f_k: f_i(x) = s_i(x) + e_i(x), i = 1, ..., k, where f_i(x) is the ith computationally expensive objective function, s_i(x) is the ith surrogate model output (cheap to evaluate), and e_i(x) is the difference between the two.
14 Need some initial data to fit a surrogate model: S = {x_1, ..., x_n} with evaluations f_i(x_1), ..., f_i(x_n), i = 1, ..., k. Fit a surrogate model s_i to the data pairs (x_l, f_i(x_l)), l = 1, ..., n, for each i = 1, ..., k. [Figure: surrogate models s_1(x) and s_2(x) fitted through sample data of f_1(x) and f_2(x).]
17 Radial basis function (RBF) surrogate model: s(x) = sum_{l=1}^n lambda_l phi(||x - x_l||_2) + p(x) (4), where phi(r) = r^3 is the cubic radial basis function, p(x) = a + b^T x is the polynomial tail, and x_l, l = 1, ..., n, are the already evaluated points. Determine the parameters lambda_l in R, l = 1, ..., n, a in R, and b = [b_1, ..., b_d]^T in R^d by solving a linear system of equations.
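The parameters in (4) come from the standard RBF interpolation system with polynomial-tail constraints, [Phi P; P^T 0][lambda; c] = [y; 0]. A self-contained sketch (my own implementation of this standard construction, not the author's code):

```python
import numpy as np

def fit_cubic_rbf(X, y):
    """Fit s(x) = sum_l lambda_l ||x - x_l||^3 + a + b^T x by solving
    the interpolation system with the linear-tail side constraints."""
    n, d = X.shape
    # Pairwise-distance matrix, cubed: Phi[l, m] = ||x_l - x_m||^3
    Phi = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2) ** 3
    P = np.hstack([np.ones((n, 1)), X])                  # tail basis [1, x]
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    coef = np.linalg.solve(A, np.concatenate([y, np.zeros(d + 1)]))
    lam, tail = coef[:n], coef[n:]
    def s(x):
        x = np.asarray(x, dtype=float)
        r = np.linalg.norm(X - x, axis=1)
        return float(lam @ r ** 3 + tail[0] + tail[1:] @ x)
    return s

# Interpolation check on six 1-D sample points of sin(3x):
X = np.linspace(0.0, 1.0, 6).reshape(-1, 1)
s = fit_cubic_rbf(X, np.sin(3.0 * X[:, 0]))
```

The fitted surrogate interpolates: at every already-evaluated point x_l, s(x_l) reproduces the data value up to round-off.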
19 Overview of sampling strategies. Algorithm steps: 1: Initial experimental design, x_1, ..., x_{n0}. 2: Computationally expensive objective function evaluations, f_i(x_1), ..., f_i(x_{n0}), i = 1, ..., k. 3: Identify the non-dominated points. 4: Fit the RBF surrogate models to the data. 5: Select the new sample point x_new, balancing local and global search. 6: Evaluate f_i(x_new), i = 1, ..., k, and go to Step 3.
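The six algorithm steps can be sketched as a loop. In this skeleton, Step 5 is replaced by plain random search purely to keep the code short; it stands in for the surrogate-based selection strategies described on the following slides:

```python
import numpy as np

def pareto_mask(F):
    """Boolean mask of the non-dominated rows of F (minimization)."""
    F = np.asarray(F)
    return np.array([not any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i])
                             for j in range(len(F)) if j != i)
                     for i in range(len(F))])

def surrogate_loop(objectives, lb, ub, n0=8, budget=20, seed=0):
    """Skeleton of the iterative algorithm; Step 5 is a placeholder."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    X = lb + (ub - lb) * rng.random((n0, len(lb)))         # Step 1: initial design
    F = np.array([[f(x) for f in objectives] for x in X])  # Step 2: expensive evals
    while len(X) < budget:
        # Steps 4-5 would fit RBF surrogates and pick x_new from them;
        # random search stands in here (assumption for brevity).
        x_new = lb + (ub - lb) * rng.random(len(lb))
        X = np.vstack([X, x_new])                          # Step 6: evaluate, repeat
        F = np.vstack([F, [f(x_new) for f in objectives]])
    m = pareto_mask(F)                                     # Step 3: non-dominated
    return X[m], F[m]
```

Running it on a toy biobjective problem returns the non-dominated subset of all evaluated points, which approximates the Pareto front within the evaluation budget.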
21 Sampling strategies: sampling based on decision and objective space.
22 Target value sampling (objective space). Approximate the Pareto front with a piecewise linear function. Find the point x for which f_1(x) = 2 and f_2(x) = 20.
23 Target value sampling (decision space). Target values (t_1, t_2) = (2, 20). Find the point x for which f_1(x) = 2 and f_2(x) = 20: use the computationally cheap surrogate models s_1 and s_2 and solve a computationally cheap multi-objective optimization problem: min_x [ |s_1(x) - t_1|, |s_2(x) - t_2| ]^T (5a), subject to -inf < x_j^l <= x_j <= x_j^u < inf, j = 1, ..., d (5b).
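One cheap way to attack problem (5) is to scalarize the two deviations, e.g. in the Chebyshev (max) sense, and search over random candidates. This is a sketch of the idea, not the solver used in the talk:

```python
import numpy as np

def target_value_point(s1, s2, t1, t2, lb, ub, n_cand=2000, seed=1):
    """Among random candidates in the box, pick the x whose surrogate
    predictions are closest to the targets (t1, t2) in the Chebyshev
    sense, i.e. minimize max(|s1(x) - t1|, |s2(x) - t2|)."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    X = lb + (ub - lb) * rng.random((n_cand, len(lb)))
    dev = [max(abs(s1(x) - t1), abs(s2(x) - t2)) for x in X]
    return X[int(np.argmin(dev))]

# With identity 'surrogates', the selected point should land near the target:
x_sel = target_value_point(lambda x: x[0], lambda x: x[1], 0.3, 0.7,
                           [0.0, 0.0], [1.0, 1.0])
```

Because the surrogates are cheap, thousands of candidate evaluations cost essentially nothing compared to one expensive simulation.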
24 Perturbation of non-dominated points. [Figure: decision space showing sample points, non-dominated points, and perturbation points generated around them.]
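Perturbing the non-dominated points generates local-search candidates near the current front approximation. A minimal sketch; the perturbation width `sigma` and count `n_per_point` are illustrative choices, not values from the talk:

```python
import numpy as np

def perturb_nondominated(X_nd, lb, ub, sigma=0.05, n_per_point=4, seed=2):
    """Candidate points from small Gaussian perturbations of each
    non-dominated point, clipped back into the variable box."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    out = []
    for x in np.atleast_2d(np.asarray(X_nd, float)):
        step = sigma * (ub - lb) * rng.standard_normal((n_per_point, len(x)))
        out.append(np.clip(x + step, lb, ub))        # stay inside the bounds
    return np.vstack(out)

cands = perturb_nondominated([[0.1, 0.9], [0.5, 0.5]], [0.0, 0.0], [1.0, 1.0])
```

Each non-dominated point spawns `n_per_point` nearby candidates, all guaranteed feasible with respect to the box constraints.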
25 Minimum point of each surrogate model. Find the minimum point of each surrogate model (computationally cheap): for each i, min_x s_i(x) s.t. x_j^l <= x_j <= x_j^u, j = 1, ..., d. [Figure: minima of the surrogates s_1(x) and s_2(x) fitted to sample data of f_1(x) and f_2(x).]
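Since each surrogate is cheap, its box-constrained minimum can be approximated by any standard optimizer; here random search stands in for one (a sketch, with toy quadratic "surrogates" of my own):

```python
import numpy as np

def surrogate_minimizers(surrogates, lb, ub, n_cand=4000, seed=3):
    """Approximate argmin_x s_i(x) over the box for each cheap surrogate
    s_i by evaluating a common set of random candidates."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    X = lb + (ub - lb) * rng.random((n_cand, len(lb)))
    return [X[int(np.argmin([s(x) for x in X]))] for s in surrogates]

# Two toy 1-D 'surrogates' with minima at 0.2 and 0.8:
m1, m2 = surrogate_minimizers([lambda x: (x[0] - 0.2) ** 2,
                               lambda x: (x[0] - 0.8) ** 2], [0.0], [1.0])
```

The per-surrogate minimizers are natural sample candidates: each one targets the best value of a single objective.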
26 Stochastic sampling and scoring. Randomly generate points in the variable domain. Score each random point and select the best one: use the surrogate models s_1, ..., s_k to predict the objective function values (surrogate model score); compute the distance of each random point to the set of already evaluated points (distance score); compute a weighted sum of both scores.
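The two scores pull in opposite directions: a low surrogate prediction favors exploitation, a large distance to evaluated points favors exploration. A single-surrogate sketch of the weighted scoring (the weights 0.7/0.3 are illustrative, not the talk's values):

```python
import numpy as np

def best_scored_candidate(cands, surrogate, X_eval, w_s=0.7, w_d=0.3):
    """Weighted sum of a scaled surrogate score and a scaled distance
    score; low prediction and large minimum distance are both good,
    so the selected candidate minimizes the combined score."""
    cands, X_eval = np.atleast_2d(cands), np.atleast_2d(X_eval)
    sv = np.array([surrogate(x) for x in cands], dtype=float)
    dv = np.array([np.min(np.linalg.norm(X_eval - x, axis=1)) for x in cands])
    def unit(v):                      # scale to [0, 1]; constant -> all zeros
        span = v.max() - v.min()
        return (v - v.min()) / span if span > 0 else np.zeros_like(v)
    score = w_s * unit(sv) + w_d * (1.0 - unit(dv))   # small is good
    return cands[int(np.argmin(score))]

x_best = best_scored_candidate([[0.0], [1.0], [2.0]],
                               lambda x: x[0] ** 2, [[3.0]])
```

Shifting weight toward w_d pushes the search into unexplored regions; shifting toward w_s concentrates it where the surrogates predict good objective values.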
27 Solve the surrogate multi-objective problem. Use a multi-objective genetic algorithm to find the Pareto-optimal points of the approximation problem: Minimize s(x) = [s_1(x), s_2(x), ..., s_k(x)]^T (6a), subject to -inf < x_j^l <= x_j <= x_j^u < inf, j = 1, ..., d (6b). May generate a large set of new sample points.
29 Example with d = 2, k = 2: min f(x) = [f_1(x), f_2(x)]^T (7a), with f_1(x_1, x_2) = 1 - exp( - sum_{j=1}^2 (x_j - 1/sqrt(2))^2 ) (7b), f_2(x_1, x_2) = 1 - exp( - sum_{j=1}^2 (x_j + 1/sqrt(2))^2 ) (7c), and -4 <= x_1, x_2 <= 4 (7d).
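Assuming the test problem is the two-variable Fonseca-Fleming function (which matches the form of (7b)-(7c) and the bounds -4 <= x_1, x_2 <= 4 on the slide), the objectives can be written as:

```python
import numpy as np

def f1(x):
    """f1 = 1 - exp(-sum_j (x_j - 1/sqrt(2))^2); minimum 0 at x_j = 1/sqrt(2)."""
    return 1.0 - np.exp(-np.sum((np.asarray(x, float) - 1.0 / np.sqrt(2.0)) ** 2))

def f2(x):
    """f2 = 1 - exp(-sum_j (x_j + 1/sqrt(2))^2); minimum 0 at x_j = -1/sqrt(2)."""
    return 1.0 - np.exp(-np.sum((np.asarray(x, float) + 1.0 / np.sqrt(2.0)) ** 2))

c = 1.0 / np.sqrt(2.0)  # individual minimizers sit at (c, c) and (-c, -c)
```

Both objectives take values in [0, 1), and their individual minimizers are distinct, so the two objectives conflict everywhere between them; that is what produces a non-trivial Pareto front.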
30 Example with d = 2, k = 2. [Figure: contour plots of f_1 and f_2 over the (x_1, x_2) domain.]
31 Example with d = 2, k = 2. [Animation over several slides: objective-space plots (minimize f_1 vs. minimize f_2) showing the non-dominated front filling in as the algorithm iterates.]
43 Comparison to a multi-objective genetic algorithm (MOGA). We compared the algorithm on 58 benchmark problems, one application from airfoil design, and one application from structural optimization; 1-35 variables, 2-10 objective functions. Pareto fronts: convex connected, concave connected, disconnected, unknown.
44 Performance measures: (a) number of non-dominated solutions; (b) set coverage metric: the proportion of solutions from MOGA that are weakly dominated by solutions from SOCEMO, and vice versa; (c) hypervolume: the size of the dominated objective space covered.
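For two objectives, the hypervolume in measure (c) reduces to a sum of rectangle areas between the front and a reference point. A minimal sketch of this standard computation (my own code, not from the talk):

```python
import numpy as np

def hypervolume_2d(F, ref):
    """Area of objective space dominated by the front F (minimization)
    and bounded by the reference point ref: filter to the non-dominated
    points, sort by f1, and accumulate the stacked rectangles."""
    F = np.asarray(F, dtype=float)
    keep = [i for i in range(len(F))
            if not any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i])
                       for j in range(len(F)) if j != i)]
    nd = F[keep][np.argsort(F[keep][:, 0])]   # ascending f1, descending f2
    hv, top = 0.0, ref[1]
    for a, b in nd:
        hv += (ref[0] - a) * (top - b)        # rectangle for this point
        top = b
    return hv
```

For the front {(1, 4), (2, 2), (4, 1)} with reference point (5, 5), the three rectangles have areas 4, 6, and 1, so the hypervolume is 11; larger values indicate a front that covers more of the objective space.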
45 Results: larger numbers are better for all three measures. [Figure: boxplots comparing SOCEMO and MOGA on (a) number of non-dominated solutions, (b) set coverage metric, (c) hypervolume.]
46 Conclusions. We developed an algorithm (SOCEMO) for computationally expensive black-box multi-objective optimization problems. We use computationally cheap surrogate models to approximate the expensive objective functions, together with a combination of local and global search strategies. Numerical experiments showed that SOCEMO is an efficient and effective approach.
4th Interplanetary CubeSat Workshop Uncertainty-based multidisciplinary design optimization of lunar CubeSat missions Xingzhi Hu 3 rd year PhD xh269@cam.ac.uk, huxingzhi@nudt.edu.cn Supervisor: Prof. Geoffrey
More informationIntroduction to Black-Box Optimization in Continuous Search Spaces. Definitions, Examples, Difficulties
1 Introduction to Black-Box Optimization in Continuous Search Spaces Definitions, Examples, Difficulties Tutorial: Evolution Strategies and CMA-ES (Covariance Matrix Adaptation) Anne Auger & Nikolaus Hansen
More informationInter-Relationship Based Selection for Decomposition Multiobjective Optimization
Inter-Relationship Based Selection for Decomposition Multiobjective Optimization Ke Li, Sam Kwong, Qingfu Zhang, and Kalyanmoy Deb Department of Electrical and Computer Engineering Michigan State University,
More informationEvolutionary Multiobjective. Optimization Methods for the Shape Design of Industrial Electromagnetic Devices. P. Di Barba, University of Pavia, Italy
Evolutionary Multiobjective Optimization Methods for the Shape Design of Industrial Electromagnetic Devices P. Di Barba, University of Pavia, Italy INTRODUCTION Evolutionary Multiobjective Optimization
More informationTerminology Suppose we have N observations {x(n)} N 1. Estimators as Random Variables. {x(n)} N 1
Estimation Theory Overview Properties Bias, Variance, and Mean Square Error Cramér-Rao lower bound Maximum likelihood Consistency Confidence intervals Properties of the mean estimator Properties of the
More informationStat542 (F11) Statistical Learning. First consider the scenario where the two classes of points are separable.
Linear SVM (separable case) First consider the scenario where the two classes of points are separable. It s desirable to have the width (called margin) between the two dashed lines to be large, i.e., have
More informationSupport Vector Machines and Kernel Methods
2018 CS420 Machine Learning, Lecture 3 Hangout from Prof. Andrew Ng. http://cs229.stanford.edu/notes/cs229-notes3.pdf Support Vector Machines and Kernel Methods Weinan Zhang Shanghai Jiao Tong University
More informationDecoding Revisited: Easy-Part-First & MERT. February 26, 2015
Decoding Revisited: Easy-Part-First & MERT February 26, 2015 Translating the Easy Part First? the tourism initiative addresses this for the first time the die tm:-0.19,lm:-0.4, d:0, all:-0.65 tourism touristische
More informationMidterm Review CS 6375: Machine Learning. Vibhav Gogate The University of Texas at Dallas
Midterm Review CS 6375: Machine Learning Vibhav Gogate The University of Texas at Dallas Machine Learning Supervised Learning Unsupervised Learning Reinforcement Learning Parametric Y Continuous Non-parametric
More informationMATH ASSIGNMENT 07 SOLUTIONS. 8.1 Following is census data showing the population of the US between 1900 and 2000:
MATH4414.01 ASSIGNMENT 07 SOLUTIONS 8.1 Following is census data showing the population of the US between 1900 and 2000: Years after 1900 Population in millions 0 76.0 20 105.7 40 131.7 60 179.3 80 226.5
More informationQ 0 x if x 0 x x 1. S 1 x if x 1 x x 2. i 0,1,...,n 1, and L x L n 1 x if x n 1 x x n
. - Piecewise Linear-Quadratic Interpolation Piecewise-polynomial Approximation: Problem: Givenn pairs of data points x i, y i, i,,...,n, find a piecewise-polynomial Sx S x if x x x Sx S x if x x x 2 :
More informationA Polynomial-time Nash Equilibrium Algorithm for Repeated Games
A Polynomial-time Nash Equilibrium Algorithm for Repeated Games Michael L. Littman mlittman@cs.rutgers.edu Rutgers University Peter Stone pstone@cs.utexas.edu The University of Texas at Austin Main Result
More informationModeling and Predicting Chaotic Time Series
Chapter 14 Modeling and Predicting Chaotic Time Series To understand the behavior of a dynamical system in terms of some meaningful parameters we seek the appropriate mathematical model that captures the
More informationChoked By Red Tape? The Political Economy of Wasteful Trade Barriers
Choked By Red Tape? The Political Economy of Wasteful Trade Barriers Giovanni Maggi, Monika Mrázová and J. Peter Neary Yale, Geneva and Oxford Research Workshop on the Economics of International Trade
More informationRobust Integer Programming
Robust Integer Programming Technion Israel Institute of Technology Outline. Overview our theory of Graver bases for integer programming (with Berstein, De Loera, Hemmecke, Lee, Romanchuk, Rothblum, Weismantel)
More informationDover- Sherborn High School Mathematics Curriculum Probability and Statistics
Mathematics Curriculum A. DESCRIPTION This is a full year courses designed to introduce students to the basic elements of statistics and probability. Emphasis is placed on understanding terminology and
More informationMulticriteria Optimization and Decision Making
Multicriteria Optimization and Decision Making Principles, Algorithms and Case Studies Michael Emmerich and André Deutz LIACS Master Course: Autumn/Winter 2014/2015 Contents 1 Introduction 5 1.1 Viewing
More informationLocal search algorithms. Chapter 4, Sections 3 4 1
Local search algorithms Chapter 4, Sections 3 4 Chapter 4, Sections 3 4 1 Outline Hill-climbing Simulated annealing Genetic algorithms (briefly) Local search in continuous spaces (very briefly) Chapter
More informationFrederick Sander. Variable Grouping Mechanisms and Transfer Strategies in Multi-objective Optimisation
Frederick Sander Variable Grouping Mechanisms and Transfer Strategies in Multi-objective Optimisation Intelligent Cooperative Systems Master Thesis Variable Grouping Mechanisms and Transfer Strategies
More informationWeb-based Supplementary Materials for A Robust Method for Estimating. Optimal Treatment Regimes
Biometrics 000, 000 000 DOI: 000 000 0000 Web-based Supplementary Materials for A Robust Method for Estimating Optimal Treatment Regimes Baqun Zhang, Anastasios A. Tsiatis, Eric B. Laber, and Marie Davidian
More informationSwitched systems: stability
Switched systems: stability OUTLINE Switched Systems Stability of Switched Systems OUTLINE Switched Systems Stability of Switched Systems a family of systems SWITCHED SYSTEMS SWITCHED SYSTEMS a family
More informationx 2 i 10 cos(2πx i ). i=1
CHAPTER 2 Optimization Written Exercises 2.1 Consider the problem min, where = 4 + 4 x 2 i 1 cos(2πx i ). i=1 Note that is the Rastrigin function see Section C.1.11. a) What are the independent variables
More information