Computational Intelligence in Product-line Optimization
1 Computational Intelligence in Product-line Optimization Simulations and Applications Peter Kurz June 2017 Restricted use
8 Simulations and Applications. Peter Kurz, Head of Research & Development Applied Marketing Science, KANTAR TNS, June 2017. Portfolio optimization is an important topic in new product development. In market research, discrete choice models are one of the main techniques for determining the preference for new products. The poster compares particle swarm, genetic, ant colony, simulated annealing and multi-verse optimizers with respect to their performance and their ability to find the optimal portfolio. The results are based on simulated datasets and one real empirical dataset from the German mobile phone market. All datasets have nested attribute structures, which results in high complexity. The number of possible concepts (the search space) varies between 4.2*10^4 and 8.3*10^42 for our simulated datasets and between 2.12 * and 4.57*10^17 for our empirical data.
9 Motivation. Product-line optimization is one of the most important problems in marketing (Green/Krieger 1985). New product development is key to a company's profitability (Hauser 2011). Due to the high pressure of globalization, fast technological development and shorter product lifecycles, the pressure on companies to improve their product lines grows larger and larger (Tsafarakis et al. 2011). In new product development, 65-79% of all new products brought to market fail (Herrmann 2006, Tacke et al. 2014).
10 Aims and Resulting Models. Product line: a bundle of at least two products of the same style with at least one differing feature (one different attribute level). Product-line optimization problems based on discrete choice experiment data are so-called NP-hard problems (Kohli/Krishnamurti 1989). As a consequence, a polynomial-time algorithm for any NP-hard problem would yield polynomial-time algorithms for all problems in NP, which is unlikely, as many of them are considered hard. One normally ends up with millions of possible combinations of attribute levels, so the search space is too large for exhaustive product searches: brute-force techniques usually fail to find the optimum within acceptable time.
11 Maximizing Market Shares
12 Maximizing Market Shares (2). Under the following restrictions: the number of possible product lines, with N the number of all possible products and E the number of products within one product line, is the binomial coefficient C(N, E) = N! / (E! (N-E)!). Example: products consisting of 6 attributes with 5 levels each (N = 5^6 = 15,625) and 5 products per product line give C(15625, 5) ≈ 7.75*10^18 possible lines.
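The size of this search space can be checked directly; a short sketch using the attribute counts from the example above:

```python
from math import comb

# 6 attributes with 5 levels each -> 5**6 = 15,625 distinct products
n_products = 5 ** 6

# A product line picks E = 5 distinct products out of the N candidates,
# i.e. the binomial coefficient C(N, E)
n_lines = comb(n_products, 5)  # about 7.75 * 10^18, matching the slide
```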
13 Selected Algorithms Inspired by Nature: gradient-free optimization algorithms mimic mechanisms observed in nature or use heuristics. Gradient-free methods are not guaranteed to find the true global optimum, but they are able to find many good solutions. Genetic algorithm (Holland 1975): simulates selection (survival of the fittest), recombination (crossover) and mutation (variation), as in evolution. Particle swarm optimization (Kennedy/Eberhart 1995): a stochastic, population-based algorithm built on swarm intelligence (simulating fish or bird swarms). Ant colony optimization (Colorni et al. 1991): motivated by the search for an optimal path in a graph, based on the behavior of ants seeking a path between their colony and a source of food. Simulated annealing (Kirkpatrick et al. 1983): inspired by annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. Multiverse optimizer (Mirjalili et al. 2015): its main inspirations are three concepts from cosmology: white holes, black holes, and wormholes.
14 Genetic Algorithm. A typical genetic algorithm requires: a genetic representation of the solution domain, a fitness function to evaluate solutions, and a termination criterion. Each solution is represented by one chromosome, constructed from genes (attributes and levels). Chromosomes crossbreed with a certain probability, and each chromosome has a certain probability of mutation. The higher a chromosome's objective function value, the higher its probability of reproducing. Balakrishnan/Jacob (1996), Tsafarakis et al. (2011)
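The loop described above can be sketched as follows; the encoding (a product line as a list of attribute-level tuples), the rates and the placeholder fitness function are illustrative assumptions, not the settings used in the study:

```python
import random

N_ATTRS, N_LEVELS, LINE_SIZE = 6, 5, 5  # the example from slide 12

def random_product():
    # one "gene" per attribute: the chosen level index
    return tuple(random.randrange(N_LEVELS) for _ in range(N_ATTRS))

def random_line():
    return [random_product() for _ in range(LINE_SIZE)]

def fitness(line):
    # Placeholder objective; a real study would plug in the simulated
    # market share from the discrete choice model here.
    return sum(sum(p) for p in line)

def crossover(a, b):
    # single-point crossover between two product lines
    cut = random.randrange(1, LINE_SIZE)
    return a[:cut] + b[cut:]

def mutate(line, rate=0.1):
    # each product is replaced with probability `rate`
    return [random_product() if random.random() < rate else p for p in line]

def ga(pop_size=30, generations=50):
    pop = [random_line() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # selection: keep the fittest
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
```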
15 Particle Swarm Optimization. Each agent (particle) represents a design point and moves in n-dimensional space looking for the best solution; each agent represents one solution of the problem. Each agent adjusts its movement according to its own experience and to social interaction. The speed and position of an agent within the solution space determine the quality of its solution. Each particle updates its velocity and position with respect to its own best solution and the best solution of the whole swarm: v_id(t+1) = v_id(t) + c_1 * rand_1 * (p_id - s_id(t)) + c_2 * rand_2 * (p_gd - s_id(t)); s_id(t+1) = s_id(t) + v_id(t+1), where i indexes the particle, d the dimension and t the iteration. Tsafarakis et al. (2011)
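The two update equations translate directly into code; the sphere objective and all parameter values below are illustrative assumptions:

```python
import random

def pso(dim=4, n_particles=20, iters=100, c1=2.0, c2=2.0):
    f = lambda x: sum(v * v for v in x)          # toy objective: minimise the sphere
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # p_id: each particle's own best
    gbest = min(pbest, key=f)[:]                 # p_gd: best of the whole swarm
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # v_id(t+1) = v_id(t) + c1*r1*(p_id - s_id) + c2*r2*(p_gd - s_id)
                vel[i][d] += (c1 * r1 * (pbest[i][d] - pos[i][d])
                              + c2 * r2 * (gbest[d] - pos[i][d]))
                # s_id(t+1) = s_id(t) + v_id(t+1)
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

solution = pso()  # a point with low sphere value
```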
16 Ant Colony Optimization. Ant behavior was the inspiration for the optimization technique: the dynamics of the pheromone trail from the ant colony to the food source, and the way of a single ant through the decision network. (Source: Albritton/McMullen (2007))
17 Ant Colony Optimization (2). With an ACO algorithm, the shortest path in a graph between two points A and B is built from a combination of several partial paths. It is not easy to give a precise definition of which algorithms are or are not ant colonies, because the definition varies between authors and uses. Each solution is represented by an ant moving in the search space. Ants mark the best solutions with pheromone and take previous markings into account to guide their search. They can be seen as probabilistic multi-agent algorithms that use a probability distribution for the transition between iterations. In their versions for combinatorial problems, they construct solutions iteratively. The best solution may eventually be found collectively, even though no single ant traverses it (see graphic).
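A minimal sketch of this path-building behavior, on a hypothetical four-node graph (the edge costs, evaporation rate and deposit rule are illustrative, not the variant used in the study):

```python
import random

# Ants repeatedly build a path from node 0 to node 3, deposit pheromone
# inversely proportional to path length, and later ants prefer edges
# with stronger trails.
edges = {0: [(1, 1.0), (2, 5.0)], 1: [(3, 1.0)], 2: [(3, 1.0)]}
tau = {(u, v): 1.0 for u, nbrs in edges.items() for v, _ in nbrs}

def build_path():
    node, path, length = 0, [0], 0.0
    while node != 3:
        nbrs = edges[node]
        weights = [tau[(node, v)] / w for v, w in nbrs]   # pheromone / cost
        v, w = random.choices(nbrs, weights=weights)[0]
        path.append(v)
        length += w
        node = v
    return path, length

best = None
for _ in range(200):                      # one ant per iteration
    path, length = build_path()
    for u, v in zip(path, path[1:]):      # evaporate, then deposit
        tau[(u, v)] = 0.9 * tau[(u, v)] + 1.0 / length
    if best is None or length < best[1]:
        best = (path, length)
```

After a few hundred ants the trail on the short route 0→1→3 (length 2) dominates the long route 0→2→3 (length 6).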
18 Simulated Annealing. A metaheuristic, probabilistic technique for approximating the global optimum of a given function in a large search space. Simulated annealing maps the minimization of a function of many variables onto the statistical mechanics of equilibration (annealing) of a mathematically equivalent artificial multi-atomic system. The slow cooling implemented in the simulated annealing algorithm is interpreted as a slow decrease in the probability of accepting worse solutions as the solution space is explored. An energy function is minimized over a cooling schedule with temperature values T_k: all level changes that improve the energy function are accepted, while worse solutions are also accepted at random, with probability P(accept) = exp(-ΔE / T_k), where ΔE is the increase in energy. Belloni et al. (2008)
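A minimal sketch of the acceptance rule with a geometric cooling schedule (the toy energy function and all parameter values are illustrative assumptions):

```python
import math
import random

def simulated_annealing(energy, neighbour, x0, t0=10.0, cooling=0.95, steps=500):
    x, t = x0, t0
    best = x
    for _ in range(steps):
        y = neighbour(x)
        delta = energy(y) - energy(x)
        # always accept improvements; accept worse moves with prob exp(-dE/T)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = y
        if energy(x) < energy(best):
            best = x
        t *= cooling                      # slow geometric cooling
    return best

# Example: minimise (x - 7)^2 over the integers with +/-1 moves
best = simulated_annealing(lambda x: (x - 7) ** 2,
                           lambda x: x + random.choice([-1, 1]),
                           x0=50)
```

Early on, high temperature lets the walk escape in any direction; as T shrinks, worse moves are rejected almost surely and the search settles into the minimum.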
19 Multiverse Optimizer (1). The multi-verse theory is well known among physicists. It holds that there was more than one big bang, and that each big bang caused the birth of a universe. The term multi-verse stands in opposition to universe and refers to the existence of other universes in addition to the one we all live in. In multi-verse theory, multiple universes interact and might even collide with each other, and each universe might have different physical laws. Three main concepts of the multi-verse theory are chosen as the inspiration for the MVO algorithm: white holes, black holes, and wormholes.
20 Multiverse Optimizer (2). A white hole has never been seen in our universe, but physicists think that the big bang can be considered a white hole and may be the main component of the birth of a universe. The cyclic model of multi-verse theory argues that big bangs/white holes are created where collisions between parallel universes occur. Black holes, which have been observed frequently, behave in complete contrast to white holes: they attract everything, including light, with their extremely high gravitational force. Wormholes are holes that connect different parts of a universe. In multi-verse theory, wormholes act as time/space travel tunnels through which objects can travel instantly between any corners of a universe (or even from one universe to another). Every universe has an inflation rate that causes its expansion through space. The inflation speed of a universe is very important for forming stars, planets, asteroids, black holes, white holes, wormholes, physical laws, and suitability for life.
21 Multiverse Optimizer (3). During optimization, the following rules are applied to the universes of MVO: 1. The higher the inflation rate, the higher the probability of having a white hole. 2. The higher the inflation rate, the lower the probability of having black holes. 3. Universes with a higher inflation rate tend to send objects through white holes. 4. Universes with a lower inflation rate tend to receive more objects through black holes. 5. The objects in all universes may move randomly towards the best universe via wormholes, regardless of the inflation rate. The two main factors of the MVO algorithm are the wormhole existence probability (WEP) and the travelling distance rate (TDR). WEP defines the probability of a wormhole's existence in universes; it is increased linearly over the iterations in order to emphasize exploitation as the optimization process progresses. TDR defines the distance by which an object can be teleported by a wormhole around the best universe obtained so far. In contrast to WEP, TDR is decreased over the iterations, yielding a more precise exploitation/local search around the best universe. (Source: Mirjalili et al. (2015))
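A hedged sketch of one way to implement these rules: each "universe" is a candidate solution, low cost plays the role of high inflation, and the WEP/TDR schedules follow the shapes described above. The roulette-wheel white-hole selection is simplified, and the toy objective and all parameters are illustrative assumptions, not the study's settings:

```python
import random

def mvo(dim=5, n_universes=20, iters=100, wep_min=0.2, wep_max=1.0, lo=-5, hi=5):
    f = lambda u: sum(x * x for x in u)          # toy objective: the sphere
    universes = [[random.uniform(lo, hi) for _ in range(dim)]
                 for _ in range(n_universes)]
    best = min(universes, key=f)[:]
    for t in range(1, iters + 1):
        wep = wep_min + t * (wep_max - wep_min) / iters  # grows: more wormholes
        tdr = 1 - (t / iters) ** (1 / 6)                 # shrinks: finer local search
        universes.sort(key=f)                            # low cost = high inflation
        for i, u in enumerate(universes):
            for d in range(dim):
                # white/black hole exchange: worse universes (large i) are
                # more likely to receive an object from a better one
                if i > 0 and random.random() < i / n_universes:
                    u[d] = universes[random.randrange(i)][d]
                # wormhole: teleport this variable near the best universe
                if random.random() < wep:
                    step = tdr * ((hi - lo) * random.random() + lo)
                    u[d] = best[d] + step if random.random() < 0.5 else best[d] - step
        cand = min(universes, key=f)
        if f(cand) < f(best):
            best = cand[:]
    return best

best = mvo()
```

The shrinking TDR is what turns the late iterations into a local search around the best universe, mirroring rule 5 above.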
23 Application Example
24 Simulation Study (1). Simulation of part-worth utilities from different discrete choice experiments based on normal distributions. Factors of the experiment: Table: Factors and Levels of the Simulation Study. Generation of a random status-quo market. Number of datasets used for the simulation: 20 * 3 * 3 * 2 * 2 = 720. Each of the CI approaches generates 20 solutions: 720 * 20 = 14,400 comparisons.
25 Simulation Study (2). The table comprises three blocks, each containing 12 datasets (6 with 100 respondents and 6 with 500 respondents). Each dataset was replicated 20 times (runs), and each algorithm provides 20 solutions (iterations). best: the best solution of each algorithm over 20 runs and 20 iterations for each of the 12 datasets, divided by the global optimal solution (known from the simulation). avg: the arithmetic mean of the solutions over 20 runs * 20 iterations for the 12 datasets per block, divided by the number of datasets. time: the average computational time each algorithm needs for one run in one iteration.
26 Simulation Study (3). Results: Simulated annealing (SA) performed best in our simulation study, followed by the multiverse optimizer (MVO) and the genetic algorithm (GA). On our small problems (< 3.8*10^10), all algorithms were able to find the global optimal solution. On our large problems (> 1.9*10^23), ant colony (ACO) and particle swarm (PSO) optimization failed to find acceptable near-optimal solutions. Computational times for ACO and PSO are much larger than for the other algorithms. The fairly simple SA algorithm performed best both in computational time and in finding the global optimal solution.
27 Conjoint Study Mobile Phone Tariff. Study overview: conducted in 2014 by KANTAR TNS/Lightspeed Germany; 3,677 respondents; 24 attributes (94 levels); 2,256 parameters per respondent; 11 attributes for the pre-paid tariff, 12 attributes for the post-paid tariff, 1 ASC (pre-/post-paid); 2,941 respondents prefer pre-paid, 1,295 respondents prefer post-paid. Attribute structure: Pre-paid: Duration, Data Speed, SMS, Data Tariff, Costs (≥ 500 MB), Costs (< 500 MB), Flatrate, Provider, Calls Fixed Line, Anytime Minutes, Calls Off-net. Post-paid: Duration, Data Speed, SMS, Data, Costs (≥ 500 MB), Costs (< 500 MB), Flatrate, Provider, Calls Fixed Line, Anytime Minutes, Calls Off-net, Calls On-net.
28 Structure of the Product Portfolio Optimization. The market has two branches: pre-paid and post-paid. The client searches for the product portfolio with the highest market share; the portfolio should consist of three pre-paid and three post-paid tariff concepts. Which product line maximizes the client's market share? Number of possible solutions: pre-paid market: 2.12 *; post-paid market: 4.57*10^17. Status-quo market: 7 competitors with 3 post-paid and 3 pre-paid tariffs each; the actual market (starting point) consists of 48 concepts. Competitive products were generated by drawing 600 random product concepts and using the 42 best.
29 Results. Table: best in %, avg in % and time in s for GA, PSO, ACO, SA and MVO, for the post-paid and the pre-paid market. Our empirical data show a completely different picture than the simulated data: the complex structure of the choice model seems to stress the algorithms more. The clear winner is the newly developed multiverse optimizer (MVO), which reached the highest market shares over all runs and all iterations. Its disadvantage is a computational time 14 to 15 times longer than that of the fastest algorithm. As we don't know the global optimum in a real study, performance is measured by the highest market share of the client's portfolio.
30 Findings. MVO performed very well over the 20 iterations for each portfolio; the MVO algorithm failed only once (200 MB data volume instead of 300 MB). The four other algorithms show a more diversified picture. Differences between the portfolios relate to two attributes: data volume (200 MB vs. 300 MB or 1 GB) and contract duration (12 months vs. 24 months). Another frequent difference relates to the attribute pre-paid data speed (7.2 Mbit/s instead of 19.4 Mbit/s). The results are largely the same, but we can show that at least four of the five algorithms get stuck at a local optimum. We cannot know for sure that the best algorithm (MVO) really reached the global optimum, but MVO performed best in our computations.
31 Conclusion. Portfolio optimization is a rather complex problem, and we can show that it is hard to decide whether we have really solved the problem or ended up at a local optimum (due to the nature of gradient-free algorithms). Most of the algorithms have tuning parameters that must be set reasonably, and one can never be sure whether an algorithm would perform better with different parameters; in our MVO case it is important to set the wormhole existence probability (WEP) and the travelling distance rate (TDR) correctly to reach good results. Compared to the four other tested algorithms, MVO needs more computational time, which can be a disadvantage, especially on large problems. On our simulated data all five algorithms performed well; on the complex alternative-specific structure of the empirical study, MVO performed better than the other tested algorithms. This could be a result of the wormholes that suddenly appear during the iterations and open a tunnel from one branch to another: MVO thereby gives slightly different concepts a better opportunity to switch between branches of the tree structure. Further research is needed to see whether these findings hold under different conditions of complexity and for different empirical data.
More informationCHAPTER 9 THE ARROW OF TIME
CHAPTER 9 THE ARROW OF TIME In previous chapters we have seen how our views of the nature of time have changed over the years. Up to the beginning of this century people believed in an absolute time. That
More informationInvestigation of Mutation Strategies in Differential Evolution for Solving Global Optimization Problems
Investigation of Mutation Strategies in Differential Evolution for Solving Global Optimization Problems Miguel Leon Ortiz and Ning Xiong Mälardalen University, Västerås, SWEDEN Abstract. Differential evolution
More informationAnalog Computing: a different way to think about building a (quantum) computer
Analog Computing: a different way to think about building a (quantum) computer November 24, 2016 1 What is an analog computer? Most of the computers we have around us today, such as desktops, laptops,
More informationDesign and Analysis of Algorithms
CSE 0, Winter 08 Design and Analysis of Algorithms Lecture 8: Consolidation # (DP, Greed, NP-C, Flow) Class URL: http://vlsicad.ucsd.edu/courses/cse0-w8/ Followup on IGO, Annealing Iterative Global Optimization
More informationCS 781 Lecture 9 March 10, 2011 Topics: Local Search and Optimization Metropolis Algorithm Greedy Optimization Hopfield Networks Max Cut Problem Nash
CS 781 Lecture 9 March 10, 2011 Topics: Local Search and Optimization Metropolis Algorithm Greedy Optimization Hopfield Networks Max Cut Problem Nash Equilibrium Price of Stability Coping With NP-Hardness
More informationChapter 8: Introduction to Evolutionary Computation
Computational Intelligence: Second Edition Contents Some Theories about Evolution Evolution is an optimization process: the aim is to improve the ability of an organism to survive in dynamically changing
More informationGravity Teacher s Guide
Gravity Teacher s Guide 1.0 Summary Gravity is the 9 th and final Dynamica activity to be done before the Post-Test. This activity has not undergone many changes from the last school year. It should take
More informationScaling Up. So far, we have considered methods that systematically explore the full search space, possibly using principled pruning (A* etc.).
Local Search Scaling Up So far, we have considered methods that systematically explore the full search space, possibly using principled pruning (A* etc.). The current best such algorithms (RBFS / SMA*)
More informationIntroduction to Simulated Annealing 22c:145
Introduction to Simulated Annealing 22c:145 Simulated Annealing Motivated by the physical annealing process Material is heated and slowly cooled into a uniform structure Simulated annealing mimics this
More informationFundamentals of Genetic Algorithms
Fundamentals of Genetic Algorithms : AI Course Lecture 39 40, notes, slides www.myreaders.info/, RC Chakraborty, e-mail rcchak@gmail.com, June 01, 2010 www.myreaders.info/html/artificial_intelligence.html
More informationDETECTING THE FAULT FROM SPECTROGRAMS BY USING GENETIC ALGORITHM TECHNIQUES
DETECTING THE FAULT FROM SPECTROGRAMS BY USING GENETIC ALGORITHM TECHNIQUES Amin A. E. 1, El-Geheni A. S. 2, and El-Hawary I. A **. El-Beali R. A. 3 1 Mansoura University, Textile Department 2 Prof. Dr.
More informationGenetic Algorithms and Genetic Programming Lecture 17
Genetic Algorithms and Genetic Programming Lecture 17 Gillian Hayes 28th November 2006 Selection Revisited 1 Selection and Selection Pressure The Killer Instinct Memetic Algorithms Selection and Schemas
More informationMethods for finding optimal configurations
S 2710 oundations of I Lecture 7 Methods for finding optimal configurations Milos Hauskrecht milos@pitt.edu 5329 Sennott Square S 2710 oundations of I Search for the optimal configuration onstrain satisfaction
More information7.1 Basis for Boltzmann machine. 7. Boltzmann machines
7. Boltzmann machines this section we will become acquainted with classical Boltzmann machines which can be seen obsolete being rarely applied in neurocomputing. It is interesting, after all, because is
More informationIntelligens Számítási Módszerek Genetikus algoritmusok, gradiens mentes optimálási módszerek
Intelligens Számítási Módszerek Genetikus algoritmusok, gradiens mentes optimálási módszerek 2005/2006. tanév, II. félév Dr. Kovács Szilveszter E-mail: szkovacs@iit.uni-miskolc.hu Informatikai Intézet
More informationCS 6783 (Applied Algorithms) Lecture 3
CS 6783 (Applied Algorithms) Lecture 3 Antonina Kolokolova January 14, 2013 1 Representative problems: brief overview of the course In this lecture we will look at several problems which, although look
More informationFirefly algorithm in optimization of queueing systems
BULLETIN OF THE POLISH ACADEMY OF SCIENCES TECHNICAL SCIENCES, Vol. 60, No. 2, 2012 DOI: 10.2478/v10175-012-0049-y VARIA Firefly algorithm in optimization of queueing systems J. KWIECIEŃ and B. FILIPOWICZ
More informationDiscrete Evaluation and the Particle Swarm Algorithm.
Abstract Discrete Evaluation and the Particle Swarm Algorithm. Tim Hendtlass and Tom Rodgers, Centre for Intelligent Systems and Complex Processes, Swinburne University of Technology, P. O. Box 218 Hawthorn
More informationSIMU L TED ATED ANNEA L NG ING
SIMULATED ANNEALING Fundamental Concept Motivation by an analogy to the statistical mechanics of annealing in solids. => to coerce a solid (i.e., in a poor, unordered state) into a low energy thermodynamic
More informationGenes and DNA. 1) Natural Selection. 2) Mutations. Darwin knew this
1) Natural Selection The mechanism (driving force) for evolution, as explained by Charles Darwin. Explains changes in an entire species or population (not individuals) over time. 2) Mutations Random changes
More informationAn artificial chemical reaction optimization algorithm for. multiple-choice; knapsack problem.
An artificial chemical reaction optimization algorithm for multiple-choice knapsack problem Tung Khac Truong 1,2, Kenli Li 1, Yuming Xu 1, Aijia Ouyang 1, and Xiaoyong Tang 1 1 College of Information Science
More informationPreface to Presentation
Preface to Presentation I gave a presentation last October about time travel, warp drive, travel to a Goldilocks Planet etc. to provide some possible place to escape a possible dying world I mentioned
More informationDesign and Optimization of Energy Systems Prof. C. Balaji Department of Mechanical Engineering Indian Institute of Technology, Madras
Design and Optimization of Energy Systems Prof. C. Balaji Department of Mechanical Engineering Indian Institute of Technology, Madras Lecture - 09 Newton-Raphson Method Contd We will continue with our
More informationConstraint satisfaction search. Combinatorial optimization search.
CS 1571 Introduction to AI Lecture 8 Constraint satisfaction search. Combinatorial optimization search. Milos Hauskrecht milos@cs.pitt.edu 539 Sennott Square Constraint satisfaction problem (CSP) Objective:
More information9 Markov chain Monte Carlo integration. MCMC
9 Markov chain Monte Carlo integration. MCMC Markov chain Monte Carlo integration, or MCMC, is a term used to cover a broad range of methods for numerically computing probabilities, or for optimization.
More informationTUTORIAL: HYPER-HEURISTICS AND COMPUTATIONAL INTELLIGENCE
TUTORIAL: HYPER-HEURISTICS AND COMPUTATIONAL INTELLIGENCE Nelishia Pillay School of Mathematics, Statistics and Computer Science University of KwaZulu-Natal South Africa TUTORIAL WEBSITE URL: http://titancs.ukzn.ac.za/ssci2015tutorial.aspx
More informationStatistical Computing (36-350)
Statistical Computing (36-350) Lecture 19: Optimization III: Constrained and Stochastic Optimization Cosma Shalizi 30 October 2013 Agenda Constraints and Penalties Constraints and penalties Stochastic
More informationTraffic Signal Control with Swarm Intelligence
009 Fifth International Conference on Natural Computation Traffic Signal Control with Swarm Intelligence David Renfrew, Xiao-Hua Yu Department of Electrical Engineering, California Polytechnic State University
More informationSVAN 2016 Mini Course: Stochastic Convex Optimization Methods in Machine Learning
SVAN 2016 Mini Course: Stochastic Convex Optimization Methods in Machine Learning Mark Schmidt University of British Columbia, May 2016 www.cs.ubc.ca/~schmidtm/svan16 Some images from this lecture are
More informationMonday, November 25, 2013 Reading: Chapter 12 all; Chapter 13 all. There will be class on Wednesday. Astronomy in the news?
Monday, November 25, 2013 Reading: Chapter 12 all; Chapter 13 all There will be class on Wednesday Astronomy in the news? Goal: To understand what the Dark Energy implies for the shape and fate of the
More informationThree Steps toward Tuning the Coordinate Systems in Nature-Inspired Optimization Algorithms
Three Steps toward Tuning the Coordinate Systems in Nature-Inspired Optimization Algorithms Yong Wang and Zhi-Zhong Liu School of Information Science and Engineering Central South University ywang@csu.edu.cn
More informationSAT-Solvers: propositional logic in action
SAT-Solvers: propositional logic in action Russell Impagliazzo, with assistence from Cameron Held October 22, 2013 1 Personal Information A reminder that my office is 4248 CSE, my office hours for CSE
More informationGenetic Algorithm. Outline
Genetic Algorithm 056: 166 Production Systems Shital Shah SPRING 2004 Outline Genetic Algorithm (GA) Applications Search space Step-by-step GA Mechanism Examples GA performance Other GA examples 1 Genetic
More informationWallace Hall Academy
Wallace Hall Academy CfE Higher Physics Unit 1 - Universe Notes Name 1 Newton and Gravity Newton s Thought Experiment Satellite s orbit as an Application of Projectiles Isaac Newton, as well as giving
More information