Tailoring Mutation to Landscape Properties

William G. Macready*
Bios Group L.P., 317 Paseo de Peralta, Santa Fe, NM

Abstract. We present numerical results on Kauffman's NK landscape family indicating that the optimal distance at which to search for fitter variants depends on both the current fitness and the sampling that can be afforded at each distance. The optimal search distance from average-fitness configurations is large, to escape local correlation limits, and decreases as fitness increases. An analytic derivation of the optimal search distance as a function of the landscape correlation ρ, the current fitness f, and the number of samples n is obtained by introducing a new landscape family, ρ-landscapes. The utility of ρ-landscapes is demonstrated by determining a few of their simple properties.

1 Introduction

Evolutionary metaphors have re-invigorated research into the design of optimization algorithms. An important next step in this research is to gain a deeper understanding of the subtle connection between mutation operators (or equivalently landscape properties) and effective search algorithms. Until we understand this connection systematically, progress in evolutionary-based search will be haphazard at best [WM97,MW96]. This paper moves towards a deeper understanding of this issue by investigating the optimal distance at which to search for fitter variants as a function of landscape properties.

Landscapes consist of both the fitness function to be optimized and a neighborhood relationship amongst configurations in the input space. This neighborhood relation is defined by the mutation operator. The most familiar continuous landscapes are defined over ℝ^n with the natural topology induced by ℝ^n. Combinatorial optimization is concerned with discrete input spaces, and in this case the neighborhood relation is expressed by a graph whose nodes consist of all configurations in the input space and whose edges connect configurations which are mutants of one another.
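For bit-strings under single bit-flip mutation, this neighborhood graph is easy to enumerate explicitly. The following is an illustrative sketch (our code, not from the paper): the neighbors of a configuration are all strings obtained by flipping exactly one bit.

```python
def hamming_neighbors(b):
    """All configurations at Hamming distance 1 from bit-string b,
    i.e. the graph neighbors of b under single bit-flip mutation."""
    return [b[:i] + ("1" if c == "0" else "0") + b[i + 1:] for i, c in enumerate(b)]

print(hamming_neighbors("0110"))  # ['1110', '0010', '0100', '0111']
```

A string of length N has exactly N such neighbors, so the neighborhood graph is the N-dimensional Boolean hypercube.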
The prototypical example of a discrete landscape might be spin glass models [FH91,MPV87] (including the NK model we study here) defined over bit-strings with the Boolean hypercube topology induced by single bit-flip Hamming neighbors. Recent work has elucidated many properties of landscapes in terms of their underlying neighborhood graph (see [Sta95] for a detailed view of the state of the art in landscape theory). However, our understanding of the relationship between properties of landscapes and the design of effective optimization algorithms over those landscapes goes little beyond the obvious observation that the number of local peaks (under whatever mutation operator is being used) serves as a good indicator of the difficulty of the problem.

To aid in understanding the connection between landscape properties and effective algorithms, a family of tunably difficult landscape models proves useful. Presently, the most popular landscape family is Kauffman's NK model [Kau93]. This family of landscapes, inspired by spin glass models, is parameterized by N, the number of bits, and K, the epistasis between bits. By tuning K, landscapes can be generated with varying degrees of ruggedness. The decision problem associated with optimization on the NK family of landscapes is NP-complete for K ≥ 2 [Wei96], so the NK model generates difficult optimization tasks. The NK family has generated some useful results connecting landscape properties to optimization strategies. Particularly interesting is the result that recombination is most effective on landscapes of intermediate ruggedness, i.e. intermediate K (see the early work of Kauffman [Kau93] on NK landscapes and later work by Bergman et al. [BOF95a,BOF95b] using population genetic approaches).

The NK model has contributed to our empirical understanding of algorithm design, but it is too complex (for intermediate K) to yield deeper theoretical insights. What is needed is a landscape family which is complex enough to encompass difficult optimization problems and yet simple enough to permit analytic insight. To understand optimal search distances on landscapes in general, we present a family of landscapes parameterized by the number of bits N and the nearest-neighbor correlation coefficient ρ between fitnesses. We demonstrate the utility of this landscape model by determining the optimal mutation rate as a function of N and ρ.

* I thank both Bios Group L.P. and the Santa Fe Institute for support.
The paper is organized as follows. In Section 2 we demonstrate, using the NK model, that optimal search distances (or equivalently, mutation rates) should be adjusted according to current fitness. Mutation rates should be higher when the current configuration is of average fitness, and decrease as the configuration increases in fitness. In Section 3 we introduce ρ-landscapes and determine the ρ-landscape best matching a particular NK landscape. Section 4 calculates the mutation rate as a function of current fitness f, landscape correlation ρ, and the problem size N. Section 5 ties together the results of the previous sections and suggests directions for further research.

2 Search distances on NK landscapes: numerical results

Mutation rate is an important parameter in evolutionary optimization algorithms. Set too high, the search wanders over configurations, never settling into a region of high fitness. Set too low, little exploration is done and the algorithm settles onto the nearest local maximum. In this section we present a new method for optimizing mutation rates by determining the optimal search distance as a function of landscape correlation.

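As a concrete setting for the experiments described in this section, a toy NK implementation with distance-d sampling might look like the following. This is an illustrative sketch (our code, with far smaller parameters than the N = 100 runs reported below); `make_nk` and `sample_at_distance` are our names, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_nk(N, K):
    """Random NK landscape: bit i's contribution depends on itself and K
    randomly chosen other bits; contribution tables are uniform on [0, 1)."""
    if K > 0:
        neighbors = np.array([rng.choice([j for j in range(N) if j != i],
                                         size=K, replace=False) for i in range(N)])
    else:
        neighbors = np.empty((N, 0), dtype=int)
    tables = rng.random((N, 2 ** (K + 1)))
    def fitness(b):
        total = 0.0
        for i in range(N):
            idx = int(b[i])
            for j, nb in enumerate(neighbors[i]):
                idx |= int(b[nb]) << (j + 1)  # index into bit i's lookup table
            total += tables[i, idx]
        return total / N
    return fitness

def sample_at_distance(fitness, b, d, n):
    """Fitnesses of n random configurations at Hamming distance d from b."""
    out = np.empty(n)
    for s in range(n):
        flips = rng.choice(len(b), size=d, replace=False)
        b2 = b.copy()
        b2[flips] ^= 1
        out[s] = fitness(b2)
    return out

N, K = 20, 4
nk = make_nk(N, K)
start = rng.integers(0, 2, N)
for d in (1, 5, 10, 20):
    fs = sample_at_distance(nk, start, d, 500)
    print(d, round(fs.mean(), 3), round(fs.std(), 3))
```

Recording the mean and standard deviation at each distance d, as in this loop, reproduces the kind of data plotted in Figures 1 and 2.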
We imagine an algorithm that at each stage samples configurations at distance d from the current configuration. If configurations at this distance are sampled n times, what is the optimal distance at which to search for fitter variants? Simulation with the NK model provides an interesting answer.

Fig. 1. The mean ± one standard deviation of fitnesses at distance d from starting configurations having average, below average, and above average fitness for an N = 100 and K = 0 landscape.

Figures 1 and 2 present results for N = 100 landscapes with K = 0 and K = 4. The simulation begins by picking a configuration of average, below average, or better than average fitness; then random configurations at specified distances from the initial configuration are generated and their fitnesses recorded. Figures 1 and 2 plot the average fitness plus or minus its standard deviation at each distance. In all cases, the variance increases as we move away from the initial configuration and the mean moves toward the mean of the landscape. Though the mean decreases with distance, it may well be better to sample further, since the variance is higher and there is a good chance one of the samples may have quite high fitness. If the landscape is correlated, nearby configurations are constrained to have similar fitness, while points further away have no such constraint and there is the possibility of finding a much fitter variant. However, if the current configuration is already of higher than average fitness, the higher variability at further distances may not overcome the decay of the mean towards the landscape average.

Fig. 2. The mean ± one standard deviation of fitnesses at distance d from starting configurations having slightly above average, much above average, and much below average fitnesses for an N = 100 and K = 4 landscape.

If prior knowledge exists that the landscape is isotropic (meaning that its statistical properties are constant across the input space), then these results naturally suggest an optimization strategy. If current fitness is about average, then it is best to search further than a correlation length away on the landscape, where much fitter variants could conceivably be found. As current fitness increases, confine search closer to the current configuration to exploit the gains made thus far. At high fitnesses it is best to search in the immediate neighborhood, since increased variance will not compensate for the rapid drop in the mean with increasing distance. These general results depend on the amount of sampling that can be done at each distance: if more sampling is allowed, then the order statistics improve and the optimal search distance increases. To apply this result in other situations we need a deeper quantitative understanding. The next section defines a new family of landscapes in which optimal search distances can be determined.

3 A new family of landscapes

A fundamental characteristic of a landscape is the correlation between fitness values at varying distances. We start from this observation to parameterize a family of landscapes based on the correlation coefficient ρ between the fitnesses of nearest-neighbor configurations. In keeping with most landscape theory, which is statistical in nature, we approximate a landscape by a joint probability distribution. Under this annealed approximation, a landscape is characterized entirely by the distribution P(f, f′)

where f and f′ are the fitnesses of neighboring configurations x and x′. A remarkable number of landscape predictions can be made from this simple starting point.

3.1 Preliminaries

A fitness landscape consists of a fitness function f: X → ℝ and a metric structure over a search space X. In this paper we focus on discrete search spaces, in which case the metric structure over X can be represented with a directed graph G. The vertices of G consist of all points in the search space, with a directed edge (i, j) indicating that configuration x_j is a neighbor of configuration x_i. If the neighbor relation is symmetric, then the edges of G are undirected. The distance between two configurations in X is given by their distance in G. Nearest neighbors in G are at distance 1, while the maximal distance is given by the diameter of G. As a familiar example, the neighborhood graph G for bit-strings of length N with single bit-flip mutation is the hypercube graph.

As discussed above, landscapes are approximated by the joint probability P(f, f′), where x and x′ are connected by an edge in G. Formally this probability is defined by

P(f, f′) = [ Σ_{⟨x_i, x_j⟩} δ(f − f(x_i)) δ(f′ − f(x_j)) ] / [ Σ_{⟨x_i, x_j⟩} 1 ],

where the notation ⟨x_i, x_j⟩ requires that configurations x_i and x_j are neighbors in G and δ(·) is the Dirac delta function.¹ From P(f, f′) we may calculate both P(f), the probability that a randomly chosen configuration x has fitness f, and P(f′|f), the probability of a configuration having fitness f′ given that a neighboring configuration has fitness f. Formally these quantities are defined by

P(f) = ∫ df′ P(f, f′)   and   P(f′|f) = P(f, f′)/P(f).

The conditional distribution P(f′|f) is particularly useful as it gives the distribution of the fitnesses of the neighbors of a configuration having fitness f.

3.2 An application

Having defined the probabilistic approximation to landscapes, we give one simple example of an application. More interesting (and complex) applications can be found in [Mac98a].

Under a rather crude simplification it is a simple matter to estimate the number of local maxima in a fitness landscape. The probability p_0(f) that a configuration x with fitness f is of higher fitness than all of its n neighbors is

p_0(f) = ∫_{−∞}^{f} df_n ∫_{−∞}^{f} df_{n−1} ⋯ ∫_{−∞}^{f} df_1 P(f_1, …, f_n | f),

where P(f_1, …, f_n | f) is the joint probability that the n neighbors of a site with fitness f have fitnesses f_1, …, f_n. If we assume that, given f, the fitnesses f_1, …, f_n are otherwise independent of each other, then P(f_1, …, f_n | f) factors as P(f_1, …, f_n | f) = Π_{i=1}^{n} P(f_i | f), and we can write

p_0(f) = [ ∫_{−∞}^{f} df′ P(f′|f) ]^n ≡ π(f)^n,   (1)

where π(f) is the probability that a randomly chosen neighbor has a fitness lower than f. Given this result, the expected number N_p of local maxima is then

N_p = |X| ∫ df p_0(f) P(f),

where |X| is the size of the configuration space.

¹ The delta function δ(x) is defined to be zero for x ≠ 0 and satisfies ∫_I dx δ(x) = 1 if the region of integration I includes x = 0.

3.3 NK landscapes under the annealed approximation

In this section we use the annealed approximation to approximate Kauffman's NK landscapes. The NK model [Kau93] generates a family of tunably rugged energy landscapes over bit-strings b = {b_1 ⋯ b_N} of length N. Each bit b_i makes a contribution to the total fitness dependent on its own state and the states of K other bits. These K other bits may be selected at random or according to some specified connectivity. The total fitness of a bit-string b is defined by:

f{b} = (1/N) Σ_{i=1}^{N} f_i(b_i; b_{i_1}, …, b_{i_K}).

By analogy with spin glasses [FH91], the local fitness contributions f_i for each of the 2^{K+1} local bit configurations {b_i, b_{i_1}, …, b_{i_K}} are chosen at random. Normally this underlying distribution is uniform over [0,1), but for calculational simplicity we will assume it is Gaussian with mean 0 and variance σ²:

P(f_i) = exp[−f_i²/(2σ²)] / √(2πσ²).

By specializing the f_i and the selection of the K neighbors, the NK model encompasses many optimization problems, including spin glasses, graph coloring, number partitioning, etc. Much is known about the NK model in the limits K = N−1 [Der80b,Der80a,MP89,KW89] and K = 0 [Kau93].
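As a check on Eq. (1): in the fully uncorrelated limit (K = N−1 in NK terms), P(f′|f) = P(f′), so π(f) is just the marginal cumulative distribution C(f). Then p_0(f) = C(f)^N and N_p = 2^N ∫ C(f)^N P(f) df = 2^N/(N+1). A small Monte Carlo sketch (our illustrative code, not the paper's) confirms this:

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

def count_local_maxima(N):
    """Count configurations fitter than all N Hamming neighbors on a fully
    uncorrelated landscape (an independent Gaussian fitness per configuration)."""
    fit = {b: rng.normal() for b in itertools.product((0, 1), repeat=N)}
    count = 0
    for b, f in fit.items():
        if all(f > fit[b[:k] + (b[k] ^ 1,) + b[k + 1:]] for k in range(N)):
            count += 1
    return count

N = 10
avg = np.mean([count_local_maxima(N) for _ in range(20)])
print(avg, 2 ** N / (N + 1))  # empirical mean vs the analytic 2^N/(N + 1)
```

The empirical count fluctuates around the analytic value, illustrating how quickly the number of local maxima grows with N on uncorrelated landscapes.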

In [Mac98a] the relevant probabilities for the NK model are calculated as:

P(f) = √(N/(2πσ²)) exp[−N f²/(2σ²)],   (2)

P(f, f′) = (N/(2πσ²√(1−ρ²))) exp[−N (f² − 2ρ f f′ + f′²)/(2σ²(1−ρ²))],   (3)

P(f′|f) = √(N/(2πσ²(1−ρ²))) exp[−N (f′ − ρf)²/(2σ²(1−ρ²))].   (4)

The nearest-neighbor correlation coefficient is given by

ρ = 1 − (K+1)/N

and ranges from very correlated, ρ = (N−1)/N at K = 0, to completely uncorrelated, ρ = 0 at K = N−1. Within the NK family the correlation is constrained to be positive. In Section 4 we shall use Eq. (3) as defining a more general family of landscapes with correlation ranging over −1 ≤ ρ ≤ 1. A constructive procedure to generate landscapes satisfying Eq. (3) is described in [Mac98b].

Applying Equation (1), we see that the probability that a bit-string with fitness f is a local maximum is

p_0(f) = (1/2^N) (1 + erf[√(N(1−ρ)/(2σ²(1+ρ))) f])^N,

where the error function is defined as

erf[x] = (2/√π) ∫_0^x dt exp[−t²].

The error function is odd and has the limiting value lim_{x→∞} erf[x] = 1. With this brief introduction to the landscape family we move on to determine optimal search distances within this family.

4 Search distances on ρ-landscapes: analytical results

Determination of the optimal mutation distance is straightforward using ρ-landscapes. In [Mac98a] it is shown that fitnesses at distance d away from a configuration of fitness f are distributed Gaussianly with mean and variance given by:

μ(f, d) = f ρ^d,   (5)

σ²(d) = 1 − ρ^{2d}.   (6)
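One way to check Eqs. (5) and (6): they are exactly the d-step statistics obtained by iterating the one-step Gaussian conditional with unit marginal variance, i.e. a stationary Gauss-Markov (AR(1)) recursion f ← ρf + √(1−ρ²)ε with ε standard normal. A quick Monte Carlo sketch (our illustration, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(4)

rho, d, trials = 0.9, 5, 200000
f0 = 1.0
# Propagate d steps of the one-step conditional P(f'|f): a stationary
# Gauss-Markov (AR(1)) chain with unit marginal variance.
f = np.full(trials, f0)
for _ in range(d):
    f = rho * f + np.sqrt(1 - rho**2) * rng.standard_normal(trials)

print(f.mean(), f0 * rho**d)        # Eq. (5): mean ~ f rho^d
print(f.var(), 1 - rho**(2 * d))    # Eq. (6): variance ~ 1 - rho^{2d}
```

After five steps the empirical mean and variance agree with f ρ^d and 1 − ρ^{2d} to within sampling error, showing how correlation with the starting fitness decays geometrically with distance.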

If the algorithm samples n times at each distance and goes with the highest fitness obtained, f′, the distribution of maximal fitnesses P_1(f′|f; n, d) is given by the highest order statistic:

P_1(f′|f; n, d) = n P(f′|f; d) [ ∫_{−∞}^{f′} df″ P(f″|f; d) ]^{n−1} = n P(f′|f; d) C^{n−1}(f′|f; d),

where P(f′|f; d) is the conditional density of fitness f′ found at distance d from a bit-string having fitness f and C(f′|f; d) is its cumulative distribution. As noted above, this distribution is Gaussian, i.e. P(f′|f; d) = N(μ(f, d), σ²(d)). The expected maximal fitness f̄_max(f; n, d) is just the mean of P_1(f′|f; n, d). The mean is difficult to calculate exactly, but we can approximate it by

f̄_max(f; n, d) = μ(f, d) + β(n) σ(d).

The best choice for β(n) is determined in the appendix and is found to be β(n) = C^{−1}((1/2)^{1/n}). Here C(x) is the cumulative distribution function for N(0, 1) and C^{−1}(x) is its inverse; explicitly, C(x) = (1 + erf[x/√2])/2.

The optimal search distance is that distance at which f̄_max(f; n, d) is greatest. The function f̄_max(f; n, d) is well behaved as a function of d, having only a single maximum, and consequently the optimal choice d_opt is the integer nearest to the d̃ at which

∂μ(f, d)/∂d |_{d=d̃} = −β(n) ∂σ(d)/∂d |_{d=d̃}.

Using the results in Eqs. (5) and (6) we see that d̃ satisfies

f ρ^{d̃} ln ρ = β(n) (ρ^{2d̃} ln ρ)/√(1 − ρ^{2d̃}).   (7)

Solving this quadratic in ρ^{d̃} yields an explicit solution for d̃:

d̃(f, n) = −(1/2) ln[1 + (β(n)/f)²] / ln ρ.

In the case f ≤ 0, the only solution to Eq. (7) is found by driving both sides to zero as d → ∞. This corresponds to a situation in which fitness is worse than average, so it is best to search maximally far away, where the variance is greatest. The expected fitness at the optimal distance is easily calculated as

f̄_max(f; n, d = d̃) = f √(1 + (β(n)/f)²).

A plot of the optimal search distance as a function of f for various n on landscapes with different correlations is shown in Figure 3. A slight variant of this algorithm might also be applied in practice.
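The closed-form optimal distance and expected fitness derived above are straightforward to evaluate numerically. In this sketch (our code; all function names are ours) the inverse normal CDF is obtained by bisection:

```python
from math import erf, sqrt, log

def C_inv(p, lo=-10.0, hi=10.0):
    """Invert the standard normal CDF C(x) = (1 + erf(x/sqrt(2)))/2 by bisection."""
    for _ in range(80):
        mid = (lo + hi) / 2
        if (1 + erf(mid / sqrt(2))) / 2 < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def beta(n):
    """beta(n) = C^{-1}((1/2)^{1/n}), the approximate location of the maximum
    of n standard normal samples."""
    return C_inv(0.5 ** (1.0 / n))

def d_opt(f, n, rho):
    """Optimal (real-valued) search distance; for f <= 0 the optimum is to
    search as far away as possible."""
    if f <= 0:
        return float("inf")
    return -0.5 * log(1 + (beta(n) / f) ** 2) / log(rho)

def f_max(f, n):
    """Expected best fitness found when sampling at the optimal distance."""
    return f * sqrt(1 + (beta(n) / f) ** 2)

rho, n = 0.9, 10
for f in (0.1, 0.5, 1.0):
    print(f, round(d_opt(f, n, rho), 2), round(f_max(f, n), 3))
```

The loop reproduces the qualitative behavior of Figure 3: the optimal distance shrinks as the current fitness f grows.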
Rather than have the algorithm continue to sample n times at each distance, it may be more efficient to terminate the sampling if a very high fitness is encountered. The analysis of this situation is more complicated and requires an appeal to dynamic programming. A complete solution to this problem (including the possibility of a cost of search) is given in [KLM98] in an economic context.

Fig. 3. Optimal search distance for a ρ = 0.9 landscape as a function of the initial fitness f and the number of samples n.

5 Conclusions

We have outlined the framework for a new family of landscapes that allows analytic insight into difficult optimization problems. The aim in designing a new landscape family is to attempt an understanding of the connection between effective optimization algorithms and prior properties of the problem. In this simple case we have accomplished that goal by deriving an optimal parameter setting within a narrow class of algorithms based on mutation alone. We described the new family of ρ-landscapes, motivated by an annealed approximation and results for the NK model. A constructive technique for generating such landscapes based on Gaussian processes is described in [Mac98b]. We then described a family of algorithms which search at fitness-dependent distances and determined the optimal search distance for the ρ-landscape family.

These explorations are preliminary but encourage further research. For more complicated algorithms, like evolutionary algorithms and simulated annealing, we may be able to determine parameter settings exactly as a function of ρ. Moreover, a greater understanding of ρ-landscapes might suggest new ρ-dependent optimization techniques.

Acknowledgements

I would like to thank Stuart Kauffman and Jose Lobo for ideas and encouragement.

References

[BOF95a] A. Bergman, S. P. Otto, and M. W. Feldman. On the evolution of recombination in haploids and diploids I. Deterministic models. Complexity, 1:57–67, 1995.
[BOF95b] A. Bergman, S. P. Otto, and M. W. Feldman. On the evolution of recombination in haploids and diploids II. Stochastic models. Complexity, 49–57, 1995.

[Der80a] B. Derrida. The random energy model. Phys. Rep., 67:29–35, 1980.
[Der80b] B. Derrida. Random energy model: Limit of a family of disordered models. Phys. Rev. Lett., 45:79–82, 1980.
[FH91] K. H. Fischer and J. A. Hertz. Spin Glasses. Cambridge University Press, 1991.
[Kau93] S. A. Kauffman. The Origins of Order. Oxford University Press, New York, Oxford, 1993.
[KLM98] S. A. Kauffman, J. Lobo, and W. G. Macready. Optimal search on a technology landscape. In review at Econometrica, 1998.
[KW89] S. A. Kauffman and E. D. Weinberger. The NK model of rugged fitness landscapes and its application to maturation of the immune response. J. Theor. Biol., 141:211–245, 1989.
[Mac98a] W. G. Macready. An annealed theory of landscapes: part 1. In preparation, 1998.
[Mac98b] W. G. Macready. An annealed theory of landscapes: part 2. In preparation, 1998.
[MP89] C. A. Macken and A. S. Perelson. Protein evolution on rugged landscapes. Proc. Natl. Acad. Sci. USA, 86:6191–6195, 1989.
[MPV87] M. Mezard, G. Parisi, and M. A. Virasoro. Spin Glass Theory and Beyond. World Scientific, Singapore, 1987.
[MW96] W. G. Macready and D. H. Wolpert. What makes an optimization problem hard? Complexity, 5:40–46, 1996.
[Sta95] P. F. Stadler. Towards a theory of landscapes. In R. Lopez-Peña, R. Capovilla, R. Garcia-Pelayo, H. Waelbroeck, and F. Zertuche, editors, Complex Systems and Binary Networks. Springer-Verlag, Berlin, 1995.
[Wei96] E. D. Weinberger. NP completeness of Kauffman's NK model, a tuneably rugged fitness landscape. Santa Fe Institute technical report 1996-02-003, 1996.
[WM97] D. H. Wolpert and W. G. Macready. No free lunch theorems for optimization. IEEE Trans. Evol. Comp., 1:67–82, 1997.

A Determination of β(n)

Here we present a simple motivation for our approximation to the largest order statistic. Let P(x) be a probability distribution and C(x) its cumulative distribution. The probability distribution for the largest of n samples from P(x) is given by n P(x) C^{n−1}(x) = ∂_x C^n(x).
Integrating by parts, the expected value of the largest of the n samples, x̄ⁿ_max, is

x̄ⁿ_max = x C^n(x) |_{−∞}^{∞} − ∫ dx C^n(x).

For large n, C^n(x) approaches a step function at some value x_c(n) that increases with n. If we make this assumption, then C^n(x) = θ(x − x_c(n)), and the above integral gives x̄ⁿ_max = x_c(n). So how do we determine x_c(n)? We simply assume that the cutoff is at the value where C^n(x) = 1/2, giving x_c(n) = C^{−1}((1/2)^{1/n}). For Gaussian P(x) this estimate of x̄ⁿ_max is off by 3% at n = 2 and becomes more accurate with larger n.
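The quality of x_c(n) = C^{−1}((1/2)^{1/n}) is easy to probe by Monte Carlo. The sketch below (our code) compares it with the sample mean of the largest of n standard normal draws:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)

def C_inv(p, lo=-10.0, hi=10.0):
    """Invert the standard normal CDF C(x) = (1 + erf(x/sqrt(2)))/2 by bisection."""
    for _ in range(80):
        mid = (lo + hi) / 2
        if (1 + erf(mid / sqrt(2))) / 2 < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for n in (2, 10, 100):
    approx = C_inv(0.5 ** (1.0 / n))                      # x_c(n)
    exact = rng.standard_normal((50000, n)).max(axis=1).mean()
    print(n, round(approx, 3), round(exact, 3))
```

For n = 2 the gap is a few percent, consistent with the 3% figure above, and it shrinks as n grows.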


Competing sources of variance reduction in parallel replica Monte Carlo, and optimization in the low temperature limit Competing sources of variance reduction in parallel replica Monte Carlo, and optimization in the low temperature limit Paul Dupuis Division of Applied Mathematics Brown University IPAM (J. Doll, M. Snarski,

More information

Upper Bounds on the Time and Space Complexity of Optimizing Additively Separable Functions

Upper Bounds on the Time and Space Complexity of Optimizing Additively Separable Functions Upper Bounds on the Time and Space Complexity of Optimizing Additively Separable Functions Matthew J. Streeter Computer Science Department and Center for the Neural Basis of Cognition Carnegie Mellon University

More information

Zebo Peng Embedded Systems Laboratory IDA, Linköping University

Zebo Peng Embedded Systems Laboratory IDA, Linköping University TDTS 01 Lecture 8 Optimization Heuristics for Synthesis Zebo Peng Embedded Systems Laboratory IDA, Linköping University Lecture 8 Optimization problems Heuristic techniques Simulated annealing Genetic

More information

Markov Chains and MCMC

Markov Chains and MCMC Markov Chains and MCMC Markov chains Let S = {1, 2,..., N} be a finite set consisting of N states. A Markov chain Y 0, Y 1, Y 2,... is a sequence of random variables, with Y t S for all points in time

More information

Kazuyuki Tanaka s work on AND-OR trees and subsequent development

Kazuyuki Tanaka s work on AND-OR trees and subsequent development Kazuyuki Tanaka s work on AND-OR trees and subsequent development Toshio Suzuki Department of Math. and Information Sciences, Tokyo Metropolitan University, CTFM 2015, Tokyo Institute of Technology September

More information

Systems Biology: A Personal View IX. Landscapes. Sitabhra Sinha IMSc Chennai

Systems Biology: A Personal View IX. Landscapes. Sitabhra Sinha IMSc Chennai Systems Biology: A Personal View IX. Landscapes Sitabhra Sinha IMSc Chennai Fitness Landscapes Sewall Wright pioneered the description of how genotype or phenotypic fitness are related in terms of a fitness

More information

Fast Evolution Strategies. Xin Yao and Yong Liu. University College, The University of New South Wales. Abstract

Fast Evolution Strategies. Xin Yao and Yong Liu. University College, The University of New South Wales. Abstract Fast Evolution Strategies Xin Yao and Yong Liu Computational Intelligence Group, School of Computer Science University College, The University of New South Wales Australian Defence Force Academy, Canberra,

More information

4.1 Notation and probability review

4.1 Notation and probability review Directed and undirected graphical models Fall 2015 Lecture 4 October 21st Lecturer: Simon Lacoste-Julien Scribe: Jaime Roquero, JieYing Wu 4.1 Notation and probability review 4.1.1 Notations Let us recall

More information

Population Genetics: a tutorial

Population Genetics: a tutorial : a tutorial Institute for Science and Technology Austria ThRaSh 2014 provides the basic mathematical foundation of evolutionary theory allows a better understanding of experiments allows the development

More information

5. Simulated Annealing 5.1 Basic Concepts. Fall 2010 Instructor: Dr. Masoud Yaghini

5. Simulated Annealing 5.1 Basic Concepts. Fall 2010 Instructor: Dr. Masoud Yaghini 5. Simulated Annealing 5.1 Basic Concepts Fall 2010 Instructor: Dr. Masoud Yaghini Outline Introduction Real Annealing and Simulated Annealing Metropolis Algorithm Template of SA A Simple Example References

More information

Local and Stochastic Search

Local and Stochastic Search RN, Chapter 4.3 4.4; 7.6 Local and Stochastic Search Some material based on D Lin, B Selman 1 Search Overview Introduction to Search Blind Search Techniques Heuristic Search Techniques Constraint Satisfaction

More information

Constrained Leja points and the numerical solution of the constrained energy problem

Constrained Leja points and the numerical solution of the constrained energy problem Journal of Computational and Applied Mathematics 131 (2001) 427 444 www.elsevier.nl/locate/cam Constrained Leja points and the numerical solution of the constrained energy problem Dan I. Coroian, Peter

More information

2-bit Flip Mutation Elementary Fitness Landscapes

2-bit Flip Mutation Elementary Fitness Landscapes RN/10/04 Research 15 September 2010 Note 2-bit Flip Mutation Elementary Fitness Landscapes Presented at Dagstuhl Seminar 10361, Theory of Evolutionary Algorithms, 8 September 2010 Fax: +44 (0)171 387 1397

More information

Standard Particle Swarm Optimisation

Standard Particle Swarm Optimisation Standard Particle Swarm Optimisation From 2006 to 2011 Maurice.Clerc@WriteMe.com 2012-09-23 version 1 Introduction Since 2006, three successive standard PSO versions have been put on line on the Particle

More information

Statistical Complexity of Simple 1D Spin Systems

Statistical Complexity of Simple 1D Spin Systems Statistical Complexity of Simple 1D Spin Systems James P. Crutchfield Physics Department, University of California, Berkeley, CA 94720-7300 and Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, NM 87501

More information

Stochastic Search: Part 2. Genetic Algorithms. Vincent A. Cicirello. Robotics Institute. Carnegie Mellon University

Stochastic Search: Part 2. Genetic Algorithms. Vincent A. Cicirello. Robotics Institute. Carnegie Mellon University Stochastic Search: Part 2 Genetic Algorithms Vincent A. Cicirello Robotics Institute Carnegie Mellon University 5000 Forbes Avenue Pittsburgh, PA 15213 cicirello@ri.cmu.edu 1 The Genetic Algorithm (GA)

More information

A Comparison of GAs Penalizing Infeasible Solutions and Repairing Infeasible Solutions on the 0-1 Knapsack Problem

A Comparison of GAs Penalizing Infeasible Solutions and Repairing Infeasible Solutions on the 0-1 Knapsack Problem A Comparison of GAs Penalizing Infeasible Solutions and Repairing Infeasible Solutions on the 0-1 Knapsack Problem Jun He 1, Yuren Zhou 2, and Xin Yao 3 1 J. He is with the Department of Computer Science,

More information

LECTURE 15 + C+F. = A 11 x 1x1 +2A 12 x 1x2 + A 22 x 2x2 + B 1 x 1 + B 2 x 2. xi y 2 = ~y 2 (x 1 ;x 2 ) x 2 = ~x 2 (y 1 ;y 2 1

LECTURE 15 + C+F. = A 11 x 1x1 +2A 12 x 1x2 + A 22 x 2x2 + B 1 x 1 + B 2 x 2. xi y 2 = ~y 2 (x 1 ;x 2 ) x 2 = ~x 2 (y 1 ;y 2  1 LECTURE 5 Characteristics and the Classication of Second Order Linear PDEs Let us now consider the case of a general second order linear PDE in two variables; (5.) where (5.) 0 P i;j A ij xix j + P i,

More information

Quantum Annealing and the Satisfiability Problem

Quantum Annealing and the Satisfiability Problem arxiv:1612.7258v1 [quant-ph] 21 Dec 216 Quantum Annealing and the Satisfiability Problem 1. Introduction Kristen L PUDENZ 1, Gregory S TALLANT, Todd R BELOTE, and Steven H ADACHI Lockheed Martin, United

More information

When to Use Bit-Wise Neutrality

When to Use Bit-Wise Neutrality When to Use Bit-Wise Neutrality Tobias Friedrich Department 1: Algorithms and Complexity Max-Planck-Institut für Informatik Saarbrücken, Germany Frank Neumann Department 1: Algorithms and Complexity Max-Planck-Institut

More information

Evolutionary Programming Using a Mixed Strategy Adapting to Local Fitness Landscape

Evolutionary Programming Using a Mixed Strategy Adapting to Local Fitness Landscape Evolutionary Programming Using a Mixed Strategy Adapting to Local Fitness Landscape Liang Shen Department of Computer Science Aberystwyth University Ceredigion, SY23 3DB UK lls08@aber.ac.uk Jun He Department

More information

ESANN'1999 proceedings - European Symposium on Artificial Neural Networks Bruges (Belgium), April 1999, D-Facto public., ISBN X, pp.

ESANN'1999 proceedings - European Symposium on Artificial Neural Networks Bruges (Belgium), April 1999, D-Facto public., ISBN X, pp. Statistical mechanics of support vector machines Arnaud Buhot and Mirta B. Gordon Department de Recherche Fondamentale sur la Matiere Condensee CEA-Grenoble, 17 rue des Martyrs, 38054 Grenoble Cedex 9,

More information

3 Undirected Graphical Models

3 Undirected Graphical Models Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.438 Algorithms For Inference Fall 2014 3 Undirected Graphical Models In this lecture, we discuss undirected

More information

Variable Objective Search

Variable Objective Search Variable Objective Search Sergiy Butenko, Oleksandra Yezerska, and Balabhaskar Balasundaram Abstract This paper introduces the variable objective search framework for combinatorial optimization. The method

More information

An Effective Chromosome Representation for Evolving Flexible Job Shop Schedules

An Effective Chromosome Representation for Evolving Flexible Job Shop Schedules An Effective Chromosome Representation for Evolving Flexible Job Shop Schedules Joc Cing Tay and Djoko Wibowo Intelligent Systems Lab Nanyang Technological University asjctay@ntuedusg Abstract As the Flexible

More information

COMS 4721: Machine Learning for Data Science Lecture 10, 2/21/2017

COMS 4721: Machine Learning for Data Science Lecture 10, 2/21/2017 COMS 4721: Machine Learning for Data Science Lecture 10, 2/21/2017 Prof. John Paisley Department of Electrical Engineering & Data Science Institute Columbia University FEATURE EXPANSIONS FEATURE EXPANSIONS

More information

3.4 Relaxations and bounds

3.4 Relaxations and bounds 3.4 Relaxations and bounds Consider a generic Discrete Optimization problem z = min{c(x) : x X} with an optimal solution x X. In general, the algorithms generate not only a decreasing sequence of upper

More information

Distribution of Environments in Formal Measures of Intelligence: Extended Version

Distribution of Environments in Formal Measures of Intelligence: Extended Version Distribution of Environments in Formal Measures of Intelligence: Extended Version Bill Hibbard December 2008 Abstract This paper shows that a constraint on universal Turing machines is necessary for Legg's

More information

data lam=36.9 lam=6.69 lam=4.18 lam=2.92 lam=2.21 time max wavelength modulus of max wavelength cycle

data lam=36.9 lam=6.69 lam=4.18 lam=2.92 lam=2.21 time max wavelength modulus of max wavelength cycle AUTOREGRESSIVE LINEAR MODELS AR(1) MODELS The zero-mean AR(1) model x t = x t,1 + t is a linear regression of the current value of the time series on the previous value. For > 0 it generates positively

More information

The Evolution of Gene Dominance through the. Baldwin Effect

The Evolution of Gene Dominance through the. Baldwin Effect The Evolution of Gene Dominance through the Baldwin Effect Larry Bull Computer Science Research Centre Department of Computer Science & Creative Technologies University of the West of England, Bristol

More information

k-protected VERTICES IN BINARY SEARCH TREES

k-protected VERTICES IN BINARY SEARCH TREES k-protected VERTICES IN BINARY SEARCH TREES MIKLÓS BÓNA Abstract. We show that for every k, the probability that a randomly selected vertex of a random binary search tree on n nodes is at distance k from

More information

1 Introduction Duality transformations have provided a useful tool for investigating many theories both in the continuum and on the lattice. The term

1 Introduction Duality transformations have provided a useful tool for investigating many theories both in the continuum and on the lattice. The term SWAT/102 U(1) Lattice Gauge theory and its Dual P. K. Coyle a, I. G. Halliday b and P. Suranyi c a Racah Institute of Physics, Hebrew University of Jerusalem, Jerusalem 91904, Israel. b Department ofphysics,

More information

A variational approach to Ising spin glasses in finite dimensions

A variational approach to Ising spin glasses in finite dimensions . Phys. A: Math. Gen. 31 1998) 4127 4140. Printed in the UK PII: S0305-447098)89176-2 A variational approach to Ising spin glasses in finite dimensions R Baviera, M Pasquini and M Serva Dipartimento di

More information

Linearly-solvable Markov decision problems

Linearly-solvable Markov decision problems Advances in Neural Information Processing Systems 2 Linearly-solvable Markov decision problems Emanuel Todorov Department of Cognitive Science University of California San Diego todorov@cogsci.ucsd.edu

More information

3D HP Protein Folding Problem using Ant Algorithm

3D HP Protein Folding Problem using Ant Algorithm 3D HP Protein Folding Problem using Ant Algorithm Fidanova S. Institute of Parallel Processing BAS 25A Acad. G. Bonchev Str., 1113 Sofia, Bulgaria Phone: +359 2 979 66 42 E-mail: stefka@parallel.bas.bg

More information

1 Introduction (January 21)

1 Introduction (January 21) CS 97: Concrete Models of Computation Spring Introduction (January ). Deterministic Complexity Consider a monotonically nondecreasing function f : {,,..., n} {, }, where f() = and f(n) =. We call f a step

More information

Evolutionary Computation

Evolutionary Computation Evolutionary Computation - Computational procedures patterned after biological evolution. - Search procedure that probabilistically applies search operators to set of points in the search space. - Lamarck

More information

On reaching head-to-tail ratios for balanced and unbalanced coins

On reaching head-to-tail ratios for balanced and unbalanced coins Journal of Statistical Planning and Inference 0 (00) 0 0 www.elsevier.com/locate/jspi On reaching head-to-tail ratios for balanced and unbalanced coins Tamas Lengyel Department of Mathematics, Occidental

More information

Boxlets: a Fast Convolution Algorithm for. Signal Processing and Neural Networks. Patrice Y. Simard, Leon Bottou, Patrick Haner and Yann LeCun

Boxlets: a Fast Convolution Algorithm for. Signal Processing and Neural Networks. Patrice Y. Simard, Leon Bottou, Patrick Haner and Yann LeCun Boxlets: a Fast Convolution Algorithm for Signal Processing and Neural Networks Patrice Y. Simard, Leon Bottou, Patrick Haner and Yann LeCun AT&T Labs-Research 100 Schultz Drive, Red Bank, NJ 07701-7033

More information

Measures for information propagation in Boolean networks

Measures for information propagation in Boolean networks Physica D 227 (2007) 100 104 www.elsevier.com/locate/physd Measures for information propagation in Boolean networks Pauli Rämö a,, Stuart Kauffman b, Juha Kesseli a, Olli Yli-Harja a a Institute of Signal

More information

Notes on Dantzig-Wolfe decomposition and column generation

Notes on Dantzig-Wolfe decomposition and column generation Notes on Dantzig-Wolfe decomposition and column generation Mette Gamst November 11, 2010 1 Introduction This note introduces an exact solution method for mathematical programming problems. The method is

More information

Metaheuristics and Local Search

Metaheuristics and Local Search Metaheuristics and Local Search 8000 Discrete optimization problems Variables x 1,..., x n. Variable domains D 1,..., D n, with D j Z. Constraints C 1,..., C m, with C i D 1 D n. Objective function f :

More information

Upper and Lower Bounds on the Number of Faults. a System Can Withstand Without Repairs. Cambridge, MA 02139

Upper and Lower Bounds on the Number of Faults. a System Can Withstand Without Repairs. Cambridge, MA 02139 Upper and Lower Bounds on the Number of Faults a System Can Withstand Without Repairs Michel Goemans y Nancy Lynch z Isaac Saias x Laboratory for Computer Science Massachusetts Institute of Technology

More information

Controlling chaos in random Boolean networks

Controlling chaos in random Boolean networks EUROPHYSICS LETTERS 20 March 1997 Europhys. Lett., 37 (9), pp. 597-602 (1997) Controlling chaos in random Boolean networks B. Luque and R. V. Solé Complex Systems Research Group, Departament de Fisica

More information

Iterative procedure for multidimesional Euler equations Abstracts A numerical iterative scheme is suggested to solve the Euler equations in two and th

Iterative procedure for multidimesional Euler equations Abstracts A numerical iterative scheme is suggested to solve the Euler equations in two and th Iterative procedure for multidimensional Euler equations W. Dreyer, M. Kunik, K. Sabelfeld, N. Simonov, and K. Wilmanski Weierstra Institute for Applied Analysis and Stochastics Mohrenstra e 39, 07 Berlin,

More information

Metaheuristics and Local Search. Discrete optimization problems. Solution approaches

Metaheuristics and Local Search. Discrete optimization problems. Solution approaches Discrete Mathematics for Bioinformatics WS 07/08, G. W. Klau, 31. Januar 2008, 11:55 1 Metaheuristics and Local Search Discrete optimization problems Variables x 1,...,x n. Variable domains D 1,...,D n,

More information

Newton, Fermat, and Exactly Realizable Sequences

Newton, Fermat, and Exactly Realizable Sequences 1 2 3 47 6 23 11 Journal of Integer Sequences, Vol. 8 (2005), Article 05.1.2 Newton, Fermat, and Exactly Realizable Sequences Bau-Sen Du Institute of Mathematics Academia Sinica Taipei 115 TAIWAN mabsdu@sinica.edu.tw

More information

on a Stochastic Current Waveform Urbana, Illinois Dallas, Texas Abstract

on a Stochastic Current Waveform Urbana, Illinois Dallas, Texas Abstract Electromigration Median Time-to-Failure based on a Stochastic Current Waveform by Farid Najm y, Ibrahim Hajj y, and Ping Yang z y Coordinated Science Laboratory z VLSI Design Laboratory University of Illinois

More information

2 Differential Evolution and its Control Parameters

2 Differential Evolution and its Control Parameters COMPETITIVE DIFFERENTIAL EVOLUTION AND GENETIC ALGORITHM IN GA-DS TOOLBOX J. Tvrdík University of Ostrava 1 Introduction The global optimization problem with box constrains is formed as follows: for a

More information

Unit 1A: Computational Complexity

Unit 1A: Computational Complexity Unit 1A: Computational Complexity Course contents: Computational complexity NP-completeness Algorithmic Paradigms Readings Chapters 3, 4, and 5 Unit 1A 1 O: Upper Bounding Function Def: f(n)= O(g(n)) if

More information

1.5 Sequence alignment

1.5 Sequence alignment 1.5 Sequence alignment The dramatic increase in the number of sequenced genomes and proteomes has lead to development of various bioinformatic methods and algorithms for extracting information (data mining)

More information

Monte Carlo Methods for Statistical Inference: Variance Reduction Techniques

Monte Carlo Methods for Statistical Inference: Variance Reduction Techniques Monte Carlo Methods for Statistical Inference: Variance Reduction Techniques Hung Chen hchen@math.ntu.edu.tw Department of Mathematics National Taiwan University 3rd March 2004 Meet at NS 104 On Wednesday

More information

Parallel Genetic Algorithms

Parallel Genetic Algorithms Parallel Genetic Algorithms for the Calibration of Financial Models Riccardo Gismondi June 13, 2008 High Performance Computing in Finance and Insurance Research Institute for Computational Methods Vienna

More information

The Complexity of Maximum. Matroid-Greedoid Intersection and. Weighted Greedoid Maximization

The Complexity of Maximum. Matroid-Greedoid Intersection and. Weighted Greedoid Maximization Department of Computer Science Series of Publications C Report C-2004-2 The Complexity of Maximum Matroid-Greedoid Intersection and Weighted Greedoid Maximization Taneli Mielikäinen Esko Ukkonen University

More information

ARTIFICIAL INTELLIGENCE LABORATORY. and CENTER FOR BIOLOGICAL INFORMATION PROCESSING. A.I. Memo No August Federico Girosi.

ARTIFICIAL INTELLIGENCE LABORATORY. and CENTER FOR BIOLOGICAL INFORMATION PROCESSING. A.I. Memo No August Federico Girosi. MASSACHUSETTS INSTITUTE OF TECHNOLOGY ARTIFICIAL INTELLIGENCE LABORATORY and CENTER FOR BIOLOGICAL INFORMATION PROCESSING WHITAKER COLLEGE A.I. Memo No. 1287 August 1991 C.B.I.P. Paper No. 66 Models of

More information

6.207/14.15: Networks Lecture 12: Generalized Random Graphs

6.207/14.15: Networks Lecture 12: Generalized Random Graphs 6.207/14.15: Networks Lecture 12: Generalized Random Graphs 1 Outline Small-world model Growing random networks Power-law degree distributions: Rich-Get-Richer effects Models: Uniform attachment model

More information

Density Approximation Based on Dirac Mixtures with Regard to Nonlinear Estimation and Filtering

Density Approximation Based on Dirac Mixtures with Regard to Nonlinear Estimation and Filtering Density Approximation Based on Dirac Mixtures with Regard to Nonlinear Estimation and Filtering Oliver C. Schrempf, Dietrich Brunn, Uwe D. Hanebeck Intelligent Sensor-Actuator-Systems Laboratory Institute

More information

Computation Of Asymptotic Distribution. For Semiparametric GMM Estimators. Hidehiko Ichimura. Graduate School of Public Policy

Computation Of Asymptotic Distribution. For Semiparametric GMM Estimators. Hidehiko Ichimura. Graduate School of Public Policy Computation Of Asymptotic Distribution For Semiparametric GMM Estimators Hidehiko Ichimura Graduate School of Public Policy and Graduate School of Economics University of Tokyo A Conference in honor of

More information

Performance of Evolutionary Algorithms on NK Landscapes with Nearest Neighbor Interactions and Tunable Overlap

Performance of Evolutionary Algorithms on NK Landscapes with Nearest Neighbor Interactions and Tunable Overlap Performance of Evolutionary Algorithms on NK Landscapes with Nearest Neighbor Interactions and Tunable Overlap Martin Pelikan, Kumara Sastry, David E. Goldberg, Martin V. Butz, and Mark Hauschild Missouri

More information

In: Proc. BENELEARN-98, 8th Belgian-Dutch Conference on Machine Learning, pp 9-46, 998 Linear Quadratic Regulation using Reinforcement Learning Stephan ten Hagen? and Ben Krose Department of Mathematics,

More information

( ) T. Reading. Lecture 22. Definition of Covariance. Imprinting Multiple Patterns. Characteristics of Hopfield Memory

( ) T. Reading. Lecture 22. Definition of Covariance. Imprinting Multiple Patterns. Characteristics of Hopfield Memory Part 3: Autonomous Agents /8/07 Reading Lecture 22 Flake, ch. 20 ( Genetics and Evolution ) /8/07 /8/07 2 Imprinting Multiple Patterns Let x, x 2,, x p be patterns to be imprinted Define the sum-of-outer-products

More information

Lecture 2 : CS6205 Advanced Modeling and Simulation

Lecture 2 : CS6205 Advanced Modeling and Simulation Lecture 2 : CS6205 Advanced Modeling and Simulation Lee Hwee Kuan 21 Aug. 2013 For the purpose of learning stochastic simulations for the first time. We shall only consider probabilities on finite discrete

More information

A New Variation of Hat Guessing Games

A New Variation of Hat Guessing Games A New Variation of Hat Guessing Games Tengyu Ma 1, Xiaoming Sun 1, and Huacheng Yu 1 Institute for Theoretical Computer Science Tsinghua University, Beijing, China Abstract. Several variations of hat guessing

More information