PROBABILISTIC ANALYSIS OF THE (1 + 1)-EVOLUTIONARY ALGORITHM


1 /59 PROBABILISTIC ANALYSIS OF THE (1 + 1)-EVOLUTIONARY ALGORITHM Hsien-Kuei Hwang (joint with Alois Panholzer, Nicolas Rolin, Tsung-Hsi Tsai, Wei-Mei Chen) October 13, 2017

2 /59 MIT: EVOLUTIONARY COMPUTATION

3 3/59 WE ARE ALWAYS SEARCHING & RESEARCHING: Life, Algorithms, Searching. Life = Algorithms?

4 4/59 OPTIMIZATION PROBLEMS EVERYWHERE: conflicting criteria, impasses, intractability, dilemmas, quandaries, messes, obstacles, puzzles. Life = NP-hard?

5 5/59 SEARCH ALGORITHMS: Backtracking, Branch-and-bound, Greedy, Dynamic programming, Simulated annealing, Evolutionary algorithms, Ant colony optimization, Particle swarm, Tabu search, GRASP, ... (the last several being meta-heuristics)

6 6/59 EVOLUTIONARY ALGORITHMS

7 7/59 EVOLUTIONARY ALGORITHM The use of Darwinian principles for automated problem solving originated in the 1950s. Darwin's theory of evolution: survival of the fittest. Stochastic evolution on computers: cultivating problem solutions instead of calculating them; randomized search heuristics; generate and test (or trial and error); useful for global optimization if the problem is too complex to be handled by an exact method, or no exact method is available. Pioneers: John Holland, Lawrence J. Fogel, Ingo Rechenberg, ...

8 8/59 EVOLUTIONARY ALGORITHMS (EAs) Publication counts via Anne-Wil Harzing's software Publish or Perish. Most popular: multiobjective optimization problems.

9 9/59 ELITISM IN MULTIOBJECTIVE EVOLUTIONARY ALGORITHMS Our motivation: from maxima (skylines) to elites to EAs.

10 10/59 COMPONENTS OF EAs Representation: coding of solutions. Initialization. Parent selection. Evaluation: fitness function. Survivor selection. Reproduction: genetic operators. Termination condition. [Flowchart: initialize population → randomly vary individuals → evaluate fitness → apply selection (offspring) → stop? yes: output; no: loop.]

11 11/59 TYPICAL PROGRESS OF AN EA Initialization Halfway Termination

12 12/59 PROS AND CONS OF EAs Disadvantages: large convergence time; difficult adjustment of parameters; heuristic principle; no guarantee of a global maximum. Advantages: reasonably good solutions quickly; suitable for complex search spaces; easy to parallelize; scalable to higher-dimensional problems.

13 13/59 DIFFICULTY OF ANALYSIS OF EA A typical EA comprises several ingredients: coding of solutions; a population of individuals; selection for reproduction; operations for breeding new individuals; a fitness function to evaluate the new individual; ... Mathematical description of the dynamics of the algorithms, or of the asymptotics of the complexity ⇒ challenging.

14 14/59 Droste et al. (2002): Theory is far behind experimental knowledge... rigorous research is hard to find. Applications Theory Algorithms

15 15/59 SIMPLEST VERSION: 1 PARENT, 1 CHILD, AND MUTATION ONLY Algorithm (1+1)-EA: 1. Choose an initial string x ∈ {0,1}^n uniformly at random. 2. Repeat until a terminating condition holds: (mutation) create y by flipping each bit of x independently with probability p; replace x by y iff f(y) ≥ f(x). Here f is the fitness (or objective) function.
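The algorithm above can be sketched in a few lines of Python. This is an illustration, not code from the talk (function and variable names are my own); it uses the ONEMAX fitness of the next slide and the standard mutation rate p = 1/n:

```python
import random

def one_plus_one_ea(n, fitness, p=None, rng=None, max_steps=10**7):
    """(1+1)-EA: one parent, one child per step; accept the child iff not worse."""
    rng = rng or random.Random()
    p = 1.0 / n if p is None else p
    x = [rng.randint(0, 1) for _ in range(n)]    # uniform initial string
    fx, steps = fitness(x), 0
    while fx < n and steps < max_steps:          # optimum for ONEMAX is f(x) = n
        y = [b ^ (rng.random() < p) for b in x]  # flip each bit independently
        fy = fitness(y)
        if fy >= fx:                             # elitist acceptance: f(y) >= f(x)
            x, fx = y, fy
        steps += 1
    return steps

onemax = lambda x: sum(x)                        # ONEMAX fitness
print(one_plus_one_ea(50, onemax, rng=random.Random(1)))
```

Averaging the returned step counts over repeated runs reproduces the e·n·log n growth quoted on the following slides.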

16 16/59 ANALYSIS OF (1+1)-EA UNDER ONEMAX Known results for ONEMAX f(x) = x_1 + ⋯ + x_n. X_n := # steps used by the (1+1)-EA to reach the optimum state f(x) = n when the mutation rate is 1/n. Bäck (1992): transition probabilities. Mühlenbein (1992): E(X_n) ≈ n log n. Droste et al. (1998, 2002): E(X_n) = Θ(n log n). Bounds for the ONEMAX function and linear functions Σ w_i x_i — lower bounds: Doerr et al. (2010): (1 − o(1)) en log n; Sudholt (2010): en log n − 2n log log n; upper bounds: Jägersküpper (2011): 2.02 en log n; Doerr et al. (2010): 1.39 en log n; Doerr et al. (2011): en log n to within Θ(n); Witt (2013): en log n + O(n). Approaches used: Markov chain, martingale, coupon collection, ...

17 17/59 KNOWN BOUNDS FOR E(X_n) UNDER ONEMAX Upper bounds: Droste et al. (2002): en(log n + γ); Doerr et al. (2011): en log n − Θ(n). Our result: E(X_n) = en log n + c_1 n + O(log n), with c_1 = e(γ − log 2 + φ_1(1/2)). Lower bounds: Lehre & Witt (2014): en log n − cn − O(log n); Doerr et al. (2011): en log n − Θ(n); Sudholt (2010): en log n − 2n log log n; Doerr et al. (2010): (1 − o(1)) en log n; Droste et al. (2002): 0.196 n log n.

18 18/59 GARNIER, KALLEL & SCHOENAUER (1999) Rigorous hitting times for binary mutations. Strongest results obtained so far, but the proof is incomplete (probabilistic arguments): E(X_n) = en log n + c_1 n + o(n), and X_n/(en) − log n converges in distribution to a shifted double-exponential (Gumbel-type) law of the form c − log Exp(1). Their results had remained obscure in the EA literature.

19 19/59 OUR RESULTS E(X_n) = en log n + c_1 n + (e/2) log n + c_2 + O( log n / n ), with c_1 = e( γ − log 2 + φ_1(1/2) ), where γ is Euler's constant, φ_1(z) := ∫_0^z ( 1/S_1(t) − 1/t ) dt, and S_1(z) := Σ_{l≥1} (z^l/l!) Σ_{0≤j<l} (l − j)(1 − z)^j/j!. Indeed E(X_n) admits a full asymptotic expansion of the form E(X_n) ∼ n Σ_{k≥0} ( c_k' log n + c_k ) n^{−k}.
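The constant c_1 above can be evaluated numerically. The following is a sketch (the truncation level and the midpoint-rule integration are my own numerical choices, not from the talk); the integrand of φ_1 is finite at t = 0 because S_1(t) = t + O(t²):

```python
import math

def S1(z, L=25):
    # S_1(z) = sum_{l>=1} z^l/l! * sum_{0<=j<l} (l - j)(1 - z)^j / j!
    return sum(z**l / math.factorial(l)
               * sum((l - j) * (1 - z)**j / math.factorial(j) for j in range(l))
               for l in range(1, L + 1))

def phi1(z, steps=2000):
    # phi_1(z) = int_0^z (1/S_1(t) - 1/t) dt, by the midpoint rule
    h = z / steps
    return sum((1.0 / S1((k + 0.5) * h) - 1.0 / ((k + 0.5) * h)) * h
               for k in range(steps))

gamma = 0.5772156649015329                       # Euler's constant
c1 = math.e * (gamma - math.log(2) + phi1(0.5))
print(c1)                                        # a small negative constant
```

Since S_1(t) > t on (0, 1), φ_1(1/2) is negative, and so is c_1, consistent with the lower bounds of the form en log n − Θ(n) on the previous slides.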

20 20/59 LIMIT GUMBEL DISTRIBUTION P( X_n/(en) − log(n/2) − φ_1(1/2) ≤ x ) → e^{−e^{−x}}. [Plots: empirical distribution for finite n (left) vs. the Gumbel density e^{−x−e^{−x}} (right).]

21 21/59 OUR APPROACH RECURRENCE ASYMPTOTICS

22 22/59 THE RANDOM VARIABLE X_{n,m} X_n := Σ_{0≤m≤n} C(n,m) p^m q^{n−m} X_{n,m} (a binomial mixture), where X_{n,m} := # steps used by the (1+1)-EA to reach f(x) = n when starting from f(x) = n − m. Let Q_{n,m}(t) := E( t^{X_{n,m}} ). Then Q_{n,0}(t) = 1 and Q_{n,m}(t) = t Σ_{1≤l≤m} λ_{n,n−m,l} Q_{n,m−l}(t) / ( 1 − ( 1 − Σ_{1≤l≤m} λ_{n,n−m,l} ) t ) for 1 ≤ m ≤ n, where λ_{n,m,l} := P(m 1s → m + l 1s) = (1 − 1/n)^n Σ_{0≤j≤min{m, n−m−l}} C(m, j) C(n−m, j+l) (n−1)^{−l−2j}.

23 23/59 λ_{n,m,l} := P(m 1s → m + l 1s) = Σ_{0≤j≤min{m, n−m−l}} [ C(m, j) (1/n)^j (1 − 1/n)^{m−j} ] · [ C(n−m, j+l) (1/n)^{j+l} (1 − 1/n)^{n−m−j−l} ]: flip j of the m 1s down and j + l of the n − m 0s up. [Diagram: the chain of states X_{n,m} (n − m 1s) → X_{n,m−1} (n − m + 1 1s) → ⋯ → X_{n,0} (n 1s, the optimum state), with upward transition probabilities λ_{n,m,1}, λ_{n,m,2}, ..., λ_{n,m,m}.]
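This transition probability is easy to sanity-check numerically. A sketch (my own check, assuming mutation rate p = 1/n): the exact sum over j is compared against a Monte-Carlo estimate using the two independent binomial flip counts — down-flips ~ Bin(m, 1/n) among the 1s and up-flips ~ Bin(n − m, 1/n) among the 0s:

```python
import math, random

def lam(n, m, l):
    """P(# of 1s goes from m to m + l in one step, mutation rate 1/n):
    flip j of the m 1s and j + l of the n - m 0s, summed over feasible j."""
    return (1 - 1/n) ** n * sum(
        math.comb(m, j) * math.comb(n - m, j + l) * (n - 1) ** (-(l + 2*j))
        for j in range(min(m, n - m - l) + 1))

rng = random.Random(0)
n, m, l, N = 10, 4, 2, 200000
hits = 0
for _ in range(N):
    down = sum(rng.random() < 1/n for _ in range(m))      # 1s flipped to 0
    up = sum(rng.random() < 1/n for _ in range(n - m))    # 0s flipped to 1
    hits += (up - down == l)
print(lam(n, m, l), hits / N)   # the two estimates agree closely
```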

24 24/59 Q: How to solve this recurrence? Q_{n,m}(t) = t Σ_{1≤l≤m} λ_{n,n−m,l} Q_{n,m−l}(t) / ( 1 − ( 1 − Σ_{1≤l≤m} λ_{n,n−m,l} ) t ). First: the case m = O(1).

25 25/59 SIMPLEST CASE: m = 1 (n − 1 1s → n 1s): λ_{n,n−1,1} = (1/n)(1 − 1/n)^{n−1}, so X_{n,1} = Geometric( (1/n)(1 − 1/n)^{n−1} ) and Q_{n,1}(t) = (1/n)(1 − 1/n)^{n−1} t / ( 1 − ( 1 − (1/n)(1 − 1/n)^{n−1} ) t ). ⇒ X_{n,1}/(en) →d Exp(1): P( X_{n,1}/(en) ≤ x ) → 1 − e^{−x}.
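The geometric mean 1/λ_{n,n−1,1} can be tabulated directly; a small sketch (names mine) showing the ratio to en tending to 1:

```python
import math

def mean_steps_last_level(n):
    # X_{n,1} is Geometric with success probability (1/n)(1 - 1/n)^(n-1):
    # flip the single remaining 0 and leave all n - 1 ones unchanged
    return 1.0 / ((1/n) * (1 - 1/n) ** (n - 1))

for n in (10, 100, 1000):
    print(n, mean_steps_last_level(n) / (math.e * n))   # ratio tends to 1
```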

26 26/59 EXPONENTIAL DISTRIBUTION

27 27/59 BY INDUCTION For m = O(1): X_{n,m}/(en) →d Exp(1) + ⋯ + Exp(m) (Exp(r) with rate r), so P( X_{n,m}/(en) ≤ x ) → (1 − e^{−x})^m, E(X_{n,m}) ∼ e H_m n, and V(X_{n,m}) ∼ e² H_m^{(2)} n². [Plots: m = 2, 4, 6, 8 and n = 5, ..., 50.] An LLT (local limit theorem) also holds.
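The limit law Exp(1) + ⋯ + Exp(m) is cheap to simulate; its mean is H_m (and its variance H_m^{(2)}). A sketch (names mine, seeded for reproducibility):

```python
import random

def limit_sample(m, rng):
    # Exp(r) has rate r (mean 1/r); the limit is the sum of independent copies
    return sum(rng.expovariate(r) for r in range(1, m + 1))

rng = random.Random(42)
m, N = 6, 100000
mean = sum(limit_sample(m, rng) for _ in range(N)) / N
H_m = sum(1/r for r in range(1, m + 1))
print(mean, H_m)   # empirical mean vs H_6 = 49/20
```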

28 28/59 m = 2; n = 5,..., 50

29 29/59 SKETCH OF PROOF λ_{n,n−m,l} = (1 − 1/n)^n Σ_{0≤j≤min{n−m, m−l}} C(n−m, j) C(m, j+l) (n−1)^{−l−2j} = C(m, l) e^{−1} n^{−l} ( 1 + O( m/(n(l+1)) + l/n ) ). For m = O(1), the term l = 1 is dominant.

30 29/59 SKETCH OF PROOF (cont.) Keeping only the dominant term in the recurrence gives Q_{n,m}(t) ≈ ( (m/(en)) t / ( 1 − (1 − m/(en)) t ) ) Q_{n,m−1}(t), a geometric factor with success probability ≈ m/(en).

31 30/59 SKETCH OF PROOF: BY INDUCTION Iterating: Q_{n,m}(t) ≈ Π_{1≤r≤m} ( (r/(en)) t / ( 1 − (1 − r/(en)) t ) ).

32 30/59 SKETCH OF PROOF: BY INDUCTION (cont.) Hence Q_{n,m}( e^{−s/(en)} ) → Π_{1≤r≤m} 1/(1 + s/r), the Laplace transform of Σ_{1≤r≤m} Exp(r). The argument fails when m → ∞.

33 30/59 SKETCH OF PROOF: BY INDUCTION (cont.) Let Y_m := Σ_{1≤r≤m} Exp(r). Then as m → ∞, E( e^{(Y_m − H_m) iθ} ) = Π_{1≤r≤m} e^{−iθ/r}/(1 − iθ/r) → e^{−γiθ} Γ(1 − iθ).

34 30/59 SKETCH OF PROOF: BY INDUCTION (cont.) Consequently P( Y_m − log m ≤ x ) → e^{−e^{−x}} (x ∈ R), the Gumbel distribution.
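This Gumbel limit can be made concrete via a classical identity not stated on the slide: by Rényi's representation, Σ_{1≤r≤m} Exp(r) has the same law as the maximum of m i.i.d. Exp(1) variables, so the CDF of Y_m − log m is exactly (1 − e^{−x−log m})^m and the limit is a one-liner:

```python
import math

def cdf_Ym_shifted(m, x):
    # P(Y_m - log m <= x), using Y_m =d max of m i.i.d. Exp(1) (Renyi)
    t = x + math.log(m)
    return (1 - math.exp(-t)) ** m if t > 0 else 0.0

for x in (-1.0, 0.0, 1.0, 2.0):
    # exact finite-m CDF vs the Gumbel limit exp(-exp(-x))
    print(x, cdf_Ym_shifted(10**6, x), math.exp(-math.exp(-x)))
```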

35 31/59 [figure]

36 32/59 EXPECTED VALUES Let µ_{n,m} := E(X_{n,m}) = Q'_{n,m}(1) (µ_{n,0} = 0). Then µ_{n,m} = ( 1 + Σ_{1≤l≤m} λ_{n,n−m,l} µ_{n,m−l} ) / Σ_{1≤l≤m} λ_{n,n−m,l} for 1 ≤ m ≤ n, and E(X_n) = 2^{−n} Σ_{0≤m≤n} C(n,m) µ_{n,m}. Let e_n := (1 − 1/(n+1))^{n+1} and µ̃_{n,m} := (e_n/n) µ_{n+1,m}; then Σ_{1≤l≤m} λ̃_{n,m,l} ( µ̃_{n,m} − µ̃_{n,m−l} ) = 1/n, where λ̃_{n,m,l} := λ_{n+1,n+1−m,l}/e_n = Σ_{0≤j≤min{n+1−m, m−l}} C(n+1−m, j) C(m, j+l) n^{−l−2j}.
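The recurrence for the unnormalized means can be solved bottom-up in a few lines; a sketch (function names mine), reproducing the leading behavior E(X_n) ≈ en log n:

```python
import math

def lam(n, m_ones, l):
    # P(# of 1s goes from m_ones to m_ones + l), mutation rate 1/n
    return (1 - 1/n) ** n * sum(
        math.comb(m_ones, j) * math.comb(n - m_ones, j + l)
        * (n - 1) ** (-(l + 2*j))
        for j in range(min(m_ones, n - m_ones - l) + 1))

def mu_table(n):
    # mu[m] = E(X_{n,m}), via mu_m = (1 + sum_l lam * mu_{m-l}) / sum_l lam
    mu = [0.0] * (n + 1)
    for m in range(1, n + 1):
        lams = [lam(n, n - m, l) for l in range(1, m + 1)]
        mu[m] = (1 + sum(lams[l - 1] * mu[m - l]
                         for l in range(1, m + 1))) / sum(lams)
    return mu

n = 40
mu = mu_table(n)
EXn = sum(math.comb(n, m) * mu[m] for m in range(n + 1)) / 2 ** n
print(EXn, math.e * n * math.log(n))   # exact mean vs leading term en log n
```

The gap between the two printed values is of order n, matching the c_1 n correction term.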

37 33/59 From µ̃_{n,1} = 1 and Σ_{1≤l≤m} λ̃_{n,m,l} ( µ̃_{n,m} − µ̃_{n,m−l} ) = 1/n: µ̃_{n,2} = (3n² + n − 1)/(2n² + 2n − 1), µ̃_{n,3} = (22n⁶ + 40n⁵ − 19n⁴ − 42n³ + ⋯)/( (2n² + 2n − 1)(6n⁴ + ⋯ − 7n² − 9n + 6) ).

38 33/59 Similarly, µ̃_{n,4} and µ̃_{n,5} are explicit but increasingly unwieldy rational functions of n, with numerators and denominators of high degree; exact computation quickly becomes impractical by hand.

39 34/59 ASYMPTOTICS OF µ̃_{n,m} Expanding the exact rational expressions: µ̃_{n,1} = 1, µ̃_{n,2} = 3/2 + O(1/n), µ̃_{n,3} = 11/6 + O(1/n), µ̃_{n,4} = 25/12 + O(1/n), µ̃_{n,5} = 137/60 + O(1/n), µ̃_{n,6} = 49/20 + O(1/n).

40 34/59 ASYMPTOTICS OF µ̃_{n,m} (cont.) The leading constants 1, 3/2, 11/6, 25/12, 137/60, 49/20, ... are precisely the harmonic numbers H_m = Σ_{1≤j≤m} 1/j.

41 35/59 HEURISTICS An Ansatz approximation: µ̃_{n,m} ≈ Σ_{k≥0} d_k(m) n^{−k}, with d_0(m) = H_m (m ≥ 0), d_1(m) = H_m − m (m ≥ 1), d_2(m) = (2/3)H_m + (a quadratic polynomial in m) (m ≥ 2), d_3(m) = (1/2)H_m + (a cubic polynomial in m) (m ≥ 2), d_4(m) = (5/18)H_m + (a quartic polynomial in m) (m ≥ 4).

42 35/59 HEURISTICS (cont.) Complication: the expression for d_k(m) holds only for m ≥ 2k − 2.

43 35/59 HEURISTICS (cont.) General pattern: µ̃_{n,m} ≈ Σ_{k≥0} n^{−k} ( b_k H_m + Σ_{0≤j≤k} ϖ_{k,j} m^j ).

44 35/59 HEURISTICS (cont.) With α := m/n, this suggests µ̃_{n,m} ≈ H_m + φ(α) for 1 ≤ m ≤ n.

45 36/59 A MORE GENERAL ANSATZ µ̃_{n,m} ≈ H_m + φ_1(α) + ( b_1 H_m + φ_2(α) )/n + ( b_2 H_m + φ_3(α) )/n² + ⋯ [Plots: µ̃_{n,m} compared with the successive approximations H_m, then H_m + φ_1(α), then H_m + φ_1(α) + ( b_1 H_m + φ_2(α) )/n.]

46 37/59 φ_1(z) := ∫_0^z ( 1/S_1(t) − 1/t ) dt; φ_2(z) is given by a similar but longer integral involving S_0, S_1, S_2 and their derivatives; here S_r(z) := Σ_{l≥1} (z^l/l!) Σ_{0≤j<l} (l − j)^r (1 − z)^j/j! (analytic in |z| ≤ 1). [Plots: S_1(x) and S_2(x); φ_1(x); φ_2(x).]

47 38/59 A GENERATING FUNCTION APPROACH? With λ̃_{n,m,l} = Σ_{0≤j≤min{n+1−m, m−l}} C(n+1−m, j) C(m, j+l) n^{−l−2j} and f_n(z) := Σ_{m≥1} µ̃_{n,m} z^m, the recurrence Σ_{0≤l<m} λ̃_{n,m,m−l} ( µ̃_{n,m} − µ̃_{n,l} ) = 1/n translates, by Cauchy's formula, into a contour-integral equation for f_n whose kernel is built from powers of (1 + t/n), with right-hand side z/( n(1 − z) ).

48 39/59 A HEURISTIC Assume f_n(w) ≈ φ(w) in the contour equation. Collecting residues, the left-hand side becomes (z/n)( (1 − z)φ'(z) − φ(z) ), and matching with z/( n(1 − z) ) gives the ODE (1 − z)φ'(z) − φ(z) = 1/(1 − z), φ(0) = 0, whose solution φ(z) = (1/(1 − z)) log( 1/(1 − z) ) is exactly the generating function of H_m — consistent with µ̃_{n,m} ≈ H_m.

49 40/59 HOW TO GUESS φ_1(α)? Assume µ̃_{n,m} ≈ H_m + φ(α) (α := m/n). Then H_m − H_{m−l} = l/m + l(l−1)/(2m²) + ⋯ and φ(m/n) − φ((m−l)/n) = φ'(α) l/n + O( l²/n² ). Substituting into Σ_{1≤l≤m} λ̃_{n,m,l} ( µ̃_{n,m} − µ̃_{n,m−l} ) = 1/n gives, by matched asymptotics, 1/n ≈ ( 1/(nα) + φ'(α)/n ) Σ_{1≤l≤m} l λ̃_{n,m,l}.

50 41/59 Since Σ_{1≤l≤m} l λ̃_{n,m,l} = S_1(α) + O(1/n), the matched equation 1 ≈ ( 1/α + φ'(α) ) S_1(α) forces φ'(x) = 1/S_1(x) − 1/x, φ(0) = 0, i.e. φ = φ_1. The justification relies on a careful error analysis.
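The key approximation Σ l λ ≈ S_1(α) is easy to verify numerically. A sketch (my own check): here λ is taken in the ones-count parametrization of slide 23, so the normalizing factor (1 − 1/n)^n ≈ e^{−1} — divided out in the tilde normalization above — appears explicitly on the right-hand side:

```python
import math

def lam(n, m_ones, l):
    # P(# of 1s goes from m_ones to m_ones + l), mutation rate 1/n
    return (1 - 1/n) ** n * sum(
        math.comb(m_ones, j) * math.comb(n - m_ones, j + l)
        * (n - 1) ** (-(l + 2*j))
        for j in range(min(m_ones, n - m_ones - l) + 1))

def S1(z, L=25):
    return sum(z**l / math.factorial(l)
               * sum((l - j) * (1 - z)**j / math.factorial(j) for j in range(l))
               for l in range(1, L + 1))

n, m = 200, 60                            # m zeros, alpha = m/n = 0.3
drift = sum(l * lam(n, n - m, l) for l in range(1, m + 1))
print(drift, S1(0.3) * (1 - 1/n) ** n)    # both close to e^{-1} S_1(alpha)
```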

51 42/59 TOOLS NEEDED Lemma 1 (asymptotics of A_{n,m} := Σ_{1≤l≤m} a_l λ̃_{n,m,l}). Assume A(z) := Σ_{l≥1} a_l z^{l−1} has a nonzero radius of convergence. Then A_{n,m} = Ã_0(α) + Ã_1(α)/(2n) + O( n^{−2} ), where Ã_0(α) := Σ_{l≥1} (α^l/l!) Σ_{0≤j<l} a_{l−j} (1 − α)^j/j! and Ã_1(α) is an explicit (longer) sum of the same type. The proof starts from a Cauchy-integral representation A_{n,m} = (1/(2πi)) ∮_{|z|=c} A(z) (⋯)^m ( 1 + z/n )^{n+1−m} dz.

52 43/59 TOOLS NEEDED Lemma 2 (asymptotic transfer). Suppose Σ_{1≤l≤m} λ̃_{n,m,l} ( a_{n,m} − a_{n,m−l} ) = b_{n,m}. If b_{n,m} ∼ c/n uniformly for 1 ≤ m ≤ n, where c > 0, then a_{n,m} ∼ c H_m (1 ≤ m ≤ n); in particular µ̃_{n,m} ∼ H_m. Idea: Λ_{n,m} := Σ_{1≤l≤m} λ̃_{n,m,l} ≈ m/n for small α, so a_{n,m} ≈ b_{n,m}/Λ_{n,m} + a_{n,m−1} ≈ (c/n)(n/m) + c H_{m−1} = c H_m. Useful for error analysis.

53 44/59 TOOLS NEEDED Lemma 3. If φ ∈ C²[0,1], then Σ_{1≤l≤m} λ̃_{n,m,l} ( φ(m/n) − φ((m−l)/n) ) = ( φ'(α)/n ) Σ_{1≤l≤m} l λ̃_{n,m,l} + O( n^{−2} ) = φ'(α) S_1(α)/n + O( n^{−2} ), uniformly for 1 ≤ m ≤ n. Bootstrapping and induction then yield µ̃_{n,m} ≈ Σ_{k≥0} ( b_k H_m + φ_{k+1}(α) ) n^{−k}.

54 45/59 INITIAL BITS ARE BERNOULLI(p): E(X_n) = Σ_{0≤m≤n} C(n,m) p^m q^{n−m} µ_{n,m}, and E(X_n)/(en) = log(qn) + γ + φ_1(q) + (1/(2n)) ( log(qn) + γ + 3φ_1(q) + 2qφ_1'(q) + pqφ_1''(q) + 2φ_2(q) ) + O( n^{−2} log n ). For p = 1/2: E(X_n)/(en) = log n − log 2 + γ + φ_1(1/2) + (1/(2n)) ( log n − log 2 + γ + 3φ_1(1/2) + φ_1'(1/2) + (1/4)φ_1''(1/2) + 2φ_2(1/2) ) + ⋯

55 46/59 VARIANCE OF X_{n,m} Uniformly for 1 ≤ m ≤ n, V(X_{n,m}) = e² H_m^{(2)} n² − e(2e + 1)( n + 1/2 ) H_m + ψ_1(α) n + ψ_2(α) + O( n^{−1} H_m ); the two dominating terms are independent of p. Hence V(X_n) = ( e² π²/6 ) n² − e(2e + 1) n log n + c_1' n + c_2' + O( n^{−1} log n ), where ψ_1(α) = ∫_0^α ( S_2(x)/S_1(x)³ − 1/x² ) dx and ψ_2(α) is a similar but longer integral involving S_0, ..., S_3.

56 47/59 Let Ṽ_{n,m} := ( e_n²/n² ) ( V(X_{n+1,m}) + E(X_{n+1,m}) ); then Ṽ_{n,m} = H_m^{(2)} + Σ_{1≤k<K} ( r_k H_m + s_k H_m^{(2)} + t_k ) n^{−k} + O( H_m n^{−K} ). [Plots: Ṽ_{n,m} and Ṽ_{n,m} − H_m^{(2)}, for K = 2 and K = 3.]

57 48/59 LIMIT GUMBEL DISTRIBUTION OF X_{n,m} Recall P( Σ_{1≤r≤m} Exp(r) − log m ≤ x ) → e^{−e^{−x}}. If m → ∞ with n and m ≤ n, then P( X_{n,m}/(en) − log m − φ_1(m/n) ≤ x ) → e^{−e^{−x}}. By induction, E( e^{−X_{n,m} s/(en)} ) e^{( H_m + φ_1(m/n) ) s} = ( 1 + O( H_m/n ) ) Π_{1≤r≤m} e^{s/r}/( 1 + s/r ), uniformly for 1 ≤ m ≤ n (the proof is long and messy).

58 49/59 Since X_n = Σ_{0≤m≤n} C(n,m) p^m q^{n−m} X_{n,m} (a binomial mixture), P( X_n/(en) − log(qn) − φ_1(q) ≤ x ) → e^{−e^{−x}} (x ∈ R); in particular, for p = 1/2, P( X_n/(en) − log(n/2) − φ_1(1/2) ≤ x ) → e^{−e^{−x}}. Convergence rates can also be quantified.

59 50/59 MAIN STEPS Define F_{n,m}(s) := E( e^{−X_{n,m} s/(en)} ) e^{( H_m + φ(m/n) ) s} / Π_{1≤r≤m} ( e^{s/r}/(1 + s/r) ), i.e. Q_{n,m}( e^{−s/(en)} ) normalized by the conjectured limit. The recurrence for Q_{n,m} translates into one for F_{n,m}, involving the factors λ_{n,n−m,l} e^{( φ(m/n) − φ((m−l)/n) ) s}, and one proves |F_{n,m}(s) − 1| ≤ C n^{−1} H_m when φ = φ_1.

60 51/59 AN AUXILIARY FUNCTION G_{n,m}(s) collects the one-step factors of the recurrence: G_{n,m}(s) := Σ_{1≤l≤m} λ_{n,n−m,l} e^{( φ(m/n) − φ((m−l)/n) ) s} ⋯ , suitably normalized. If φ ∈ C²[0,1], then G_{n,m}(s) = 1 − (s/m) ( ( 1 + αφ'(α) ) S_1(α)/α − 1 ) + O( 1/(mn) ); the linear term vanishes iff φ'(α) = 1/S_1(α) − 1/α, and indeed if φ = φ_1 then G_{n,m}(s) = 1 + O( (mn)^{−1} ).

61 52/59 A SUMMARY OF THE APPROACHES Recurrence Ansatz Error analysis

62 53/59 (1+1)-EA FOR LEADINGONES f(x) = Σ_{1≤k≤n} Π_{1≤j≤k} x_j; Y_n := optimization time. Rudolph (1997): introduced LEADINGONES. Droste et al. (2002): E(Y_n) = Θ(n²). Ladret (2005): CLT, with mean ∼ c_1 n² and variance ∼ c_2 n³. Böttcher et al. (2010): re-derived the mean. Many other papers. Here: Ladret's results are proved by a direct analytic approach.

63 54/59 TIME TO OPTIMUM STATE UNDER LEADINGONES Y_n starts from n random bits (each 1 with probability 1/2), so E( e^{−Y_n s} ) = 2^{−n} + Σ_{1≤m≤n} 2^{m−n−1} Q_{n,m}(s), where the conditional moment generating function Q_{n,m}(s) (initial fitness n − m) satisfies ( 1 − (1 − pq^{n−m}) e^{−s} ) Q_{n,m}(s) = pq^{n−m} e^{−s} ( 2^{1−m} + Σ_{1≤l<m} 2^{l−m} Q_{n,l}(s) ), for 1 ≤ m ≤ n, where q = 1 − p.

64 55/59 CLOSED-FORM SOLUTION FOR Q_{n,m} With t = e^{−s}: Q_{n,m}(s) = ( pq^{n−m} t / ( 1 − (1 − pq^{n−m}) t ) ) Π_{1≤j<m} R_j(t)/2, where R_j(t) := ( 1 − (1 − 2pq^{n−j}) t ) / ( 1 − (1 − pq^{n−j}) t ). Equivalently Y_{n,m} =d Z_{n,m}^{[0]} + Z_{n,m}^{[1]} + ⋯ + Z_{n,m}^{[m−1]} (independent), where Z^{[0]} is Geometric( pq^{n−m} ), so E( t^{Z^{[0]}} ) = pq^{n−m} t / ( 1 − (1 − pq^{n−m}) t ), and E( t^{Z^{[j]}} ) = R_j(t)/2 for j = 1, ..., m − 1 (each intermediate fitness level is visited with probability 1/2).

65 56/59 ( Y_n − E(Y_n) ) / √V(Y_n) →d N(0,1). Exact mean: E(Y_{n,m}) = q^{m−n}/p + ( 1 − q^{m−1} ) q^{1−n}/(2p²); averaging over the random initial string, E(Y_n) = ( q^{1−n} − q )/(2p²). For p = c/n: E(Y_n) = ( (e^c − 1)/(2c²) ) n² + O(n), and V(Y_n) = ( (e^{2c} − 1)/(8c³) ) n³ + O(n²); for c = 1 the leading constants are (e − 1)/2 and (e² − 1)/8.
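The exact mean E(Y_n) = (q^{1−n} − q)/(2p²) can be compared with a direct simulation of the (1+1)-EA on LEADINGONES; a sketch (names mine), with p = 1/n, for which E(Y_n) ≈ ((e − 1)/2) n²:

```python
import random

def leading_ones(x):
    # LEADINGONES fitness: length of the initial run of 1s
    k = 0
    for b in x:
        if b == 0:
            break
        k += 1
    return k

def run(n, rng, p=None):
    p = 1.0 / n if p is None else p
    x = [rng.randint(0, 1) for _ in range(n)]
    fx, steps = leading_ones(x), 0
    while fx < n:
        y = [b ^ (rng.random() < p) for b in x]
        fy = leading_ones(y)
        if fy >= fx:
            x, fx = y, fy
        steps += 1
    return steps

n = 20
p, q = 1 / n, 1 - 1 / n
exact = (q ** (1 - n) - q) / (2 * p * p)        # closed-form mean
rng = random.Random(3)
sim = sum(run(n, rng) for _ in range(300)) / 300
print(exact, sim)                               # the two values agree closely
```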

66 57/59 Summary: ONEMAX (X_n) vs LEADINGONES (Y_n). Mean: en log n + c_1 n + ⋯ vs ( (e − 1)/2 ) n² + ⋯. Variance: (π²/6)(en)² − (2e + 1) en log n + ⋯ vs ( (e² − 1)/8 ) n³ + ⋯. Limit law: Gumbel, P( X_n/(en) − log(n/2) − φ_1(1/2) ≤ x ) → e^{−e^{−x}}, vs Gaussian, P( ( Y_n − ((e − 1)/2) n² ) / √( ((e² − 1)/8) n³ ) ≤ x ) → (1/√(2π)) ∫_{−∞}^x e^{−t²/2} dt. Approach: Ansatz & error analysis vs analytic combinatorics.

67 58/59 THANK YOU!


When Is an Estimation of Distribution Algorithm Better than an Evolutionary Algorithm? When Is an Estimation of Distribution Algorithm Better than an Evolutionary Algorithm? Tianshi Chen, Per Kristian Lehre, Ke Tang, and Xin Yao Abstract Despite the wide-spread popularity of estimation of

More information

Intelligens Számítási Módszerek Genetikus algoritmusok, gradiens mentes optimálási módszerek

Intelligens Számítási Módszerek Genetikus algoritmusok, gradiens mentes optimálási módszerek Intelligens Számítási Módszerek Genetikus algoritmusok, gradiens mentes optimálási módszerek 2005/2006. tanév, II. félév Dr. Kovács Szilveszter E-mail: szkovacs@iit.uni-miskolc.hu Informatikai Intézet

More information

Pure Strategy or Mixed Strategy?

Pure Strategy or Mixed Strategy? Pure Strategy or Mixed Strategy? Jun He, Feidun He, Hongbin Dong arxiv:257v4 [csne] 4 Apr 204 Abstract Mixed strategy evolutionary algorithms EAs) aim at integrating several mutation operators into a single

More information

Black-Box Search by Unbiased Variation

Black-Box Search by Unbiased Variation Electronic Colloquium on Computational Complexity, Revision 1 of Report No. 102 (2010) Black-Box Search by Unbiased Variation Per Kristian Lehre and Carsten Witt Technical University of Denmark Kgs. Lyngby,

More information

Probabilistic Graphical Models

Probabilistic Graphical Models Probabilistic Graphical Models Brown University CSCI 2950-P, Spring 2013 Prof. Erik Sudderth Lecture 12: Gaussian Belief Propagation, State Space Models and Kalman Filters Guest Kalman Filter Lecture by

More information

Metaheuristics and Local Search

Metaheuristics and Local Search Metaheuristics and Local Search 8000 Discrete optimization problems Variables x 1,..., x n. Variable domains D 1,..., D n, with D j Z. Constraints C 1,..., C m, with C i D 1 D n. Objective function f :

More information

Local Search (Greedy Descent): Maintain an assignment of a value to each variable. Repeat:

Local Search (Greedy Descent): Maintain an assignment of a value to each variable. Repeat: Local Search Local Search (Greedy Descent): Maintain an assignment of a value to each variable. Repeat: I I Select a variable to change Select a new value for that variable Until a satisfying assignment

More information

Erdős-Renyi random graphs basics

Erdős-Renyi random graphs basics Erdős-Renyi random graphs basics Nathanaël Berestycki U.B.C. - class on percolation We take n vertices and a number p = p(n) with < p < 1. Let G(n, p(n)) be the graph such that there is an edge between

More information

Search. Search is a key component of intelligent problem solving. Get closer to the goal if time is not enough

Search. Search is a key component of intelligent problem solving. Get closer to the goal if time is not enough Search Search is a key component of intelligent problem solving Search can be used to Find a desired goal if time allows Get closer to the goal if time is not enough section 11 page 1 The size of the search

More information

Evolutionary Algorithms: Introduction. Department of Cybernetics, CTU Prague.

Evolutionary Algorithms: Introduction. Department of Cybernetics, CTU Prague. Evolutionary Algorithms: duction Jiří Kubaĺık Department of Cybernetics, CTU Prague http://cw.felk.cvut.cz/doku.php/courses/a4m33bia/start pcontents 1. duction to Evolutionary Algorithms (EAs) Pioneers

More information

IEOR 3106: Introduction to Operations Research: Stochastic Models. Fall 2011, Professor Whitt. Class Lecture Notes: Thursday, September 15.

IEOR 3106: Introduction to Operations Research: Stochastic Models. Fall 2011, Professor Whitt. Class Lecture Notes: Thursday, September 15. IEOR 3106: Introduction to Operations Research: Stochastic Models Fall 2011, Professor Whitt Class Lecture Notes: Thursday, September 15. Random Variables, Conditional Expectation and Transforms 1. Random

More information

Metaheuristics and Local Search. Discrete optimization problems. Solution approaches

Metaheuristics and Local Search. Discrete optimization problems. Solution approaches Discrete Mathematics for Bioinformatics WS 07/08, G. W. Klau, 31. Januar 2008, 11:55 1 Metaheuristics and Local Search Discrete optimization problems Variables x 1,...,x n. Variable domains D 1,...,D n,

More information

MATH 56A: STOCHASTIC PROCESSES CHAPTER 2

MATH 56A: STOCHASTIC PROCESSES CHAPTER 2 MATH 56A: STOCHASTIC PROCESSES CHAPTER 2 2. Countable Markov Chains I started Chapter 2 which talks about Markov chains with a countably infinite number of states. I did my favorite example which is on

More information

Evolutionary computation

Evolutionary computation Evolutionary computation Andrea Roli andrea.roli@unibo.it DEIS Alma Mater Studiorum Università di Bologna Evolutionary computation p. 1 Evolutionary Computation Evolutionary computation p. 2 Evolutionary

More information

Average Drift Analysis and Population Scalability

Average Drift Analysis and Population Scalability Average Drift Analysis and Population Scalability Jun He and Xin Yao Abstract This paper aims to study how the population size affects the computation time of evolutionary algorithms in a rigorous way.

More information

REIHE COMPUTATIONAL INTELLIGENCE S O N D E R F O R S C H U N G S B E R E I C H 5 3 1

REIHE COMPUTATIONAL INTELLIGENCE S O N D E R F O R S C H U N G S B E R E I C H 5 3 1 U N I V E R S I T Ä T D O R T M U N D REIHE COMPUTATIONAL INTELLIGENCE S O N D E R F O R S C H U N G S B E R E I C H 5 3 1 Design und Management komplexer technischer Prozesse und Systeme mit Methoden

More information

Rigorous Analyses for the Combination of Ant Colony Optimization and Local Search

Rigorous Analyses for the Combination of Ant Colony Optimization and Local Search Rigorous Analyses for the Combination of Ant Colony Optimization and Local Search Frank Neumann 1, Dirk Sudholt 2, and Carsten Witt 2 1 Max-Planck-Institut für Informatik, 66123 Saarbrücken, Germany, fne@mpi-inf.mpg.de

More information

Runtime Analysis of Binary PSO

Runtime Analysis of Binary PSO Runtime Analysis of Binary PSO Dirk Sudholt Fakultät für Informatik, LS 2 Technische Universität Dortmund Dortmund, Germany Carsten Witt Fakultät für Informatik, LS 2 Technische Universität Dortmund Dortmund,

More information

Fundamentals of Genetic Algorithms

Fundamentals of Genetic Algorithms Fundamentals of Genetic Algorithms : AI Course Lecture 39 40, notes, slides www.myreaders.info/, RC Chakraborty, e-mail rcchak@gmail.com, June 01, 2010 www.myreaders.info/html/artificial_intelligence.html

More information

Probability and Estimation. Alan Moses

Probability and Estimation. Alan Moses Probability and Estimation Alan Moses Random variables and probability A random variable is like a variable in algebra (e.g., y=e x ), but where at least part of the variability is taken to be stochastic.

More information

FINAL EXAM: 3:30-5:30pm

FINAL EXAM: 3:30-5:30pm ECE 30: Probabilistic Methods in Electrical and Computer Engineering Spring 016 Instructor: Prof. A. R. Reibman FINAL EXAM: 3:30-5:30pm Spring 016, MWF 1:30-1:0pm (May 6, 016) This is a closed book exam.

More information

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION We will define local time for one-dimensional Brownian motion, and deduce some of its properties. We will then use the generalized Ray-Knight theorem proved in

More information

Evolutionary Computation

Evolutionary Computation Evolutionary Computation - Computational procedures patterned after biological evolution. - Search procedure that probabilistically applies search operators to set of points in the search space. - Lamarck

More information

Topics in Approximation Algorithms Solution for Homework 3

Topics in Approximation Algorithms Solution for Homework 3 Topics in Approximation Algorithms Solution for Homework 3 Problem 1 We show that any solution {U t } can be modified to satisfy U τ L τ as follows. Suppose U τ L τ, so there is a vertex v U τ but v L

More information

MAS113 Introduction to Probability and Statistics. Proofs of theorems

MAS113 Introduction to Probability and Statistics. Proofs of theorems MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a

More information

TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531

TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence

More information

Runtime Analyses for Using Fairness in Evolutionary Multi-Objective Optimization

Runtime Analyses for Using Fairness in Evolutionary Multi-Objective Optimization Runtime Analyses for Using Fairness in Evolutionary Multi-Objective Optimization Tobias Friedrich 1, Christian Horoba 2, and Frank Neumann 1 1 Max-Planck-Institut für Informatik, Saarbrücken, Germany 2

More information

Lecture 4: Two-point Sampling, Coupon Collector s problem

Lecture 4: Two-point Sampling, Coupon Collector s problem Randomized Algorithms Lecture 4: Two-point Sampling, Coupon Collector s problem Sotiris Nikoletseas Associate Professor CEID - ETY Course 2013-2014 Sotiris Nikoletseas, Associate Professor Randomized Algorithms

More information

A Runtime Analysis of Parallel Evolutionary Algorithms in Dynamic Optimization

A Runtime Analysis of Parallel Evolutionary Algorithms in Dynamic Optimization Algorithmica (2017) 78:641 659 DOI 10.1007/s00453-016-0262-4 A Runtime Analysis of Parallel Evolutionary Algorithms in Dynamic Optimization Andrei Lissovoi 1 Carsten Witt 2 Received: 30 September 2015

More information

Selected Exercises on Expectations and Some Probability Inequalities

Selected Exercises on Expectations and Some Probability Inequalities Selected Exercises on Expectations and Some Probability Inequalities # If E(X 2 ) = and E X a > 0, then P( X λa) ( λ) 2 a 2 for 0 < λ

More information

Asymptotic Statistics-III. Changliang Zou

Asymptotic Statistics-III. Changliang Zou Asymptotic Statistics-III Changliang Zou The multivariate central limit theorem Theorem (Multivariate CLT for iid case) Let X i be iid random p-vectors with mean µ and and covariance matrix Σ. Then n (

More information

Evolutionary Computation. DEIS-Cesena Alma Mater Studiorum Università di Bologna Cesena (Italia)

Evolutionary Computation. DEIS-Cesena Alma Mater Studiorum Università di Bologna Cesena (Italia) Evolutionary Computation DEIS-Cesena Alma Mater Studiorum Università di Bologna Cesena (Italia) andrea.roli@unibo.it Evolutionary Computation Inspiring principle: theory of natural selection Species face

More information

T. Liggett Mathematics 171 Final Exam June 8, 2011

T. Liggett Mathematics 171 Final Exam June 8, 2011 T. Liggett Mathematics 171 Final Exam June 8, 2011 1. The continuous time renewal chain X t has state space S = {0, 1, 2,...} and transition rates (i.e., Q matrix) given by q(n, n 1) = δ n and q(0, n)

More information

Stochastic Volatility and Correction to the Heat Equation

Stochastic Volatility and Correction to the Heat Equation Stochastic Volatility and Correction to the Heat Equation Jean-Pierre Fouque, George Papanicolaou and Ronnie Sircar Abstract. From a probabilist s point of view the Twentieth Century has been a century

More information

Some mathematical models from population genetics

Some mathematical models from population genetics Some mathematical models from population genetics 5: Muller s ratchet and the rate of adaptation Alison Etheridge University of Oxford joint work with Peter Pfaffelhuber (Vienna), Anton Wakolbinger (Frankfurt)

More information

Considering our result for the sum and product of analytic functions, this means that for (a 0, a 1,..., a N ) C N+1, the polynomial.

Considering our result for the sum and product of analytic functions, this means that for (a 0, a 1,..., a N ) C N+1, the polynomial. Lecture 3 Usual complex functions MATH-GA 245.00 Complex Variables Polynomials. Construction f : z z is analytic on all of C since its real and imaginary parts satisfy the Cauchy-Riemann relations and

More information

Approximate Counting via the Poisson-Laplace-Mellin Method (joint with Chung-Kuei Lee and Helmut Prodinger)

Approximate Counting via the Poisson-Laplace-Mellin Method (joint with Chung-Kuei Lee and Helmut Prodinger) Approximate Counting via the Poisson-Laplace-Mellin Method (joint with Chung-Kuei Lee and Helmut Prodinger) Michael Fuchs Department of Applied Mathematics National Chiao Tung University Hsinchu, Taiwan

More information

On the Usefulness of Infeasible Solutions in Evolutionary Search: A Theoretical Study

On the Usefulness of Infeasible Solutions in Evolutionary Search: A Theoretical Study On the Usefulness of Infeasible Solutions in Evolutionary Search: A Theoretical Study Yang Yu, and Zhi-Hua Zhou, Senior Member, IEEE National Key Laboratory for Novel Software Technology Nanjing University,

More information

Stat 451 Lecture Notes Monte Carlo Integration

Stat 451 Lecture Notes Monte Carlo Integration Stat 451 Lecture Notes 06 12 Monte Carlo Integration Ryan Martin UIC www.math.uic.edu/~rgmartin 1 Based on Chapter 6 in Givens & Hoeting, Chapter 23 in Lange, and Chapters 3 4 in Robert & Casella 2 Updated:

More information

Ant Colony Optimization: an introduction. Daniel Chivilikhin

Ant Colony Optimization: an introduction. Daniel Chivilikhin Ant Colony Optimization: an introduction Daniel Chivilikhin 03.04.2013 Outline 1. Biological inspiration of ACO 2. Solving NP-hard combinatorial problems 3. The ACO metaheuristic 4. ACO for the Traveling

More information

Lecture 22. Introduction to Genetic Algorithms

Lecture 22. Introduction to Genetic Algorithms Lecture 22 Introduction to Genetic Algorithms Thursday 14 November 2002 William H. Hsu, KSU http://www.kddresearch.org http://www.cis.ksu.edu/~bhsu Readings: Sections 9.1-9.4, Mitchell Chapter 1, Sections

More information

Runtime Analysis of (1+1) EA on Computing Unique Input Output Sequences

Runtime Analysis of (1+1) EA on Computing Unique Input Output Sequences 1 Runtime Analysis of (1+1) EA on Computing Unique Input Output Sequences Per Kristian Lehre and Xin Yao Abstract Computing unique input output (UIO) sequences is a fundamental and hard problem in conformance

More information

COPYRIGHTED MATERIAL CONTENTS. Preface Preface to the First Edition

COPYRIGHTED MATERIAL CONTENTS. Preface Preface to the First Edition Preface Preface to the First Edition xi xiii 1 Basic Probability Theory 1 1.1 Introduction 1 1.2 Sample Spaces and Events 3 1.3 The Axioms of Probability 7 1.4 Finite Sample Spaces and Combinatorics 15

More information

Usefulness of infeasible solutions in evolutionary search: an empirical and mathematical study

Usefulness of infeasible solutions in evolutionary search: an empirical and mathematical study Edith Cowan University Research Online ECU Publications 13 13 Usefulness of infeasible solutions in evolutionary search: an empirical and mathematical study Lyndon While Philip Hingston Edith Cowan University,

More information

Scaling Up. So far, we have considered methods that systematically explore the full search space, possibly using principled pruning (A* etc.).

Scaling Up. So far, we have considered methods that systematically explore the full search space, possibly using principled pruning (A* etc.). Local Search Scaling Up So far, we have considered methods that systematically explore the full search space, possibly using principled pruning (A* etc.). The current best such algorithms (RBFS / SMA*)

More information

Exercises and Answers to Chapter 1

Exercises and Answers to Chapter 1 Exercises and Answers to Chapter The continuous type of random variable X has the following density function: a x, if < x < a, f (x), otherwise. Answer the following questions. () Find a. () Obtain mean

More information

Gaussian EDA and Truncation Selection: Setting Limits for Sustainable Progress

Gaussian EDA and Truncation Selection: Setting Limits for Sustainable Progress Gaussian EDA and Truncation Selection: Setting Limits for Sustainable Progress Petr Pošík Czech Technical University, Faculty of Electrical Engineering, Department of Cybernetics Technická, 66 7 Prague

More information

Lecture 4: Sampling, Tail Inequalities

Lecture 4: Sampling, Tail Inequalities Lecture 4: Sampling, Tail Inequalities Variance and Covariance Moment and Deviation Concentration and Tail Inequalities Sampling and Estimation c Hung Q. Ngo (SUNY at Buffalo) CSE 694 A Fun Course 1 /

More information

1 Randomized complexity

1 Randomized complexity 80240233: Complexity of Computation Lecture 6 ITCS, Tsinghua Univesity, Fall 2007 23 October 2007 Instructor: Elad Verbin Notes by: Zhang Zhiqiang and Yu Wei 1 Randomized complexity So far our notion of

More information

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real

More information