Some mathematical models from population genetics


Some mathematical models from population genetics 5: Muller's ratchet and the rate of adaptation

Alison Etheridge, University of Oxford. Joint work with Peter Pfaffelhuber (Vienna), Anton Wakolbinger (Frankfurt), Charles Cuthbertson (Oxford) and Feng Yu (Bristol). New York, Sept. 07.

Muller's ratchet

Asexually reproducing population. All mutations deleterious. Chromosomes passed down as indivisible blocks. The number of deleterious mutations accumulated along any ancestral line can only increase. When everyone in the current best class has accumulated at least one additional bad mutation, the ratchet clicks. How many generations will it take for an asexually reproducing population to lose its best class?

Haigh's model

Wright-Fisher model: individuals in the (t+1)st generation select a parent at random from generation t. The probability that an individual which has accumulated k mutations is selected as the parent is proportional to its relative fitness (1−s)^k. The number of mutations carried by the offspring is then k + J, where J ∼ Poiss(λ) (independent).

Type frequencies: x(t) = (x_k(t))_{k=0,1,...}, where N x_k(t) = number of individuals carrying exactly k mutations at time t. Let H be an N_0-valued random variable with

P[H = k] ∝ (1−s)^k x_k(t),

and let J ∼ Poiss(λ) be independent of H. Let K_1, K_2, ..., K_N be independent copies of H + J. The random type frequencies in the next generation are

X_k(t+1) = (1/N) #{i : K_i = k}.
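
This sampling scheme is easy to simulate. The sketch below is our own illustration (the function name haigh_generation and its parameter values are not from the talk): each of N offspring picks a parent type H with the fitness-biased weights and adds an independent Poiss(λ) number of new mutations.

```python
import math
import random

rng = random.Random(1)

def sample_poisson(lam):
    # Poisson sampling by inversion of the CDF.
    u, j, p = rng.random(), 0, math.exp(-lam)
    cdf = p
    while u > cdf:
        j += 1
        p *= lam / j
        cdf += p
    return j

def haigh_generation(x, N, s, lam):
    """One generation of Haigh's model: each of N offspring samples a
    parent type H with P[H = k] proportional to (1-s)**k * x[k], then
    adds J ~ Poiss(lam) fresh deleterious mutations."""
    weights = [(1 - s) ** k * xk for k, xk in enumerate(x)]
    counts = {}
    for _ in range(N):
        H = rng.choices(range(len(x)), weights=weights, k=1)[0]
        K = H + sample_poisson(lam)
        counts[K] = counts.get(K, 0) + 1
    top = max(counts)
    return [counts.get(k, 0) / N for k in range(top + 1)]
```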

Infinite populations

As N → ∞, by the LLN,

x(t+1) = E_{x(t)}[X(t+1)].

Suppose x(t) ∼ Poiss(α). Then

P[H = k] ∝ (1−s)^k x_k = (1−s)^k α^k e^{−α} / k!,

so H ∼ Poiss(α(1−s)), J ∼ Poiss(λ), and hence H + J ∼ Poiss(α(1−s) + λ). Poisson weights are mapped to Poisson weights.

For every initial condition with x_0 > 0, the solution to the deterministic dynamics converges as t → ∞ to the stationary point π := Poiss(λ/s).
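
Within the Poisson family the claim can be checked directly on the parameter: since Poiss(α) is mapped to Poiss(α(1−s) + λ), the dynamics reduce to an affine recursion whose fixed point is α* = λ/s. A minimal check (our own code; the starting value 3.0 is arbitrary):

```python
def poisson_parameter_update(alpha, s, lam):
    # One generation of the deterministic dynamics, restricted to the
    # Poisson family: Poiss(alpha) is mapped to Poiss(alpha*(1-s) + lam).
    return alpha * (1 - s) + lam

s, lam = 0.1, 0.5
alpha = 3.0
for _ in range(300):
    alpha = poisson_parameter_update(alpha, s, lam)
# alpha converges geometrically (at rate 1 - s) to lam / s
```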

Back to finite populations

Write k* for the number of mutations carried by individuals in the fittest class. Then

Y := (Y_k)_{k=0,1,...} := (X_{k*+k})_{k=0,1,2,...}

forms a recurrent Markov chain. Condition on Y(t) = y(t). The size of the new best class, N y_0(t+1), is Binom(N, p_0(t)), with p_0(t) the probability of sampling a parent from the best class and not acquiring any additional mutations:

p_0(t) = (y_0(t) / W(t)) e^{−λ},   W(t) = Σ_{i=0}^∞ y_i(t) (1−s)^i.

The evolution of the best class is determined by W(t), the mean fitness in the population.
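
The click time itself can be observed in a direct simulation of Haigh's model (an illustrative sketch of ours, not from the talk; the function name click_time and the parameter values in the usage are arbitrary choices): run the Wright-Fisher update on the mutation counts and report the first generation at which the current best class dies out.

```python
import math
import random

rng = random.Random(2)

def sample_poisson(lam):
    # Poisson sampling by inversion of the CDF.
    u, j, p = rng.random(), 0, math.exp(-lam)
    cdf = p
    while u > cdf:
        j += 1
        p *= lam / j
        cdf += p
    return j

def click_time(N, s, lam, max_gen=20000):
    """First generation at which the best class is lost, starting from a
    mutation-free population; None if no click occurs before max_gen."""
    pop = [0] * N            # mutation counts of the N individuals
    best = 0
    for gen in range(1, max_gen + 1):
        weights = [(1 - s) ** k for k in pop]
        parents = rng.choices(pop, weights=weights, k=N)
        pop = [k + sample_poisson(lam) for k in parents]
        if min(pop) > best:  # everyone has left the old best class: a click
            return gen
        best = min(pop)      # the best class index can only increase
    return None
```

With N π_0 = N e^{−λ/s} well below one, as in click_time(100, 0.05, 0.3), the ratchet clicks within a handful of generations.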

Elements of Haigh's analysis

Immediately after a click, the type frequencies are

π̃ := (1/(1−π_0)) (π_1, π_2, ...).

Phase one: the deterministic dynamical system dominates, decaying exponentially fast towards its equilibrium. Phase two: the bulk of the population changes only slowly. The mean fitness is assumed constant, and the number of individuals in the best class is then approximated by a Galton-Watson branching process with a Poisson offspring distribution.

The Fleming-Viot diffusion

Relevant for applications: large N, small s, small λ. The dynamics are then captured by

dX_k = ( Σ_j s(j−k) X_j X_k + λ(X_{k−1} − X_k) ) dt + Σ_{j≠k} √((1/N) X_j X_k) dW_{jk},   k = 0, 1, 2, ...

where X_{−1} = 0 and (W_{jk})_{j>k} is an array of independent Brownian motions, with W_{kj} := −W_{jk}. As before, with Y_k = X_{k*+k},

dY_0 = (s M_1(Y) − λ) Y_0(t) dt + √((1/N) Y_0 (1−Y_0)) dW_0,   M_1(Y) = Σ_j j Y_j.

Infinite population limit

dx_k = ( (s(M_1(x) − k) − λ) x_k + λ x_{k−1} ) dt,   k = 0, 1, 2, ...

with x_{−1} = 0. Transform into a system of equations for the cumulants:

log Σ_{k=0}^∞ x_k e^{−ξk} = Σ_{k=1}^∞ κ_k (−ξ)^k / k!.

Assume x_0 > 0 and set κ_0 = −log x_0. Then

κ̇_k = −s κ_{k+1} + λ,   k = 0, 1, 2, ...

The system can be solved. In particular,

κ_1(t) = Σ_{k=0}^∞ k x_k(t) = −∂_ξ log Σ_{k=0}^∞ x_k(0) e^{−ξk} |_{ξ=st} + (λ/s)(1 − e^{−st}).

Notice the exponential decay towards the equilibrium value of λ/s. The time τ = log(λ/s)/s corresponds exactly to the end of Haigh's phase one.
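
For a Poiss(α) initial condition the formula reduces to κ_1(t) = α e^{−st} + (λ/s)(1 − e^{−st}), which can be checked against a direct Euler integration of the truncated deterministic system (our own sketch; the truncation level K and step size dt are arbitrary choices):

```python
import math

def mean_closed_form(alpha, s, lam, t):
    # kappa_1(t) for x(0) ~ Poiss(alpha): the log-generating function of
    # Poiss(alpha) gives -d/dxi log(...) at xi = s*t equal to alpha*exp(-s*t).
    return alpha * math.exp(-s * t) + (lam / s) * (1 - math.exp(-s * t))

def mean_by_integration(alpha, s, lam, t, K=60, dt=1e-3):
    # Euler scheme for dx_k = ((s*(M1 - k) - lam)*x_k + lam*x_{k-1}) dt,
    # truncated at k = K; returns the mean of the profile at time t.
    x = [math.exp(-alpha) * alpha ** k / math.factorial(k) for k in range(K + 1)]
    for _ in range(int(t / dt)):
        m1 = sum(k * xk for k, xk in enumerate(x))
        x = [xk + dt * ((s * (m1 - k) - lam) * xk + (lam * x[k - 1] if k > 0 else 0.0))
             for k, xk in enumerate(x)]
    return sum(k * xk for k, xk in enumerate(x))
```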

Approximations

dY_0 = (s M_1(Y) − λ) Y_0(t) dt + √((1/N) Y_0 (1−Y_0)) dW_0.

We cannot solve for M_1(Y). Instead we seek a good approximation of M_1 given Y_0. Simulations suggest a good fit to a linear relationship between Y_0 and M_1.

Extending Haigh's approach

Haigh assumes that at click times the mass π_0 is distributed evenly over the other classes. Suppose now that this holds in between click times too: given Y_0, approximate the state of the system by the PPA (Poisson Profile Approximation)

Π(Y_0) = ( Y_0, ((1 − Y_0)/(1 − π_0)) (π_1, π_2, ...) ).

Estimate M_1 not from the PPA but from the relaxed PPA, obtained by evolving the PPA according to the deterministic dynamical system for time Aτ := A log(λ/s)/s. This gives

M_1 = θ + (η/(e^η − 1)) (1 − Y_0/π_0),   η = (λ/s)^{1−A},

where θ := λ/s.

Three one-dimensional diffusions

Substituting into the one-dimensional diffusion approximation for Y_0 gives:

A small:   dY_0 = λ(π_0 − Y_0) Y_0 dt + √((1/N) Y_0) dW,
A = 1:     dY_0 = 0.58 s (1 − Y_0/π_0) Y_0 dt + √((1/N) Y_0) dW,
A large:   dY_0 = s (1 − Y_0/π_0) Y_0 dt + √((1/N) Y_0) dW.
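
These diffusions are one-dimensional and cheap to simulate. Here is an Euler-Maruyama sketch of the A = 1 case (our own code; the step size and the clamping of Y_0 to [0, 1] are pragmatic choices, not part of the analysis):

```python
import math
import random

rng = random.Random(3)

def simulate_A1(N, s, lam, T, dt=0.01):
    """Euler-Maruyama path of
    dY0 = 0.58 s (1 - Y0/pi0) Y0 dt + sqrt(Y0/N) dW,
    started at the equilibrium value pi0 = exp(-lam/s)."""
    pi0 = math.exp(-lam / s)
    y = pi0
    path = [y]
    for _ in range(round(T / dt)):
        drift = 0.58 * s * (1 - y / pi0) * y
        dW = rng.gauss(0.0, math.sqrt(dt))
        y += drift * dt + math.sqrt(max(y, 0.0) / N) * dW
        y = min(1.0, max(0.0, y))   # keep the frequency in [0, 1]
        path.append(y)
    return path
```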

A rescaling

Z(t) = (1/π_0) Y_0(N π_0 t). Set γ = Nλ / (Ns log(Nλ)).

A small:   dZ = (Nλ)^{1−2γ} (1−Z) Z dt + √Z dW,
A = 1:     dZ = (0.58/(γ log(Nλ))) (Nλ)^{1−γ} (1−Z) Z dt + √Z dW,
A large:   dZ = (1/(γ log(Nλ))) (Nλ)^{1−γ} (1−Z) Z dt + √Z dW.

Implications

A small ⇒ fast clicking:

dZ = (Nλ)^{1−2γ} (1−Z) Z dt + √Z dW.

The ratchet never clicks for γ < 1/2.

A = 1 (A large) ⇒ moderate clicking (slow clicking):

dZ = (0.58/(γ log(Nλ))) (Nλ)^{1−γ} (1−Z) Z dt + √Z dW

(with the constant 0.58 replaced by 1 for A large). For (0.58/(γ log(Nλ))) (Nλ)^{1−γ} to be > 5:

γ:    0.3   0.4    0.5      0.55     0.6      0.7      0.8       0.9
Nλ:   20    10^2   9·10^2   4·10^3   2·10^4   4·10^6   2·10^11   8·10^26
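
The table can be reproduced numerically: for each γ, solve for the value of Nλ at which the A = 1 drift coefficient reaches 5. The bisection sketch below is ours; it works on a log scale and relies on the coefficient staying below 5 until its final increasing stretch, which holds for these values of γ.

```python
import math

def coeff(x, gamma):
    # The A = 1 drift coefficient, written as a function of x = N * lam.
    return 0.58 * x ** (1 - gamma) / (gamma * math.log(x))

def threshold(gamma, target=5.0, lo=3.0, hi=1e30):
    # Log-scale bisection for the x at which coeff first exceeds target.
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if coeff(mid, gamma) < target:
            lo = mid
        else:
            hi = mid
    return hi
```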

Rule of thumb

For biologically realistic parameters, the transition from no clicks to moderate clicks (on an evolutionary timescale) happens around γ = 0.5. The rate of the ratchet is of order N^{γ−1} λ^γ for γ ∈ (1/2, 1), whereas it is exponentially slow in (Nλ)^{1−2γ} for γ < 1/2.

Simulations

[Figures: simulation results for the ratchet; not reproduced in this transcription.]

Purely beneficial mutations

We saw before that the probability of an isolated selective sweep is ≈ 2σ. What about overlapping sweeps? Interference reduces the chance of fixation as beneficial mutations compete with one another. Is there a limit to the rate of adaptation?

A mixture of mutations

All mutations have equal strength, so the ith individual's fitness is characterized by the net number, X_i, of beneficial mutations.

Mutation: for each individual i, a mutation event occurs at rate µ. With probability 1−q, X_i changes to X_i − 1, and with probability q, X_i changes to X_i + 1.

Selection: for each pair of individuals (i, j), at rate (σ/N)(X_i − X_j)^+, individual i replaces individual j.

Resampling: for each pair of individuals (i, j), at rate 1/N, individual i replaces individual j.

Technical modification: suppress mutations which would make the width of the population > L_N ≈ N^{1/4}.
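
The three mechanisms can be simulated directly. The sketch below (ours, not from the talk) samples the embedded jump chain of the particle system, i.e. which event happens next while ignoring the exponential waiting times, with the mutation, selection and resampling rates as given above; the width-truncation is omitted.

```python
import random

rng = random.Random(4)

def jump(X, mu, sigma, q):
    """One event of the particle system on the fitness values X (in place).
    Rates: mutation mu per individual; selection (sigma/N)(X_i - X_j)^+
    and resampling 1/N per ordered pair (i, j)."""
    N = len(X)
    mut = N * mu
    sel = [(i, j, sigma / N * max(X[i] - X[j], 0))
           for i in range(N) for j in range(N) if i != j]
    sel_total = sum(r for _, _, r in sel)
    res = N - 1              # N(N-1) ordered pairs, each at rate 1/N
    u = rng.random() * (mut + sel_total + res)
    if u < mut:
        i = rng.randrange(N)
        X[i] += 1 if rng.random() < q else -1
    elif u < mut + sel_total:
        u -= mut
        chosen = sel[-1]
        for item in sel:     # pick a pair proportionally to its rate
            u -= item[2]
            if u <= 0:
                chosen = item
                break
        i, j, _ = chosen
        X[j] = X[i]          # i replaces j
    else:
        i, j = rng.sample(range(N), 2)
        X[j] = X[i]
    return X
```

Computing the pairwise selection rates afresh makes each event O(N^2); fine for a demonstration, far from an optimized simulation.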

P_k(t): proportion of individuals with k mutations at time t. k_max (k_min): type of the fittest (least fit) individual.

dP_k = µ_k(P) dt + σ Σ_{l∈Z} (k−l) P_k P_l dt + dM_k^P
     = ( µ_k(P) + σ(k − m(P)) P_k ) dt + dM_k^P,

µ_k(P) := µ ( q P_{k−1} − P_k + (1−q) P_{k+1} ).

M^P is a martingale with

[M_k^P](t) ≤ (2µ/N) t + (1/N) ∫_0^t Σ_{l∈Z} (2 + σ(k−l)^+ + σ(l−k)^+) P_k(s) P_l(s) ds.

Moments

The mean fitness m(P) = Σ_k k P_k satisfies

dm(P) = ( µ̄(P) + σ c_2(P) ) dt + dM^{P,m},   µ̄(P) := µ(2q − 1).

Ignoring mutation terms, the centred moments c_n = Σ_k (k − m(P))^n P_k satisfy

dc_2 ≈ σ c_3 dt + small noise terms
dc_3 ≈ σ (c_4 − 3 c_2·c_2) dt + small noise terms
dc_4 ≈ σ (c_5 − 4 c_3 c_2) dt + small noise terms
dc_5 ≈ σ (c_6 − 5 c_4 c_2) dt + small noise terms

Heuristics

The centred process has a stationary distribution. Approximate it by a deterministic distribution. The equations for the centred moments then give:

0 ≈ σ c_3
0 ≈ σ (c_4 − 3 c_2·c_2)
0 ≈ σ (c_5 − 4 c_3 c_2)
0 ≈ σ (c_6 − 5 c_4 c_2)
...

So the stationary distribution is approximately Gaussian.

Suppose the stationary distribution is Gaussian with mean m(P) and variance b^2. The front is at K, where

(1/(√(2π) b)) e^{−K²/(2b²)} = 1/N  ⇒  K ≈ b √(2 log N).

If there is a single individual at m(P) + K at time t = 0, how long until there is an individual at m(P) + K + 1? Ignoring beneficial mutations occurring to individuals at m(P) + K − 1, until the front advances the number of individuals at the front grows like

Z(t) ≈ e^{(σK − (1−q)µ) t}.

More heuristics

The front advances as soon as an individual counted in Z(t) accumulates a beneficial mutation. The probability that the front doesn't advance by time t is

exp( −qµ ∫_0^t Z(s) ds ) ≈ exp( −(qµ/(σK − (1−q)µ)) (e^{(σK − (1−q)µ)t} − 1) ).

The average time for the front to advance by one is

log(σK − (1−q)µ) / (σK − (1−q)µ).

But the wave speed is

µ(2q−1) + σ c_2(P) ≈ µ(2q−1) + σ b² ≈ µ(2q−1) + σ K² / (2 log N).

Conclusion

Consistency condition:

(σK − (1−q)µ) / log(σK − (1−q)µ) = µ(2q−1) + σ K² / (2 log N).

For large K, this approximately reduces to

K log(σK) = 2 log N.

If K = log N, then LHS > RHS; if K = log^{1−δ} N with δ > 0, then LHS < RHS. So K lies between any fractional power of log N and log N itself, and the rate of adaptation is of an order between any fractional power of log N and log N.
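
The reduced condition can also be solved numerically to see how K grows with N (a sketch of ours; the value of σ and the bracketing interval are illustrative choices):

```python
import math

def front_K(N, sigma=0.01):
    """Bisection for the root of K * log(sigma*K) = 2 * log(N) on the
    branch with sigma*K > 1, where the left-hand side is increasing."""
    f = lambda K: K * math.log(sigma * K) - 2.0 * math.log(N)
    lo, hi = 1.0 / sigma, 1e9   # f(lo) = -2 log N < 0, f(hi) > 0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return hi
```

The root grows slowly with N, as the slide's comparison of K = log N against K = log^{1−δ} N suggests.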

Rigorous result

Theorem. If q > 0, then for any β > 0 there exists a positive constant c_{µ,σ,q} such that

E_π[m(1)], E_π[c_2] ≥ c_{µ,σ,q} log^{1−β} N

if N is sufficiently large.

Simulations

With µ = 0.01, q = 0.01, σ = 0.01 and N = 1000, 2000, 5000, 10000, 30000. First row: mean against time; second row: variance against time. [Figure: five columns of mean/variance time series; axis data not reproduced.]

[Figure.] Adaptation rate against population size (N = 10^3 to 10^5). From top to bottom: q = 4%, 2%, 1%, 0.2%; µ = 0.01, σ = 0.01.