Entropy and Free Energy in Biology
Energy vs. length, from Phillips & Quake, Physics Today 59:38-43, 2006. At room temperature, kT = 0.6 kcal/mol ≈ 2.5 kJ/mol ≈ 25 meV. At the scale of a typical protein or a typical cell, thermal effects are comparable to deterministic ones!
Energy minimization principles
- Mechanical equilibrium demands that potential energy be minimized (net force is zero).
- An optical trap works on this principle.
- This assumes T = 0, i.e., no thermal fluctuations!!!
Statistics of N-body systems
- What does equilibrium mean when there are thermal fluctuations?
- In equilibrium, the properties of the system should not depend on time. (Which properties? The properties that describe the system as a whole!)
Statistics of N-body systems
- More precisely, the relevant properties for equilibrium are the accessible measurables of a system, e.g., total energy, volume.
- (Figure: particles distributed among energy levels A, B, C, D.) A likely measurable of the system shown: the occupation numbers { A = 5, B = 2, C = 1, D = 1 }.
- A specific arrangement of particles is a microstate; the set of occupation numbers is a macrostate.
Statistics of N-body systems
- Swapping particles between levels changes the microstate, but leaves the occupation numbers unchanged: { 5, 2, 1, 1 } → { 5, 2, 1, 1 } is the same macrostate.
- A jump of one particle into a different level changes the occupation numbers: { 5, 2, 1, 1 } → { 4, 2, 1, 2 } is a change in macrostate.
Principle of equal a priori probabilities
- All accessible microstates are equally probable.
- NOTE: this is an assumption, but one with much supporting evidence.
- So for a given system, which is the most probable macrostate?
Principle of equal a priori probabilities
- If all microstates have equal probability, the macrostate with the largest number of microstates W is the most likely to be observed; statistical mechanics becomes all about counting.
- Example: how many ways are there to arrange N particles in M levels with a set number per level (a given macrostate)?
  W = N! / (n1! n2! ... nM!)
- There are N! sequences of particles, but we don't care about their order within each level (ni! rearrangements per level).
- This is how Boltzmann originally derived the connection between entropy and probability for an ideal gas.
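The multinomial count above is easy to check numerically. A minimal sketch in Python (the function name `microstate_count` is my own; the macrostate { 5, 2, 1, 1 } is the one from the slides):

```python
from math import factorial

def microstate_count(occupations):
    """Number of microstates W for a macrostate, W = N!/(n1! n2! ... nM!)."""
    N = sum(occupations)
    W = factorial(N)
    for n in occupations:
        W //= factorial(n)  # integer division is exact: W is a multinomial coefficient
    return W

# The macrostate {5, 2, 1, 1} from the slides: 9!/(5! 2! 1! 1!) = 1512 microstates
print(microstate_count([5, 2, 1, 1]))
```

A flatter macrostate such as {3, 2, 2, 2} has more microstates (7560), illustrating why more uniform occupations are more probable.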
Large N approximations
- Factorials are not fun to work with (non-analytic). For large N, we can use Stirling's approximation: x! ≈ (x/e)^x.
- Writing p_i = n_i/N, this gives W ≈ 1 / (p1^n1 p2^n2 ... pM^nM), so
  (1/N) ln W = -Σ_{i=1}^{M} p_i ln p_i ≡ S
  defined as the Shannon entropy (useful in information theory, for example).
- The factor of N appears because we assume the particles are independent (ideal gas); Gibbs derived the result more generally, with no N.
- Thermodynamic entropy is the same quantity, just with proper units through Boltzmann's constant k: S = k ln W.
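The Stirling route from W to the Shannon entropy can be checked numerically: for large occupation numbers, (1/N) ln W approaches -Σ p_i ln p_i. A sketch (function names are mine; `math.lgamma(n + 1)` gives the exact ln n!):

```python
import math

def ln_W_exact(occupations):
    """Exact ln W = ln N! - sum of ln n_i!, via lgamma(n + 1) = ln n!."""
    N = sum(occupations)
    return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in occupations)

def shannon_entropy(occupations):
    """Stirling-approximation result: -sum p_i ln p_i with p_i = n_i/N."""
    N = sum(occupations)
    return -sum((n / N) * math.log(n / N) for n in occupations if n > 0)

occ = [5000, 2000, 1000, 1000]  # the {5, 2, 1, 1} macrostate scaled up 1000x
print(ln_W_exact(occ) / sum(occ))  # exact (1/N) ln W
print(shannon_entropy(occ))        # agrees to a fraction of a percent at this N
```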
Large N approximations
- Why do we define entropy as S = k ln W?
- Entropy needs to be an extensive quantity, i.e., it should scale with system size (intensive variables do not).
- For two independent subsystems A and B, the microstate counts multiply, W_{A+B} = W_A W_B, while entropies must add, S_{A+B} = S_A + S_B.
- So we need S(W_A W_B) = S(W_A) + S(W_B); the only way to satisfy this is to define entropy using a logarithm.
Large N approximations
- S = k ln W. What probability distribution maximizes S?
- Use Lagrange multipliers with the constraint on total probability, Σ_{i=1}^{M} p_i = 1.
- The result: p_i = 1/M for all i. The distribution is flat.
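A quick numerical illustration that the flat distribution maximizes the Shannon entropy (the maximum value is ln M):

```python
import math

def S(p):
    """Shannon entropy -sum p_i ln p_i of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

M = 4
uniform = [1 / M] * M
print(S(uniform))                # ln M, the maximum
print(S([0.4, 0.3, 0.2, 0.1]))  # any non-flat distribution scores lower
```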
Calculating entropy
- Example: Np proteins bind non-specifically to N binding sites on DNA. What is the entropy of this system?
- Using Stirling's approximation in the form ln N! ≈ N ln N - N, with c = Np/N:
  S = -kN[c ln c + (1 - c) ln(1 - c)]
- (PBoC 5.5)
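This binding entropy can be checked against direct counting, since S/k = ln C(N, Np) for Np indistinguishable proteins on N sites. A sketch with illustrative numbers (c = 0.3; function names are mine):

```python
import math

def binding_entropy_per_kN(c):
    """Stirling result S/(kN) = -[c ln c + (1 - c) ln(1 - c)], with c = Np/N."""
    return -(c * math.log(c) + (1 - c) * math.log(1 - c))

def exact_per_kN(N, Np):
    """Direct counting: S/k = ln C(N, Np), ways to place Np proteins on N sites."""
    return (math.lgamma(N + 1) - math.lgamma(Np + 1) - math.lgamma(N - Np + 1)) / N

N, Np = 10_000, 3_000  # illustrative numbers, c = 0.3
print(binding_entropy_per_kN(Np / N))  # ~0.611
print(exact_per_kN(N, Np))             # nearly identical at large N
```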
First Law of Thermodynamics
- ΔU = Q + W: conservation of energy.
- Heat and work: Q = ∫ T dS, W = -∫_{Vi}^{Vf} P dV, and dS = δQ/T.
- In differential form: dU = (∂U/∂S)_{V,N} dS + (∂U/∂V)_{S,N} dV + (∂U/∂N)_{S,V} dN.
- The partial derivatives define familiar properties, e.g., T = (∂U/∂S)_{V,N}.
- The fundamental thermodynamic relation: dU = T dS - P dV + Σ_i μ_i dN_i.
- Note that T, P, μ are intensive variables.
Consequences of entropy maximization
- Two subsystems can exchange heat, volume, or particles; S = S(U, V, N) (all extensive variables of an isolated system).
- For heat exchange: dS = (∂S1/∂U1) dU1 + (∂S2/∂U2) dU2 = 0, and since dU1 = -dU2 for an isolated total system, this gives T1 = T2.
Real systems (not isolated)
- Ensembles: microcanonical (NVE), canonical (NVT), or isothermal-isobaric (NpT).
- A typical biological system is in contact with its environment.
- The environment can be treated as a reservoir of heat and/or volume (T, p constant).
- Maximizing the TOTAL entropy leads to minimizing the SYSTEM free energy.
Real systems (not isolated)
- Cellular systems are NOT in static equilibrium, but often a dynamic one.
- G = U - TS + pV : Gibbs free energy
- A = U - TS : Helmholtz free energy (volume not changing)
- Free energy is the energy available to do useful work; the -TS term accounts for thermal energy pushing the system away from static equilibrium.
- Biological systems minimize free energy.
Real systems (not isolated)
- A consequence of contact with a thermal reservoir is that the average energy of the system is constrained. How does this affect the probability distribution?
- Constraints: Σ_{i=1}^{M} p_i = 1 (total probability) and Σ_{i=1}^{M} p_i E_i = <E> (average energy).
- Maximizing S subject to both constraints gives the Boltzmann distribution:
  p_i = e^{-βE_i} / Z, with the partition function Z = Σ_i e^{-βE_i}
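A minimal sketch of the Boltzmann distribution, with energies measured in units of kT (so β = 1; the function name is my own):

```python
import math

def boltzmann(energies_kT):
    """Boltzmann probabilities p_i = exp(-E_i/kT) / Z, energies in units of kT."""
    weights = [math.exp(-E) for E in energies_kT]
    Z = sum(weights)  # the partition function
    return [w / Z for w in weights]

# Two-state system with an energy gap of 1 kT:
p = boltzmann([0.0, 1.0])
print(p)  # the lower state is favored by a factor of e
```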
Classic derivation of Boltzmann
- What is the probability of a particular microstate of our system in contact with a reservoir?
- Recall: P(E_i) ∝ W_total(E_i), S = k ln W, and ∂S/∂E = 1/T.
- Expanding the reservoir entropy to first order in E_i gives P(E_i) ∝ e^{-E_i/kT}, and after normalization:
  P(E_i) = e^{-E_i/kT} / Z
Calculating average quantities
- <A> = Σ_i p_i A_i = (1/Z) Σ_i A_i e^{-βE_i}
- Many quantities can be calculated directly from the partition function Z, e.g.:
  <E> = (1/Z) Σ_i E_i e^{-βE_i} = -∂ ln Z / ∂β
- (PBoC 6.1)
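The identity <E> = -∂ ln Z/∂β is easy to verify with a finite-difference derivative. A sketch for an arbitrary three-level system (energies in units of kT; function names are mine):

```python
import math

def ln_Z(beta, energies):
    """Log of the partition function Z = sum of exp(-beta E_i)."""
    return math.log(sum(math.exp(-beta * E) for E in energies))

def mean_energy(beta, energies):
    """Direct Boltzmann-weighted average <E> = (1/Z) sum of E_i exp(-beta E_i)."""
    Z = sum(math.exp(-beta * E) for E in energies)
    return sum(E * math.exp(-beta * E) for E in energies) / Z

energies = [0.0, 1.0, 2.5]
beta, h = 1.0, 1e-6
# <E> should equal -d(ln Z)/d(beta); check with a central finite difference:
numeric = -(ln_Z(beta + h, energies) - ln_Z(beta - h, energies)) / (2 * h)
print(mean_energy(beta, energies), numeric)
```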
Example: Hydrophobicity
- One of the most important driving forces in biology!
- The hydrophobic effect ensures that: proteins fold; membranes form and membrane proteins insert; substrates bind; etc.
- Placing a non-polar substance in water disrupts the hydrogen-bonding network, limiting the orientations (i.e., microstates) of each water molecule in contact.
- To maximize entropy (and minimize free energy), the contact surface area is minimized, leading to aggregation of the non-polar substance.
Second Law of Thermodynamics
- ΔS ≥ 0
- Empirically known to be true, but dependent on a low-entropy state in the early universe (inflation?).
- In equilibrium, equality holds: ΔS = 0.
- The law is actually only statistically true; violations can briefly occur, particularly in microscopic systems.
Maxwell's Demon: can he violate the 2nd law? NO!
- The demon is part of the system; although he can decrease the entropy of the molecules in the box, his own entropy must go up.
Brownian ratchet (credit to Feynman)
- Uses the thermal motion of molecules randomly hitting a propeller to drive it.
- A ratchet and pawl ensures that motion occurs in only one direction, even though molecules hit the propeller from all directions.
- Can this be used to do useful work and violate the 2nd law? NO! The ratchet itself also undergoes thermal fluctuations back and forth.
Modern-day statistical mechanics
- The free energy change between two states is related to the work done to move the system from one state to the other by ΔF ≤ <W> (Second Law).
- The inequality can be converted to an equality by accounting for fluctuations that transiently violate the Second Law:
  e^{-ΔF/kT} = <e^{-W/kT}>   (Jarzynski equality, 1997!!!)
- Example application: unfolding a small protein helix by measuring work vs. extension (see HW).
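The Jarzynski equality can be illustrated with a toy model: if the work distribution is Gaussian with mean <W> and variance σ², the equality predicts ΔF = <W> - σ²/(2kT), strictly below <W> as the Second Law requires. A sketch with hypothetical numbers (not the HW protein system):

```python
import math
import random

random.seed(0)  # reproducible
kT = 1.0
# Hypothetical Gaussian work distribution (mean 2 kT, std 1 kT). For a
# Gaussian, the Jarzynski equality predicts dF = <W> - sigma^2/(2 kT) = 1.5 kT.
mean_W, sigma = 2.0, 1.0
works = [random.gauss(mean_W, sigma) for _ in range(200_000)]

# dF = -kT ln < e^{-W/kT} >, the exponential average over realizations of W:
dF = -kT * math.log(sum(math.exp(-W / kT) for W in works) / len(works))
print(dF)                       # close to 1.5, below the mean work
print(sum(works) / len(works))  # close to 2.0, so dF <= <W> (Second Law)
```

Note that the exponential average is dominated by rare low-work trajectories, which is why convergence degrades rapidly as σ grows.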
Hill function
- Example: ligand binding to a protein (PBoC 6.1).
- This system is highly degenerate, meaning most states have the same energy.
- First count the number of microstates for the bound and unbound states; then determine their energies and, thus, the statistical weight of each state.
  P_bound = (c/c0) e^{-βΔE} / (1 + (c/c0) e^{-βΔE})
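The weight (c/c0) e^{-βΔE} and the resulting P_bound are straightforward to evaluate. A sketch (function and argument names are mine; ΔE is in units of kT, negative for favorable binding):

```python
import math

def p_bound(c_over_c0, dE_kT):
    """P_bound = w / (1 + w) with weight w = (c/c0) exp(-dE/kT)."""
    w = c_over_c0 * math.exp(-dE_kT)  # statistical weight of the bound state
    return w / (1 + w)

# Binding saturates with ligand concentration; a more favorable (more
# negative) dE shifts the curve toward lower concentrations:
for c in (0.01, 0.1, 1.0, 10.0):
    print(c, p_bound(c, dE_kT=-2.0))
```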