Entropy and Free Energy in Biology


Energy vs. length scales (from Phillips & Quake, Physics Today 59:38-43, 2006): kT = 0.6 kcal/mol = 2.5 kJ/mol = 25 meV. At the length scale of a typical protein, or even a typical cell, thermal effects are comparable to deterministic ones!
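As a quick check on these numbers, here is a minimal sketch (assuming room temperature, T = 300 K) that converts k_B·T between the quoted units:

```python
# Physical constants (CODATA values); T = 300 K assumed for "room temperature"
k_B = 1.380649e-23          # J/K
N_A = 6.02214076e23         # 1/mol
e_charge = 1.602176634e-19  # J per eV
T = 300.0                   # K

kT = k_B * T                      # thermal energy in joules
print(kT * N_A / 1000.0)          # ~ 2.5 kJ/mol
print(kT * N_A / 4184.0)          # ~ 0.6 kcal/mol
print(kT / e_charge * 1000.0)     # ~ 26 meV
```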

Energy minimization principles
Mechanical equilibrium demands that the potential energy be minimized (net force is zero); an optical trap works on this principle. This assumes T = 0, i.e., no thermal fluctuations!

Statistics of N-body systems
What does equilibrium mean when there are thermal fluctuations? In equilibrium, the properties of the system should not depend on time. Which properties? The properties that describe the system as a whole!

Statistics of N-body systems
More precisely, the relevant properties for equilibrium are the accessible measurables of the system, e.g., total energy and volume. [Figure: particles distributed over energy levels A, B, C, D.] What would be a likely measurable of this system? The set of occupation numbers, {A = 5, B = 2, C = 1, D = 1}: a particular assignment of individual particles to levels is a microstate, while the occupation numbers specify the macrostate.

Statistics of N-body systems
Exchanging two particles between different levels changes the microstate but leaves the occupation numbers unchanged, {5, 2, 1, 1} → {5, 2, 1, 1}: the macrostate is the same. Moving one particle into a different level, {5, 2, 1, 1} → {4, 2, 1, 2}, changes the macrostate.

Principle of equal a priori probabilities
All accessible microstates are equally probable. Note: this is an assumption, but one supported by much evidence. So for a given system, which is the most probable macrostate?

Principle of equal a priori probabilities
If all microstates have equal probability, the macrostate with the largest number of microstates W is the most likely to be observed; statistical mechanics becomes all about counting. Example: how many ways are there to arrange N particles in M levels with a set number per level (a given macrostate)?

W = N! / (n_1! n_2! ... n_M!)

There are N! orderings of the particles, but we don't care about their order within each level (n_i! rearrangements per level). This is how Boltzmann originally derived the connection between entropy and probability for an ideal gas.
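A minimal sketch of this counting, using the {5, 2, 1, 1} macrostate from the earlier slides (the printed values are illustrative):

```python
import math

def multiplicity(occupations):
    """W = N!/(n_1! n_2! ... n_M!): number of microstates for a macrostate
    specified by the occupation numbers of each level."""
    N = sum(occupations)
    W = math.factorial(N)
    for n in occupations:
        W //= math.factorial(n)
    return W

print(multiplicity([5, 2, 1, 1]))  # 9!/(5! 2! 1! 1!) = 1512
print(multiplicity([4, 2, 1, 2]))  # 9!/(4! 2! 1! 2!) = 3780 -- more microstates, more probable
```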

Large N approximations
Factorials are not fun to work with (non-analytic). For large N, we can use Stirling's approximation, x! ≈ (x/e)^x, which gives

W ≈ 1 / (p_1^{n_1} p_2^{n_2} ... p_M^{n_M}),   so   (1/N) ln W = -∑_{i=1}^{M} p_i ln p_i

The right-hand side is the Shannon entropy (useful in information theory, for example); thermodynamic entropy is the same quantity, just with proper units through Boltzmann's constant k: S = k ln W.
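To see how good the large-N limit is, here is a small sketch comparing the exact (1/N) ln W with the Shannon form -∑ p_i ln p_i for a scaled-up version of the {5, 2, 1, 1} macrostate (the specific numbers are illustrative):

```python
import math

def entropy_per_particle_exact(occupations):
    """(1/N) ln W from exact factorials (via lgamma to avoid huge integers)."""
    N = sum(occupations)
    lnW = math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in occupations)
    return lnW / N

def entropy_shannon(occupations):
    """-sum_i p_i ln p_i with p_i = n_i/N (the Stirling / large-N result)."""
    N = sum(occupations)
    return -sum((n / N) * math.log(n / N) for n in occupations if n > 0)

occ = [5000, 2000, 1000, 1000]          # the {5, 2, 1, 1} macrostate scaled up by 1000
print(entropy_per_particle_exact(occ))  # ~ 1.148
print(entropy_shannon(occ))             # ~ 1.149 -- the two converge as N grows
```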

Large N approximations
Why do we define entropy as S = k ln W? Entropy needs to be an extensive quantity, i.e., it should scale with system size (intensive variables do not). For two independent subsystems A and B, the multiplicities multiply, W_{A+B} = W_A W_B, while the entropies must add, S_{A+B} = S_A + S_B, so we need S(W_A W_B) = S(W_A) + S(W_B). The only way to satisfy this is to define entropy using a logarithm.

Large N approximations
Given S = k ln W, what probability distribution maximizes S? Using Lagrange multipliers with the constraint on total probability, ∑_i p_i = 1, the answer is p_i = 1/N: the distribution is flat, the most disordered possible. [Figure: p_i = 1/N for every state i = 1, ..., N.]

Calculating entropy
N_p proteins bind non-specifically to N binding sites on DNA. What is the entropy of this system? Using Stirling's approximation, ln N! ≈ N ln N - N, one finds (PBoC 5.5)

S = -kN[c ln c + (1 - c) ln(1 - c)],   where c = N_p/N.

Calculating entropy
Lac repressor: the canonical example of gene regulation. There are ~10 copies per cell and about 5 × 10^6 binding sites on the E. coli DNA. What is the entropy, assuming non-specific binding? (A terrible assumption!) For these numbers, S/Nk ≈ 2.8 × 10^-5 (PBoC 5.5). The entropy peaks when the relative concentration c is 0.5, where the number of possible arrangements is largest.
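A short sketch of this calculation: the value quoted above follows directly from the lattice-mixing entropy of PBoC 5.5.

```python
import math

def mixing_entropy_per_site(c):
    """S/(N k) = -[c ln c + (1 - c) ln(1 - c)] for fractional occupancy c = N_p/N."""
    if c <= 0.0 or c >= 1.0:
        return 0.0
    return -(c * math.log(c) + (1.0 - c) * math.log(1.0 - c))

c_lac = 10 / 5e6                          # ~10 repressors, ~5e6 non-specific sites
print(mixing_entropy_per_site(c_lac))     # ~ 2.8e-5, the value quoted above
print(mixing_entropy_per_site(0.5))       # ln 2 ~ 0.69, the maximum at c = 0.5
```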

First Law of Thermodynamics
ΔU = Q + W: conservation of energy (heat added to plus work done on the system). In differential form, with U = U(S, V, N),

dU = (∂U/∂S)_{V,N} dS + (∂U/∂V)_{S,N} dV + (∂U/∂N)_{S,V} dN

The partial derivatives define familiar properties: dS = δQ/T gives T = (∂U/∂S)_{V,N}, and the reversible work is δW = -P dV. Together these give the fundamental thermodynamic relation

dU = T dS - P dV + ∑_i μ_i dN_i

Note that T, P, and μ are intensive variables.

First Law of Thermodynamics
Why dS but δQ? dS is an exact differential, while δQ is inexact. Entropy, volume, pressure, and energy are state variables, i.e., they can be described by a differentiable function of the state; work and heat are path variables. [Figure: two paths W_A and W_B in the P-V plane from (P_1, V_1) to (P_2, V_2).] The work done by the system, W = ∫_{V_i}^{V_f} P dV, is the area under the curve, so it depends on how we get from point 1 to point 2.

Consequences of entropy maximization
Two subsystems of an isolated system can exchange heat, volume, or particles. S = S(U, V, N), a function of all the extensive variables of the isolated system. For heat exchange, maximizing the total entropy requires

dS = (∂S_1/∂U_1) dU_1 + (∂S_2/∂U_2) dU_2 = 0,

and since the total energy is conserved (dU_1 = -dU_2), this gives T_1 = T_2.

Real systems (not isolated)
Ensembles: microcanonical (NVE), canonical (NVT), or isothermal-isobaric (NpT). A typical biological system is in contact with its environment, which can be treated as a reservoir of heat and/or volume (T, p constant). Maximizing the TOTAL entropy leads to minimizing the SYSTEM free energy.

Real systems (not isolated)
Cellular systems are NOT in static equilibrium, but often in a dynamic one.

G = U - TS + pV    Gibbs free energy
A = U - TS         Helmholtz free energy (volume not changing)

Free energy is the energy available to do useful work; it accounts for thermal energy pushing the system away from static equilibrium. Biological systems minimize free energy.

Real systems (not isolated)
A consequence of contact with a thermal reservoir is that the average energy of the system is constrained. How does this affect the probability distribution? Maximizing the entropy subject to

∑_{i=1}^{M} p_i = 1 (total probability)   and   ∑_{i=1}^{M} p_i E_i = ⟨E⟩ (average energy)

gives the Boltzmann distribution,

p_i = e^{-βE_i} / ∑_i e^{-βE_i},

where the normalization Z ≡ ∑_i e^{-βE_i} is the partition function.

Classic derivation of Boltzmann
What is the probability of a particular microstate of our system in contact with a reservoir? Recall that P(E_i) ∝ W_total(E_i), S = k ln W, and ∂S/∂E = 1/T. Expanding the reservoir entropy to first order in E_i gives P(E_i) ∝ e^{-E_i/kT}, so

P(E_i) = e^{-E_i/kT} / Z.

Calculating average quantities

⟨A⟩ = ∑_i p_i A_i = (1/Z) ∑_i A_i e^{-βE_i}

Many quantities can be calculated directly from the partition function Z, e.g.,

⟨E⟩ = (1/Z) ∑_i E_i e^{-βE_i} = -∂(ln Z)/∂β    (PBoC 6.1)
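A minimal numerical sketch of these relations for a hypothetical two-state system (the energies below are made up for illustration; kT ≈ 0.6 kcal/mol as quoted at the start of the lecture):

```python
import math

kT = 0.6  # kcal/mol, roughly room temperature

def boltzmann_probs(energies, kT):
    """Return (p_i, Z) for a discrete list of state energies."""
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights], Z

def average_energy(energies, kT):
    """<E> = sum_i p_i E_i."""
    probs, _ = boltzmann_probs(energies, kT)
    return sum(p * E for p, E in zip(probs, energies))

energies = [0.0, -2.0]                  # hypothetical: unbound (0), bound (-2 kcal/mol)
probs, Z = boltzmann_probs(energies, kT)
print(probs, average_energy(energies, kT))

# check <E> = -d(ln Z)/d(beta) by a finite difference in beta = 1/kT
lnZ = lambda b: math.log(sum(math.exp(-b * E) for E in energies))
beta, db = 1.0 / kT, 1e-6
print(-(lnZ(beta + db) - lnZ(beta - db)) / (2 * db))   # matches <E> above
```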

Example: Hydrophobicity
One of the most important driving forces in biology! The hydrophobic effect ensures that proteins fold, that membranes form and membrane proteins insert, that substrates bind, etc. Placing a non-polar substance in water disrupts the hydrogen-bonding network, limiting the orientations (i.e., microstates) available to the water molecules in contact with it. To maximize entropy (and minimize free energy), the contact surface area is minimized, leading to aggregation of the non-polar substance.

Second Law of Thermodynamics
ΔS ≥ 0. Empirically known to be true, but it depends on a low-entropy state in the early universe (inflation?). In equilibrium the equality holds (ΔS = 0). The law is actually only statistically true: violations can briefly occur, particularly in microscopic systems.

Maxwell's Demon: can he violate the 2nd law? NO! The demon is part of the system; although he can decrease the entropy of the molecules in the box, his own entropy must go up.

Brownian ratchet (credit to Feynman): uses the thermal motion of molecules randomly hitting a propeller to drive it, while a ratchet and pawl ensure that the motion is in one direction only, even though molecules hit the propeller from all directions. Can this be used to do useful work and violate the 2nd law? NO! The ratchet and pawl also undergo thermal fluctuations back and forth!

Modern-day statistical mechanics
The free energy change between two states is related to the work done to move the system from one state to the other by ΔF ≤ ⟨W⟩ (Second Law). The inequality can be converted into an equality by accounting for the fluctuations that transiently violate the Second Law:

e^{-ΔF/kT} = ⟨e^{-W/kT}⟩    Jarzynski equality (1997!!!)

Example application: unfolding a small protein helix (see HW). [Figure: work vs. extension.]
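A small sketch of how the Jarzynski estimator is used in practice. The protein-unfolding application is left to the HW; here the work values are drawn from a made-up Gaussian distribution, for which the exact answer ΔF = ⟨W⟩ - σ²/(2kT) is known and can be used to check the estimator:

```python
import math, random

def jarzynski_delta_F(work_samples, kT):
    """dF = -kT ln < exp(-W/kT) >, averaged over repeated pulling experiments."""
    avg = sum(math.exp(-w / kT) for w in work_samples) / len(work_samples)
    return -kT * math.log(avg)

kT, mean_W, sigma = 0.6, 5.0, 1.0          # illustrative numbers, not from the HW
random.seed(0)
work = [random.gauss(mean_W, sigma) for _ in range(200_000)]

print(jarzynski_delta_F(work, kT))         # ~ 4.17, up to sampling error
print(mean_W - sigma**2 / (2 * kT))        # exact Gaussian-work prediction ~ 4.17
print(mean_W)                              # <W> = 5.0 >= dF, consistent with the Second Law
```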

Example: ligand binding to a protein and the Hill function (PBoC 6.1)
This system is highly degenerate, meaning most states have the same energy. First count the number of microstates for the bound and unbound states, then determine their energies and, thus, the statistical weight of each state. The result is

P_bound = (c/c_0) e^{-βΔE} / [1 + (c/c_0) e^{-βΔE}]
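A short sketch that evaluates this binding curve for hypothetical parameters (the binding energy ΔE and reference concentration c_0 below are made up for illustration, not taken from PBoC):

```python
import math

def p_bound(c, c0, dE, kT=0.6):
    """P_bound = x/(1 + x) with x = (c/c0) exp(-dE/kT): one ligand, one binding site."""
    x = (c / c0) * math.exp(-dE / kT)
    return x / (1.0 + x)

# hypothetical parameters: dE = -5 kcal/mol, c0 = 0.6 M
for c in (1e-9, 1e-6, 1e-3):
    print(c, p_bound(c, c0=0.6, dE=-5.0))
# P_bound rises from ~0 toward 1 as c sweeps through the half-saturation point c0*exp(dE/kT)
```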