Why You Should Do a PhD in Statistical Mechanics
Quirin Vogel, Department of Mathematics
Warwick Open Day, 7th February 2018
Outline
General Introduction
Statistical Mechanics on discrete state spaces
My area of research
Connections and applications to other areas
General
Statistical Mechanics can be seen as a branch of probability theory in which highly correlated random variables are studied. It originates from Physics: it was derived as a justification of classical thermodynamics, and it aims to give an accurate description of systems comprised of a large number of subentities.
The Ising Model
The Ising model is one of the oldest and best-studied models in statistical mechanics. It sought to describe the behaviour of magnets. We begin by fixing a region of the lattice Λ ⊂ Z^d and attach to each site x ∈ Λ a magnet σ_x ∈ {−1, +1}. We then have Ω = {σ : σ ∈ {−1, +1}^Λ} and the energy is given by
\[
H_\beta(\sigma) = -\beta \sum_{\substack{|x-y|=1,\\ x \in \Lambda \text{ or } y \in \Lambda}} \sigma_x \sigma_y, \qquad \sigma_z = \tau_z \text{ for } z \notin \Lambda, \tag{2.1}
\]
and probability distribution
\[
P_{\Lambda,\tau,\beta}(\sigma) = \frac{e^{-H_\beta(\sigma)}}{\sum_{\sigma' \in \Omega} e^{-H_\beta(\sigma')}}. \tag{2.2}
\]
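The definitions (2.1) and (2.2) translate directly into a small simulation. The sketch below is our own illustration, not from the talk: it computes the energy of a configuration on an L x L box with +1 boundary conditions and performs one Metropolis sweep targeting the Gibbs distribution (function names and the +1 boundary choice are ours).

```python
import math
import random

def ising_energy(sigma, beta, L):
    """Energy H_beta(sigma) as in (2.1) on an L x L box with +1 boundary
    conditions: spins outside the box are fixed to tau = +1."""
    def spin(x, y):
        return sigma[x][y] if 0 <= x < L and 0 <= y < L else 1
    total = 0
    for x in range(L):
        for y in range(L):
            # right and down neighbours cover every interior bond once,
            # plus the bonds crossing the right and bottom boundary
            total += spin(x, y) * spin(x + 1, y)
            total += spin(x, y) * spin(x, y + 1)
            # bonds crossing the left and top boundary
            if x == 0:
                total += spin(x, y) * spin(x - 1, y)
            if y == 0:
                total += spin(x, y) * spin(x, y - 1)
    return -beta * total

def metropolis_sweep(sigma, beta, L, rng=random):
    """One Metropolis sweep sampling from the distribution (2.2)."""
    def spin(x, y):
        return sigma[x][y] if 0 <= x < L and 0 <= y < L else 1
    for x in range(L):
        for y in range(L):
            nb = spin(x + 1, y) + spin(x - 1, y) + spin(x, y + 1) + spin(x, y - 1)
            dH = 2.0 * beta * sigma[x][y] * nb  # energy cost of flipping sigma_xy
            if dH <= 0 or rng.random() < math.exp(-dH):
                sigma[x][y] *= -1
    return sigma
```

For the all-plus configuration on a 3 x 3 box there are 2L(L+1) = 24 bonds, so the energy is −24β.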
Pictures Figure: Two-dimensional Ising model on a square with + boundary conditions.
Thermodynamic Limit
The thermodynamic limit refers to the limit Λ ↑ Z^d. Two typical questions of importance:
1. What happens to the limit f(β, τ, d) = lim_{Λ ↑ Z^d} E_{Λ,τ,β}[σ_0]?
2. Does P_{Λ,τ,β} have a limit as Λ ↑ Z^d?
The (mathematically) surprising answer is that this depends on β if d > 1: there exists β_c such that:
1. For β < β_c one has f(β, τ, d) = 0 and P_{Λ,τ,β} loses its dependence on τ in the limit.
2. For β > β_c one has f(β, +) > 0 > f(β, −) and
\[
\lim_{\Lambda \uparrow \mathbb{Z}^d} P_{\Lambda,+,\beta} \neq \lim_{\Lambda \uparrow \mathbb{Z}^d} P_{\Lambda,-,\beta}. \tag{2.3}
\]
A phase transition occurs!
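On a box small enough to enumerate, E_{Λ,τ,β}[σ_0] can be computed exactly by summing over all configurations. This toy brute-force check (our own sketch, feasible only for tiny L) shows the dependence on the boundary condition τ that drives the phase transition:

```python
import itertools
import math

def magnetisation_at_origin(beta, L=3, tau=1):
    """Exact E_{Lambda,tau,beta}[sigma_0] for the centre spin of an L x L
    box, by summing over all 2^(L*L) configurations."""
    sites = [(x, y) for x in range(L) for y in range(L)]
    centre = (L // 2, L // 2)
    Z, num = 0.0, 0.0
    for spins in itertools.product((-1, 1), repeat=L * L):
        sigma = dict(zip(sites, spins))
        def s(x, y):
            return sigma.get((x, y), tau)  # spins outside the box fixed to tau
        total = 0
        for x, y in sites:
            # each bond with at least one endpoint in the box, counted once
            total += s(x, y) * s(x + 1, y) + s(x, y) * s(x, y + 1)
            if x == 0:
                total += s(x, y) * s(x - 1, y)
            if y == 0:
                total += s(x, y) * s(x, y - 1)
        w = math.exp(beta * total)  # Boltzmann weight exp(-H_beta)
        Z += w
        num += sigma[centre] * w
    return num / Z
```

At β = 0 the magnetisation vanishes, while for large β it follows the sign of the boundary spins τ.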
Physical Evidence and Generalisations
This behaviour may be known to you from A-levels under the name Curie temperature: if you heat a magnet up, it will lose its magnetisation; if you cool it down, it spontaneously assumes a polarisation. There exists a vast number of generalisations:
1. Consider more than just nearest-neighbour spin pairs in H.
2. Consider more general graphs than Z^d.
3. Consider more general (random) coupling constants.
4. Consider larger spaces than {+1, −1}, e.g. {red, green, blue}: the Potts model.
5. ...
Pictures Figure: A two-dimensional Ising model with + boundary conditions. A Potts model.
Consider the space of configurations Ω_Λ = {φ : φ ∈ (R^m)^Λ}. Much larger, but more general. Take m = 1 for now. We can give the density of the measure
\[
P_{\Lambda,\tau}(d\phi) = \frac{e^{-H_\beta(\phi)}}{\int_{\mathbb{R}^\Lambda} e^{-H_\beta(\phi)}\, d\phi}\, d\phi \tag{3.4}
\]
with
\[
H_\beta(\phi) = \beta \sum_{|x-y|=1} W(\phi(x) - \phi(y)), \qquad \phi(z) = \tau(z) \text{ for } z \notin \Lambda, \tag{3.5}
\]
for some function W : R → R.
Gaussian Free Field
The most studied example is the Gaussian case, where W(x) = x². Figure: Gaussian free field in dimension 2 with linear interpolation.
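In d = 1 the Gaussian case can be sampled directly: with W(x) = x², the field on {1, ..., n} is a Gaussian vector whose precision matrix is 2β times the graph Laplacian and whose mean is the discrete harmonic extension of the boundary values. A minimal sketch of this standard computation (function name and interface are ours):

```python
import numpy as np

def sample_gff_1d(n, beta=1.0, tau=(0.0, 0.0), seed=None):
    """Sample phi on {1,...,n} for d = 1 and W(x) = x^2, with boundary
    values tau[0] at site 0 and tau[1] at site n+1."""
    rng = np.random.default_rng(seed)
    # graph Laplacian of the path with both endpoints wired to the boundary
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    c = np.zeros(n)
    c[0], c[-1] = tau[0], tau[1]
    mean = np.linalg.solve(A, c)          # harmonic extension: A @ mean = c
    cov = np.linalg.inv(2.0 * beta * A)   # covariance = (2 beta A)^{-1}
    return rng.multivariate_normal(mean, cov)
```

With tau = (0, 1) the mean field interpolates linearly between the boundary values, taking the value k/(n+1) at site k.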
Random Walk Representation
Since it is not easy to calculate integrals over R^Λ with Λ ⊂ Z^d, we can invoke the so-called random walk representation:
\[
E_{\Lambda,\tau}[\phi(x)] = E_x\left[\tau(X_{T_\Lambda})\right], \tag{3.6}
\]
and
\[
G(x, y) = E_{\Lambda,\tau}[\phi(x); \phi(y)] = E_x\left[\int_0^{T_\Lambda} \mathbf{1}\{X_s = y\}\, ds\right]. \tag{3.7}
\]
Here X is a random walk started at x and T_Λ is its exit time from Λ. By Gaussian calculus, it is sufficient to know these two quantities to infer everything about the system.
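Equation (3.6) can be checked numerically in d = 1: run walks from x until they leave {1, ..., n} and average the boundary value at the exit site. A hypothetical Monte Carlo sketch (our own, not from the slides):

```python
import random

def exit_average(x, n, tau, trials=20000, seed=0):
    """Monte Carlo estimate of E_x[tau(X_{T_Lambda})] for a simple random
    walk on Lambda = {1,...,n} that exits at site 0 or site n+1."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        pos = x
        while 1 <= pos <= n:               # still inside Lambda
            pos += rng.choice((-1, 1))     # one simple random walk step
        total += tau[0] if pos == 0 else tau[1]
    return total / trials
```

For tau(0) = 0 and tau(n+1) = 1 the exact value is the harmonic function x/(n+1), e.g. 0.5 for x = 3, n = 5; the estimate should agree up to Monte Carlo error.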
Thermodynamic Limit
A standard result from random walk theory is that in one and two dimensions a random walk revisits every point infinitely often, while in d ≥ 3 it does so only a finite number of times.
1. In d = 1, 2 we cannot define a reasonable limit of P_{Λ,τ} as Λ ↑ Z^d.
2. In d ≥ 3 we can define P_τ = lim_{Λ ↑ Z^d} P_{Λ,τ}, an infinite collection of Gaussian vectors. This holds for all β > 0!
If, however, we look at the differences η(x, y) = φ(x) − φ(y), we can define a measure on the η's even in d = 1, 2.
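The recurrence/transience dichotomy behind this result is easy to see experimentally. In this illustrative sketch (ours, not from the slides), almost every walk in d = 1 quickly revisits the origin, while in d = 3 a substantial fraction never does:

```python
import random

def return_frequency(d, steps=2000, trials=500, seed=1):
    """Fraction of simple random walks on Z^d that revisit the origin
    within `steps` steps."""
    rng = random.Random(seed)
    returned = 0
    for _ in range(trials):
        pos = [0] * d
        for _ in range(steps):
            axis = rng.randrange(d)            # pick a coordinate direction
            pos[axis] += rng.choice((-1, 1))   # step along it
            if not any(pos):                   # back at the origin
                returned += 1
                break
    return returned / trials
```

The d = 3 frequency is consistent with the classical fact that a transient walk in three dimensions returns with probability roughly 0.34.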
Convex Potentials
What if W(x) ≠ x² but we have 0 < c_1 ≤ W'' ≤ c_2 < ∞? This leads to the celebrated Helffer-Sjöstrand random walk representation:
\[
E_{\Lambda,\tau}[F(\phi); G(\phi)] = \sum_{x \in \Lambda} E_{\Lambda,\tau,x}\left[\int_0^{T_\Lambda} \frac{\partial F(\phi_0)}{\partial \phi(x)}\, \frac{\partial G(\phi_t)}{\partial \phi(X_t)}\, dt\right], \tag{3.8}
\]
where X_t is a random walk in a random environment generated by the system of SDEs
\[
d\phi_t(x) = -\sum_{|y-x|=1} W'\big(\phi_t(x) - \phi_t(y)\big)\, dt + db_t(x). \tag{3.9}
\]
Virtually all the results from the Gaussian case hold true.
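The dynamics (3.9) can be discretised by an Euler-Maruyama scheme. A toy sketch in d = 1 (our own discretisation; the step size and the choice of the uniformly convex potential W(x) = x² + 0.1 cos x, for which W'' ∈ [1.9, 2.1], are assumptions):

```python
import math
import random

def langevin_step(phi, tau, dt, w_prime, rng):
    """One Euler-Maruyama step of
        d phi_t(x) = -sum_{|y-x|=1} W'(phi_t(x) - phi_t(y)) dt + db_t(x)
    on {1,...,n}, with fixed boundary values tau[0] at site 0 and tau[1]
    at site n+1."""
    n = len(phi)
    def val(i):
        if i < 0:
            return tau[0]
        if i >= n:
            return tau[1]
        return phi[i]
    out = []
    for i in range(n):
        # drift from the two nearest-neighbour bonds of site i
        drift = -(w_prime(phi[i] - val(i - 1)) + w_prime(phi[i] - val(i + 1)))
        # Brownian increment db_t has standard deviation sqrt(dt)
        out.append(phi[i] + drift * dt + math.sqrt(dt) * rng.gauss(0.0, 1.0))
    return out
```

Iterating this step samples (approximately) from the gradient Gibbs measure; the uniform convexity of W keeps the field tight around the harmonic profile of tau.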
Non-convex Potentials
In this case we cannot hope for a random walk representation, as convexity is equivalent to having positive probabilities. Currently, the methods being used rely either on clever perturbation theory or on renormalisation theory. Figure: Why I am trying to avoid renormalisation theory.
Connections within Mathematics
Statistical Mechanics uses a rich variety of tools from within mathematics, such as:
1. Tools from probability theory: martingales, Gaussian calculus, inequalities, large deviation theory, combinatorics, stochastic calculus, random walks, ...
2. Tools from analysis: more inequalities, Sobolev spaces/PDEs, variational analysis, Fourier analysis, functional analysis, ...
3. (Linear) algebra, manifold theory, statistics, graph theory, measure theory, number theory, ...
Naturally it also involves a lot of ideas from (theoretical/mathematical) Physics.
Applications to other Sciences
Statistical Mechanics has contributed to a vast number of fields, such as:
1. The theory of neural networks and artificial intelligence.
2. Neuroscience and other areas of medical studies.
3. Economics, finance and social sciences.
4. Material sciences and other applied areas of Physics.
5. (Bio-)chemistry, complexity science, etc.
Thank you for your attention