Probability and combinatorics

Texas A&M University May 1, 2012

Probability spaces.

$(\Lambda, \mathcal{M}, P)$ = measure space. Probability space: $P$ a probability measure, $P(\Lambda) = 1$.

Algebra $\mathcal{A} = L^\infty(\Lambda, P)$ of bounded random variables. $E[X] = \int_\Lambda X \, dP$ = expectation functional on $\mathcal{A}$.

For each real-valued $X$, have $\mu_X$ = probability measure on $\mathbb{R}$ defined by
$$\int_{\mathbb{R}} f(x) \, d\mu_X(x) = \int_\Lambda f(X) \, dP = E[f(X)]$$
for $f \in C_0(\mathbb{R})$. $\mu_X$ = distribution of $X$.

Independence.

More generally, if $X_1, X_2, \ldots, X_n$ are random variables, $\mu_{X_1, X_2, \ldots, X_n}$ = measure on $\mathbb{R}^n$ = joint distribution.

Definition. $X, Y$ are independent if $\mu_{X,Y} = \mu_X \times \mu_Y$ (product measure), i.e.
$$E[f(X) g(Y)] = E[f(X)] \, E[g(Y)].$$

Remark. If $X, Y$ are independent,
$$\int f(t) \, d\mu_{X+Y}(t) = \iint f(x+y) \, d\mu_{X,Y}(x, y) = \iint f(x+y) \, d(\mu_X \times \mu_Y)(x, y) = \int f(t) \, d(\mu_X * \mu_Y)(t).$$
So in this case, $\mu_{X+Y} = \mu_X * \mu_Y$ (convolution).
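A quick numerical sketch of the remark, for discrete distributions (the weights below are illustrative choices): under independence, the law of $X+Y$ is the convolution of the two laws.

```python
import numpy as np

# Hypothetical example: X, Y each supported on {0, 1, 2} with these weights.
px = np.array([0.2, 0.5, 0.3])   # distribution of X
py = np.array([0.6, 0.1, 0.3])   # distribution of Y

# Under independence, the law of X + Y is the convolution of the laws.
p_sum = np.convolve(px, py)       # supported on {0, ..., 4}

# Cross-check against the joint (product) distribution directly.
direct = np.zeros(5)
for i, a in enumerate(px):
    for j, b in enumerate(py):
        direct[i + j] += a * b

print(np.allclose(p_sum, direct))  # True
```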

Fourier transform.

Definition. The Fourier transform is
$$F_X(\theta) = \int e^{i\theta x} \, d\mu_X(x) = E[e^{i\theta X}].$$

Lemma. If $X, Y$ are independent, then $F_{X+Y}(\theta) = F_X(\theta) F_Y(\theta)$.

Proof. $E[e^{i\theta(X+Y)}] = E[e^{i\theta X} e^{i\theta Y}] = E[e^{i\theta X}] \, E[e^{i\theta Y}]$.
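The lemma can also be checked numerically for discrete distributions; the supports, weights, and value of $\theta$ below are illustrative choices.

```python
import numpy as np

# Hypothetical discrete example: X uniform on {-1, 1}, Y uniform on {0, 1, 2}.
x_vals, x_probs = np.array([-1.0, 1.0]), np.array([0.5, 0.5])
y_vals, y_probs = np.array([0.0, 1.0, 2.0]), np.array([1/3, 1/3, 1/3])

def char_fn(vals, probs, theta):
    """F(theta) = E[exp(i theta X)] for a discrete distribution."""
    return np.sum(probs * np.exp(1j * theta * vals))

theta = 0.7
# Distribution of X + Y under independence: all sums, with product weights.
sum_vals = np.add.outer(x_vals, y_vals).ravel()
sum_probs = np.outer(x_probs, y_probs).ravel()

lhs = char_fn(sum_vals, sum_probs, theta)
rhs = char_fn(x_vals, x_probs, theta) * char_fn(y_vals, y_probs, theta)
print(abs(lhs - rhs))  # ~0: F_{X+Y} = F_X F_Y
```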

Combinatorics.

$$F_X(\theta) = \int e^{i\theta x} \, d\mu_X(x) = \sum_{n=0}^\infty \frac{(i\theta)^n}{n!} m_n(X),$$
where
$$m_n = \int x^n \, d\mu_X(x) = E[X^n].$$
$\{m_0, m_1, m_2, \ldots\}$ = moments of $X$.

For $X, Y$ independent, $m_n(X+Y)$ is complicated. But
$$F_{X+Y}(\theta) = F_X(\theta) F_Y(\theta), \qquad \log F_{X+Y}(\theta) = \log F_X(\theta) + \log F_Y(\theta).$$
Denote $\ell_X(\theta) = \log F_X(\theta)$.

Cumulants.

$$\ell_X(\theta) = \sum_{n=1}^\infty \frac{(i\theta)^n}{n!} c_n,$$
$\{c_1, c_2, c_3, \ldots\}$ = cumulants of $X$. Then
$$c_n(X+Y) = c_n(X) + c_n(Y).$$

Relation between $\{m_n\}$ and $\{c_n\}$?

A set partition: $\{(1, 3, 4), (2, 7), (5), (6)\} \in \mathcal{P}(7)$.
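Set partitions can be enumerated by a standard recursion: put the first element into each block of a partition of the remaining elements, or into a new block of its own. A minimal sketch (`set_partitions` is a hypothetical helper name):

```python
def set_partitions(elements):
    """Enumerate all partitions of a list into blocks (as lists of lists)."""
    if not elements:
        yield []
        return
    first, rest = elements[0], elements[1:]
    for partition in set_partitions(rest):
        # Put `first` into each existing block in turn ...
        for i in range(len(partition)):
            yield partition[:i] + [[first] + partition[i]] + partition[i + 1:]
        # ... or into a new singleton block.
        yield [[first]] + partition

parts = list(set_partitions([1, 2, 3, 4]))
print(len(parts))  # 15, the Bell number B_4
```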

Moment-cumulant formula.

Proposition.
$$m_n = \sum_{\pi \in \mathcal{P}(n)} \prod_{B \in \pi} c_{|B|}.$$

$m_1 = c_1$, so $c_1 = m_1$ (mean).
$m_2 = c_2 + c_1^2$, so $c_2 = m_2 - m_1^2$ (variance).
$m_3 = c_3 + 3 c_2 c_1 + c_1^3$, so $c_3 = m_3 - 3 m_2 m_1 + 2 m_1^3$.

From $\ell = \log F$: $F = e^\ell$, so $F' = \ell' F$. Comparing coefficients gives the recursion
$$m_{n+1} = \sum_{k=0}^n \binom{n}{k} c_{k+1} \, m_{n-k}.$$
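The recursion gives a direct way to compute moments from cumulants. A sketch, using the normal distribution $\mathcal{N}(0, v)$ with $v = 2$ as a check ($c_2 = v$, all other cumulants zero):

```python
from math import comb

def moments_from_cumulants(c, n_max):
    """m_{n+1} = sum_{k=0}^{n} C(n,k) * c_{k+1} * m_{n-k}; here c[k] = c_{k+1}."""
    m = [1.0]  # m_0 = 1
    for n in range(n_max):
        m.append(sum(comb(n, k) * c[k] * m[n - k] for k in range(n + 1)))
    return m

# Cumulants of N(0, 2): c_2 = 2, all others 0.
m = moments_from_cumulants([0.0, 2.0, 0.0, 0.0, 0.0, 0.0], 6)
print(m)  # [1.0, 0.0, 2.0, 0.0, 12.0, 0.0, 120.0]: m_{2k} = (2k-1)!! v^k
```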

Central limit theorem.

Theorem. Let $\{X_n : n \in \mathbb{N}\}$ be independent, identically distributed, with mean $0$ and variance $v$: $E[X_n] = 0$, $E[X_n^2] = v$. Let
$$S_n = \frac{X_1 + X_2 + \ldots + X_n}{\sqrt{n}}.$$
Then the moments of $S_n$ converge to the moments of the normal distribution $\mathcal{N}(0, v)$.

Central limit theorem.

Proof. For each $k$, $c_k(\alpha X) = \alpha^k c_k(X)$. Hence
$$c_k(S_n) = c_k\!\left(\frac{X_1 + X_2 + \ldots + X_n}{\sqrt{n}}\right) = \frac{n}{(\sqrt{n})^k} \, c_k(X_1).$$

$(k = 1)$ $c_1(X_1) = 0$, so $c_1(S_n) = 0$.
$(k = 2)$ $c_2(X_1) = v$, so $c_2(S_n) = v$.
$(k > 2)$ $\dfrac{n}{n^{k/2}} \to 0$, so $c_k(S_n) \to 0$.

In the limit, get whichever distribution has
$$c_k = \begin{cases} v, & k = 2, \\ 0, & \text{otherwise.} \end{cases}$$
Check: the normal distribution. Note that then
$$m_n = \sum_{\pi \in \mathcal{P}_2(n)} v^{n/2},$$
a sum over pair partitions.
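For fair $\pm 1$ coin flips ($v = 1$), the moments of $S_n$ can be computed exactly from the binomial distribution, illustrating the convergence $m_k(S_n) \to (k-1)!!$, the number of pair partitions of $k$ points, for even $k$. A sketch (the value of `n` is an arbitrary choice):

```python
from math import comb

def moment_Sn(n, k):
    """Exact E[S_n^k] for S_n = (X_1 + ... + X_n)/sqrt(n), X_i fair +/-1 coins.
    X_1 + ... + X_n takes value n - 2j with probability C(n, j)/2^n."""
    total = sum(comb(n, j) * (n - 2 * j)**k for j in range(n + 1))
    return total / 2**n / n**(k / 2)

def double_factorial(k):
    """(k-1)!! = number of pair partitions of k points, for k even."""
    result = 1
    for i in range(k - 1, 0, -2):
        result *= i
    return result

n = 2000
for k in (2, 4, 6):
    print(k, moment_Sn(n, k), double_factorial(k))
# m_2 -> 1, m_4 -> 3, m_6 -> 15 as n grows (v = 1)
```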

Operators.

$H$ = real Hilbert space, e.g. $\mathbb{R}^n$. $H_{\mathbb{C}}$ = its complexification ($\mathbb{C}^n$).

$$H_{\mathbb{C}}^{\odot n} = H_{\mathbb{C}} \odot H_{\mathbb{C}} \odot \ldots \odot H_{\mathbb{C}} = \text{symmetric tensor product} = \operatorname{Span}\{h_1 \odot h_2 \odot \ldots \odot h_n : \text{order immaterial}\},$$
with the (degenerate) inner product
$$\langle h_1 \odot \ldots \odot h_n, \, g_1 \odot \ldots \odot g_n \rangle = \sum_{\sigma \in \operatorname{Sym}(n)} \langle h_1, g_{\sigma(1)} \rangle \cdots \langle h_n, g_{\sigma(n)} \rangle.$$

Creation and annihilation operators.

Symmetric Fock space
$$\mathcal{F}(H_{\mathbb{C}}) = \bigoplus_{n=0}^\infty H_{\mathbb{C}}^{\odot n} = \mathbb{C}\Omega \oplus H_{\mathbb{C}} \oplus H_{\mathbb{C}}^{\odot 2} \oplus H_{\mathbb{C}}^{\odot 3} \oplus \ldots,$$
$\Omega$ = vacuum vector.

For $h \in H$, define $a_h^+, a_h^-$ on $\mathcal{F}(H_{\mathbb{C}})$ by
$$a_h^+(f_1 \odot \ldots \odot f_n) = h \odot f_1 \odot \ldots \odot f_n,$$
$$a_h^-(f_1 \odot \ldots \odot f_n) = \sum_{i=1}^n \langle f_i, h \rangle \, f_1 \odot \ldots \odot \hat{f_i} \odot \ldots \odot f_n, \qquad a_h^-(f) = \langle f, h \rangle \, \Omega,$$
the creation and annihilation operators ($\hat{f_i}$ means $f_i$ is omitted).

Operator algebra.

Check: $a_h^- = (a_h^+)^*$, the adjoint. So $X_h = a_h^+ + a_h^-$ is self-adjoint.

$a^+, a^-$ do not commute:
$$a_h^- a_g^+ - a_g^+ a_h^- = \langle g, h \rangle \, I.$$

But $X_h, X_g$ commute, so $\mathcal{A} = \operatorname{Alg}\{X_h : h \in H\}$ is a commutative algebra. Define the expectation functional on it by $E[A] = \langle A\Omega, \Omega \rangle$. Then $(\mathcal{A}, E)$ = probability space.
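For a one-dimensional $H$ with $\|h\| = 1$, this construction reduces to the familiar harmonic-oscillator ladder operators, and $E[X_h^k] = \langle X_h^k \Omega, \Omega \rangle$ can be read off from a truncated matrix model. A sketch (the truncation level `N` is an arbitrary choice; moments of order below `N` come out exact):

```python
import numpy as np

# Truncated symmetric Fock space over one-dimensional H, with |h| = 1.
# In the orthonormal basis e_n = h^{(.) n} / sqrt(n!), the operators act as
#   a+ e_n = sqrt(n+1) e_{n+1},   a- e_n = sqrt(n) e_{n-1},   a- e_0 = 0.
N = 12
ap = np.zeros((N, N))
for n in range(N - 1):
    ap[n + 1, n] = np.sqrt(n + 1)  # creation
am = ap.T                          # annihilation = adjoint of creation
X = ap + am                        # X_h = a+ + a-, self-adjoint

# E[X^k] = <X^k Omega, Omega> = (X^k)[0, 0], with Omega = e_0.
moments = [np.linalg.matrix_power(X, k)[0, 0] for k in range(7)]
print(np.round(moments, 6))  # [1, 0, 1, 0, 3, 0, 15]: moments of N(0, 1)
```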

Wick formula.

$$E[X_{h_1} X_{h_2} \cdots X_{h_n}] = \langle (a_{h_1}^+ + a_{h_1}^-)(a_{h_2}^+ + a_{h_2}^-) \cdots (a_{h_n}^+ + a_{h_n}^-) \Omega, \Omega \rangle = \sum_{\pi \in \mathcal{P}_2(n)} \prod_{(i,j) \in \pi} \langle h_i, h_j \rangle.$$

Therefore
$$c_k(X_h) = \begin{cases} \|h\|^2, & k = 2, \\ 0, & \text{otherwise,} \end{cases}$$
and so $X_h \sim \mathcal{N}(0, \|h\|^2)$.

If $h \perp g$, then $X_h, X_g$ are independent.
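The pair-partition sum is easy to evaluate directly for small $n$. The sketch below (with hypothetical helper names `pair_partitions` and `wick`) recovers the Gaussian moment $3 = 3!!$ for $\|h\| = 1$, and illustrates the factorization $E[X_h^2 X_g^2] = E[X_h^2] E[X_g^2]$ when $h \perp g$.

```python
from math import prod

def pair_partitions(indices):
    """Enumerate all pairings of an even-length list as lists of (i, j) pairs."""
    if not indices:
        yield []
        return
    first = indices[0]
    for k in range(1, len(indices)):
        rest = indices[1:k] + indices[k + 1:]
        for tail in pair_partitions(rest):
            yield [(first, indices[k])] + tail

def wick(vectors, dot):
    """E[X_{h_1} ... X_{h_n}] = sum over pair partitions of prod <h_i, h_j>."""
    if len(vectors) % 2:
        return 0.0  # no pairings of an odd number of points
    return sum(
        prod(dot(vectors[i], vectors[j]) for i, j in pairs)
        for pairs in pair_partitions(list(range(len(vectors))))
    )

dot = lambda u, v: u[0] * v[0] + u[1] * v[1]
h, g = (1.0, 0.0), (0.0, 2.0)    # orthogonal; |h|^2 = 1, |g|^2 = 4

print(wick([h] * 4, dot))        # 3.0 = (4-1)!! * |h|^4
print(wick([h, h, g, g], dot))   # 4.0 = |h|^2 |g|^2: factorizes, independence
```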