Texas A&M University, May 1, 2012
Probability spaces. $(\Lambda, \mathcal{M}, P)$ = measure space. Probability space: $P$ a probability measure, $P(\Lambda) = 1$. Algebra $\mathcal{A} = L^\infty(\Lambda, P)$ of bounded random variables. $E[X] = \int_\Lambda X \, dP$ = expectation functional on $\mathcal{A}$. For each real-valued $X$, have $\mu_X$ = probability measure on $\mathbb{R}$ defined by
$$\int_{\mathbb{R}} f(x) \, d\mu_X(x) = \int_\Lambda f(X) \, dP = E[f(X)]$$
for $f \in C_0(\mathbb{R})$. $\mu_X$ = distribution of $X$.
Independence. More generally, if $X_1, X_2, \ldots, X_n$ are random variables, $\mu_{X_1, X_2, \ldots, X_n}$ = measure on $\mathbb{R}^n$ = joint distribution.

Definition. $X, Y$ are independent if $\mu_{X,Y} = \mu_X \times \mu_Y$ (product measure), i.e.
$$E[f(X) g(Y)] = E[f(X)] \, E[g(Y)].$$

Remark. If $X, Y$ are independent,
$$\int f(t) \, d\mu_{X+Y}(t) = \int f(x+y) \, d\mu_{X,Y}(x, y) = \int f(x+y) \, d(\mu_X \times \mu_Y)(x, y) = \int f(t) \, d(\mu_X * \mu_Y)(t).$$
So in this case, $\mu_{X+Y} = \mu_X * \mu_Y$, the convolution.
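The convolution identity can be checked concretely for discrete distributions. A minimal Python sketch (the dice example and all names are illustrative, not from the talk): the pmf of the sum of two independent fair dice is the convolution of their pmfs, and it agrees with brute-force summation over the product measure.

```python
import numpy as np

# Distribution of a fair die: uniform on {1,...,6}.
die = np.full(6, 1 / 6)

# If X, Y are independent dice, mu_{X+Y} = mu_X * mu_Y (convolution):
# index k of pmf_sum is the probability that X + Y = k + 2.
pmf_sum = np.convolve(die, die)

# Compare with direct enumeration of the joint (product) measure.
brute = np.zeros(11)
for x in range(6):
    for y in range(6):
        brute[x + y] += die[x] * die[y]

assert np.allclose(pmf_sum, brute)
```

For instance, `pmf_sum[5]` is $P(X + Y = 7) = 6/36$.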
Fourier transform. Definition. The Fourier transform of $X$ is
$$\mathcal{F}_X(\theta) = \int e^{i\theta x} \, d\mu_X(x) = E[e^{i\theta X}].$$

Lemma. If $X, Y$ are independent, $\mathcal{F}_{X+Y}(\theta) = \mathcal{F}_X(\theta) \, \mathcal{F}_Y(\theta)$.

Proof. $E[e^{i\theta(X+Y)}] = E[e^{i\theta X} e^{i\theta Y}] = E[e^{i\theta X}] \, E[e^{i\theta Y}]$.
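The factorization lemma can be verified by Monte Carlo. In the sketch below (the distributions, sample size, and tolerance are my choices, not from the talk), both sides are estimated empirically for two independent samples:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
X = rng.exponential(1.0, n) - 1.0   # centered exponential
Y = rng.uniform(-1.0, 1.0, n)       # independent of X

def char_fn(sample, theta):
    """Empirical Fourier transform F(theta) = E[exp(i*theta*X)]."""
    return np.mean(np.exp(1j * theta * sample))

theta = 0.7
lhs = char_fn(X + Y, theta)                       # F_{X+Y}(theta)
rhs = char_fn(X, theta) * char_fn(Y, theta)       # F_X(theta) F_Y(theta)

# The two agree up to Monte Carlo error of order 1/sqrt(n).
assert abs(lhs - rhs) < 0.02
```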
Combinatorics.
$$\mathcal{F}_X(\theta) = \int e^{i\theta x} \, d\mu_X(x) = \sum_{n=0}^{\infty} \frac{(i\theta)^n}{n!} m_n(X), \qquad m_n = \int x^n \, d\mu_X(x) = E[X^n].$$
$\{m_0, m_1, m_2, \ldots\}$ = moments of $X$.

For $X, Y$ independent, $m_n(X+Y)$ is complicated. But: $\mathcal{F}_{X+Y}(\theta) = \mathcal{F}_X(\theta) \, \mathcal{F}_Y(\theta)$, so $\log \mathcal{F}_{X+Y}(\theta) = \log \mathcal{F}_X(\theta) + \log \mathcal{F}_Y(\theta)$. Denote $\ell_X(\theta) = \log \mathcal{F}_X(\theta)$.
Cumulants.
$$\ell_X(\theta) = \sum_{n=1}^{\infty} \frac{(i\theta)^n}{n!} c_n,$$
$\{c_1, c_2, c_3, \ldots\}$ = cumulants of $X$. Then $c_n(X+Y) = c_n(X) + c_n(Y)$.

Relation between $\{m_n\}$ and $\{c_n\}$?

A set partition: $\{(1,3,4), (2,7), (5), (6)\} \in \mathcal{P}(7)$, the set partitions of $\{1, \ldots, 7\}$.
Moment-cumulant formula. Proposition.
$$m_n = \sum_{\pi \in \mathcal{P}(n)} \prod_{B \in \pi} c_{|B|}.$$

$m_1 = c_1$, so $c_1 = m_1$ (mean).
$m_2 = c_2 + c_1^2$, so $c_2 = m_2 - m_1^2$ (variance).
$m_3 = c_3 + 3 c_2 c_1 + c_1^3$, so $c_3 = m_3 - 3 m_2 m_1 + 2 m_1^3$.

Proof idea: with $\ell = \log \mathcal{F}$, $\mathcal{F} = e^\ell$, so $\mathcal{F}' = \ell' \mathcal{F}$, which gives the recursion
$$m_{n+1} = \sum_{k=0}^{n} \binom{n}{k} c_{k+1} m_{n-k}.$$
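The proposition can be tested directly by enumerating set partitions. A small Python sketch (the helper names and test cumulant values are mine): it recovers $m_1 = c_1$, $m_2 = c_2 + c_1^2$, and $m_3 = c_3 + 3 c_2 c_1 + c_1^3$.

```python
def set_partitions(s):
    """Yield all partitions of the list s into nonempty blocks."""
    if not s:
        yield []
        return
    first, rest = s[0], s[1:]
    for p in set_partitions(rest):
        # put `first` in its own block...
        yield [[first]] + p
        # ...or insert it into each existing block in turn
        for i in range(len(p)):
            yield p[:i] + [[first] + p[i]] + p[i + 1:]

def moment(n, c):
    """m_n = sum over partitions pi in P(n), product over blocks B of c_{|B|}."""
    total = 0
    for p in set_partitions(list(range(n))):
        prod = 1
        for block in p:
            prod *= c[len(block)]
        total += prod
    return total

# c[k] = k-th cumulant (c[0] unused); arbitrary test values
c = [0, 2, 5, 7]
m1, m2, m3 = moment(1, c), moment(2, c), moment(3, c)
assert m1 == c[1]
assert m2 == c[2] + c[1] ** 2
assert m3 == c[3] + 3 * c[2] * c[1] + c[1] ** 3
```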
Central limit theorem. Theorem. Let $\{X_n : n \in \mathbb{N}\}$ be independent, identically distributed, mean 0, variance $v$: $E[X_n] = 0$, $E[X_n^2] = v$. Let
$$S_n = \frac{X_1 + X_2 + \ldots + X_n}{\sqrt{n}}.$$
Then the moments of $S_n$ converge to the moments of the normal distribution $N(0, v)$.
Central limit theorem. Proof. For each $k$, $c_k(\alpha X) = \alpha^k c_k(X)$. So
$$c_k(S_n) = c_k\left(\frac{X_1 + X_2 + \ldots + X_n}{\sqrt{n}}\right) = \frac{n}{(\sqrt{n})^k} \, c_k(X_1).$$

$(k = 1)$ $c_1(X_1) = 0$, so $c_1(S_n) = 0$.
$(k = 2)$ $c_2(X_1) = v$, so $c_2(S_n) = v$.
$(k > 2)$ $\dfrac{n}{n^{k/2}} \to 0$, so $c_k(S_n) \to 0$.

In the limit, get whichever distribution has
$$c_k = \begin{cases} v, & k = 2, \\ 0, & \text{otherwise.} \end{cases}$$
Check: the normal distribution. Note $m_n = \sum_{\pi \in \mathcal{P}_2(n)} v^{n/2}$, a sum over pair partitions.
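The scaling argument becomes concrete for a distribution with known cumulants. Assuming (my choice of example) a centered Exp(1) variable, whose cumulants are $c_1 = 0$ and $c_k = (k-1)!$ for $k \ge 2$, the sketch below tabulates $c_k(S_n) = \frac{n}{n^{k/2}} c_k(X)$ exactly:

```python
from math import factorial

def c_X(k):
    """Cumulants of a centered Exp(1) variable: c_1 = 0, c_k = (k-1)! for k >= 2."""
    return 0 if k == 1 else factorial(k - 1)

def c_Sn(k, n):
    """c_k(S_n) for S_n = (X_1 + ... + X_n)/sqrt(n): additivity plus scaling."""
    return n * c_X(k) / n ** (k / 2)

for n in (10, 100, 10_000):
    print(n, [round(c_Sn(k, n), 4) for k in (1, 2, 3, 4)])
# c_1 and c_2 are 0 and v = 1 for every n, while c_3, c_4, ... vanish as n grows
```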
Operators. $H$ = real Hilbert space, e.g. $\mathbb{R}^n$. $H_{\mathbb{C}}$ = its complexification ($\mathbb{C}^n$).

$H_{\mathbb{C}}^{\odot n} = H_{\mathbb{C}} \odot H_{\mathbb{C}} \odot \ldots \odot H_{\mathbb{C}}$ = symmetric tensor product = $\operatorname{Span}(\{h_1 \odot h_2 \odot \ldots \odot h_n\})$, order immaterial, with the inner product
$$\langle h_1 \odot \ldots \odot h_n, \, g_1 \odot \ldots \odot g_n \rangle = \sum_{\sigma \in \operatorname{Sym}(n)} \langle h_1, g_{\sigma(1)} \rangle \cdots \langle h_n, g_{\sigma(n)} \rangle.$$
Creation and annihilation operators. Symmetric Fock space
$$\mathcal{F}(H_{\mathbb{C}}) = \bigoplus_{n=0}^{\infty} H_{\mathbb{C}}^{\odot n} = \mathbb{C}\Omega \oplus H_{\mathbb{C}} \oplus H_{\mathbb{C}}^{\odot 2} \oplus H_{\mathbb{C}}^{\odot 3} \oplus \ldots,$$
$\Omega$ = vacuum vector.

For $h \in H$, define $a_h^+$, $a_h^-$ on $\mathcal{F}(H_{\mathbb{C}})$ by
$$a_h^+(f_1 \odot \ldots \odot f_n) = h \odot f_1 \odot \ldots \odot f_n,$$
$$a_h^-(f_1 \odot \ldots \odot f_n) = \sum_{i=1}^{n} \langle f_i, h \rangle \, f_1 \odot \ldots \odot \hat{f_i} \odot \ldots \odot f_n, \qquad a_h^-(f) = \langle f, h \rangle \, \Omega,$$
the creation and annihilation operators ($\hat{f_i}$ means $f_i$ is omitted).
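For $H = \mathbb{R}$ with $\|h\| = 1$, these operators become explicit matrices on a truncated Fock space: in the orthonormal basis $f_n = h^{\odot n}/\sqrt{n!}$ one gets $a^+ f_n = \sqrt{n+1}\, f_{n+1}$ and $a^- f_n = \sqrt{n}\, f_{n-1}$. A numpy sketch (the truncation level is my choice; moments of order at most $N$ are unaffected by it), which already previews that $X = a^+ + a^-$ has the Gaussian moments $1, 0, 1, 0, 3, 0, 15, \ldots$:

```python
import numpy as np

N = 12  # truncation level: moments of order <= N come out exactly
# Orthonormal basis f_0, ..., f_N; f_0 = vacuum Omega.
ap = np.diag(np.sqrt(np.arange(1, N + 1)), -1)  # creation: f_n -> sqrt(n+1) f_{n+1}
am = ap.T                                        # annihilation = adjoint of creation
Omega = np.zeros(N + 1)
Omega[0] = 1.0

X = ap + am  # X_h = a+ + a-, a self-adjoint (symmetric) matrix
moments = [np.linalg.matrix_power(X, n) @ Omega @ Omega for n in range(9)]
# moments[n] = <X^n Omega, Omega>: even moments 1, 1, 3, 15, 105; odd moments 0
```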
Operator algebra. Check: $a_h^- = (a_h^+)^*$, the adjoint. So $X_h = a_h^+ + a_h^-$ is self-adjoint.

$a^+$, $a^-$ do not commute: $a_h^- a_g^+ - a_g^+ a_h^- = \langle g, h \rangle$.

But $X_h$, $X_g$ commute. $\mathcal{A} = \operatorname{Alg}\{X_h : h \in H\}$ = commutative algebra. Define the expectation functional on it by $E[A] = \langle A\Omega, \Omega \rangle$. $(\mathcal{A}, E)$ = probability space.
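These relations can be checked numerically for $H = \mathbb{R}^2$, using the factorization $\mathcal{F}(H_1 \oplus H_2) \cong \mathcal{F}(H_1) \otimes \mathcal{F}(H_2)$ to build the operators as Kronecker products of one-mode matrices (the truncation level and test vectors are my choices, not from the talk):

```python
import numpy as np

N = 6  # per-mode truncation level
ap1 = np.diag(np.sqrt(np.arange(1, N + 1)), -1)  # one-mode creation matrix
am1 = ap1.T                                       # one-mode annihilation
I = np.eye(N + 1)

# Each mode acts on its own tensor factor of F(H_1) (x) F(H_2).
Ap = [np.kron(ap1, I), np.kron(I, ap1)]
Am = [np.kron(am1, I), np.kron(I, am1)]

def a_plus(h):   # a_h^+ for h = (h1, h2) in R^2
    return h[0] * Ap[0] + h[1] * Ap[1]

def a_minus(h):  # a_h^- = adjoint of a_h^+ (real coefficients)
    return h[0] * Am[0] + h[1] * Am[1]

h = np.array([1.0, 2.0])
g = np.array([0.5, -1.0])
Xh = a_plus(h) + a_minus(h)
Xg = a_plus(g) + a_minus(g)

# X_h and X_g commute, even though a+ and a- do not:
assert np.allclose(Xh @ Xg, Xg @ Xh)

# On the vacuum: a_h^- a_g^+ Omega = <g, h> Omega, while a_g^+ a_h^- Omega = 0.
Omega = np.zeros((N + 1) ** 2)
Omega[0] = 1.0
C = a_minus(h) @ a_plus(g) - a_plus(g) @ a_minus(h)
assert np.isclose(C @ Omega @ Omega, g @ h)
```

Away from the truncation boundary the commutator $C$ is $\langle g, h \rangle$ times the identity, but on the truncated space this holds exactly only on low-lying states, which is why the check is done on the vacuum.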
Wick formula.
$$E[X_{h_1} X_{h_2} \cdots X_{h_n}] = \langle (a_{h_1}^+ + a_{h_1}^-)(a_{h_2}^+ + a_{h_2}^-) \cdots (a_{h_n}^+ + a_{h_n}^-) \Omega, \Omega \rangle = \sum_{\pi \in \mathcal{P}_2(n)} \prod_{(i,j) \in \pi} \langle h_i, h_j \rangle.$$

Therefore:
$$c_k(X_h) = \begin{cases} \|h\|^2, & k = 2, \\ 0, & \text{otherwise,} \end{cases}$$
and so $X_h \sim N(0, \|h\|^2)$.

If $h \perp g$, then $X_h$, $X_g$ are independent.
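The Wick formula can be verified numerically for $n = 4$ with the truncated two-mode construction (the vectors $h_i$ are arbitrary test data, and the names are mine): the operator expectation equals the sum over the three pair partitions of $\{1, 2, 3, 4\}$.

```python
import numpy as np

N = 6  # per-mode truncation; exact for fourth-order expectations
ap1 = np.diag(np.sqrt(np.arange(1, N + 1)), -1)  # one-mode creation
am1 = ap1.T
I = np.eye(N + 1)
Ap = [np.kron(ap1, I), np.kron(I, ap1)]
Am = [np.kron(am1, I), np.kron(I, am1)]

def X(h):  # X_h = a_h^+ + a_h^- for h in R^2
    return sum(h[i] * (Ap[i] + Am[i]) for i in range(2))

Omega = np.zeros((N + 1) ** 2)
Omega[0] = 1.0

hs = [np.array([1.0, 0.5]), np.array([0.0, 2.0]),
      np.array([-1.0, 1.0]), np.array([0.5, 0.5])]

# Left side: E[X_{h1} X_{h2} X_{h3} X_{h4}] = <X_{h1}...X_{h4} Omega, Omega>
lhs = (X(hs[0]) @ X(hs[1]) @ X(hs[2]) @ X(hs[3]) @ Omega) @ Omega

# Right side: sum over the three pair partitions of {1,2,3,4}
pairings = [((0, 1), (2, 3)), ((0, 2), (1, 3)), ((0, 3), (1, 2))]
rhs = sum(np.prod([hs[i] @ hs[j] for (i, j) in p]) for p in pairings)

assert np.isclose(lhs, rhs)
```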