A Brief Analysis of the Central Limit Theorem. SIAM Chapter, Florida State University
A Brief Analysis of the Central Limit Theorem
Omid Khanmohamadi (okhanmoh@math.fsu.edu), Diego Hernán Díaz Martínez (ddiazmar@math.fsu.edu), Tony Wills (twills@math.fsu.edu), Kouadio David Yao (kyao@math.fsu.edu)
SIAM Chapter, Florida State University. March 17, 2014
Outline
- Examples
- Statement of Theorem
- Modes of Convergence
- Fourier Transform and Convolution
- Outline of Proof
- Generalizations
From Concrete to Abstract: Examples, then Theorems!
"You should start with understanding the interesting examples and build up to explain what the general phenomena are. This was your progress from initial understanding to more understanding." — Michael Atiyah [image source: Wikipedia]
"The source of all great mathematics is the special case, the concrete example. It is frequent in mathematics that every instance of a concept of seemingly great generality is in essence the same as a small and concrete special case." — Paul Halmos [image source: Wikipedia]
Sum of Dice Throws is (Eventually) Normally Distributed
Comparison of probability density functions $p(k)$ for the sum of $n$ fair 6-sided dice ($n = 1, \ldots, 5$), showing convergence to a normal distribution with increasing $n$ [image source: Wikipedia]
Dice Throws (Cont'd)
Roll a fair die $10^9$ times, with each roll independent of the others. (Fair = all faces have equal probability; the rolls are identically distributed.)
Let $X_i$ be the number that comes up on the $i$th roll and let $S_{10^9} = \sum_{i=1}^{10^9} X_i$ be the total (sum) of the numbers rolled.
The probability that $S_{10^9}$ is less than $x$ standard deviations above its mean is (approximately) $\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-t^2/2}\, dt$.
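The dice claim can be sanity-checked by simulation. A minimal Monte Carlo sketch (function names are ours, not from the slides; we use 50 dice rather than $10^9$, which is already enough to land close to the normal approximation):

```python
import math
import random

def std_normal_cdf(x):
    """Phi(x), the standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def dice_sum_below(n_dice, x, trials=40000, seed=0):
    """Monte Carlo estimate of Pr(S < mean + x*sd) for the sum S of
    n_dice fair dice; mean = 3.5*n_dice, variance = (35/12)*n_dice."""
    rng = random.Random(seed)
    threshold = 3.5 * n_dice + x * math.sqrt(35.0 * n_dice / 12.0)
    hits = sum(
        sum(rng.randint(1, 6) for _ in range(n_dice)) < threshold
        for _ in range(trials)
    )
    return hits / trials
```

For 50 dice, `dice_sum_below(50, 1.0)` already lies close to $\Phi(1) \approx 0.841$.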
Definitions and Assumptions
Let $X_1, X_2, \ldots, X_n$ be a sequence of i.i.d. random variables, each with mean $\mu = 0$ and variance $\sigma^2 = 1$. Let $S_n = \sum_{i=1}^n X_i$. (Any other finite $\mu$ and $\sigma^2$ may be reduced to this case.)
$E\!\left[\frac{S_n}{\sqrt{n}}\right] = \frac{1}{\sqrt{n}}\, E[S_n] = \frac{1}{\sqrt{n}} \sum_{i=1}^n E[X_i] = 0$. The mean $E$ is a linear function.
$\mathrm{Var}\!\left[\frac{S_n}{\sqrt{n}}\right] = \left(\frac{1}{\sqrt{n}}\right)^2 \mathrm{Var}[S_n] = \frac{1}{n} \sum_{i=1}^n \mathrm{Var}[X_i] = \frac{1}{n} \cdot n = 1$. Var is not a linear function; it distributes over sums (when the random variables are independent) and it squares scalar multipliers.
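These two calculations say that $S_n/\sqrt{n}$ has mean 0 and variance 1 for every $n$, which is easy to confirm empirically. A small sketch using Rademacher ($\pm 1$) variables, which already have $\mu = 0$ and $\sigma^2 = 1$ (names are ours):

```python
import random
import statistics

def normalized_sum_samples(n, trials=5000, seed=1):
    """Samples of S_n / sqrt(n), where S_n is a sum of n iid
    Rademacher (+1 or -1) variables with mean 0 and variance 1."""
    rng = random.Random(seed)
    root_n = n ** 0.5
    return [
        sum(rng.choice((-1, 1)) for _ in range(n)) / root_n
        for _ in range(trials)
    ]
```

The sample mean of these draws hovers near 0 and the sample variance near 1, independent of $n$, exactly as the linearity and scaling rules predict.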
Definitions and Assumptions (cont'd)
The Central Limit Theorem is a statement about the so-called normalized sum, defined as $\frac{S_n - n\mu}{\sqrt{n}\,\sigma}$, which in our case is $\frac{S_n}{\sqrt{n}}$.
The normalized sum is the difference between the sum $S_n$ and its expected value $n\mu$, measured relative to (in units of) the standard deviation $\sqrt{n}\,\sigma$; it measures how many standard deviations the sum is from its expected value.
Statement of the Central Limit Theorem
With the assumptions of the previous slide, we have
$\Pr\!\left(a \le \frac{S_n}{\sqrt{n}} \le b\right) \to \frac{1}{\sqrt{2\pi}} \int_a^b e^{-t^2/2}\, dt$ as $n \to \infty$.
Convergence ($\to$) is in distribution; it is not in probability or almost sure.
Convergence is not uniform: the tails of the distribution converge more slowly than its center.
Convergence in Distribution
The Central Limit Theorem is expressed in terms of convergence in distribution, which is defined as follows.
Definition (Convergence in Distribution). A sequence of random variables $X_1, \ldots, X_n, \ldots$ converges in distribution to $X$ if $F_{X_n}(x) \to F_X(x)$ as $n \to \infty$ at all points $x$ where $F_X$ is continuous, where $F_X$ represents the distribution function of the random variable $X$, given by $F_X(x) := \Pr(X \le x)$.
Characteristic Function and its Relation to Convergence in Distribution
Definition (Characteristic function). Let $F_X$ be the distribution function of the random variable $X$. The characteristic function of $X$ is the function $\varphi_X$ given by
$\varphi_X(\xi) = E[e^{i\xi X}] = \int e^{i\xi x}\, dF_X(x) = \int f_X(x)\, e^{i\xi x}\, dx$,
where $f_X$ is the density function of $X$ (if it exists).
The characteristic function of a real-valued random variable completely determines its probability distribution. Notice the relation to the Fourier transform when the density $f_X$ exists.
Convergence in distribution is equivalent to pointwise convergence of the characteristic functions (Lévy's continuity theorem).
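For a discrete random variable the defining integral becomes a finite sum, which makes characteristic functions easy to tabulate. A small sketch (the helper name is ours): for a Rademacher ($\pm 1$) variable the characteristic function works out to $\cos\xi$, a fact used again later in the proof section.

```python
import cmath
import math

def char_fn(pmf, xi):
    """E[e^{i*xi*X}] for a discrete X given as a dict {value: probability}."""
    return sum(p * cmath.exp(1j * xi * x) for x, p in pmf.items())

# Two examples: a fair die, and a Rademacher (+1/-1) variable
# whose characteristic function is cos(xi).
die = {k: 1.0 / 6.0 for k in range(1, 7)}
rademacher = {-1: 0.5, 1: 0.5}
```

As required of any characteristic function, $\varphi(0) = 1$ and $|\varphi(\xi)| \le 1$ everywhere.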
Fourier Transform Pair
The convention we will be using is that the (one-dimensional) Fourier transform of a function $f(x)$ is $\hat f(\xi) = \int f(x)\, e^{i\xi x}\, dx$, and the inverse Fourier transform of a function $\hat f(\xi)$ is $f(x) = \frac{1}{2\pi} \int \hat f(\xi)\, e^{-i\xi x}\, d\xi$.
Convolution
If $f$ and $g$ are integrable functions, we define the convolution $f * g$ by $(f * g)(x) = \int f(x - y)\, g(y)\, dy$.
Convolution is sometimes also known by its German name, Faltung ("folding"). Later, in the proof section, we use the $n$-fold convolution, which means convolution repeated $n$ times.
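In the discrete case the convolution integral becomes a sum over pairs of values, and it computes exactly the PMF of a sum of independent variables. A sketch (function names are ours):

```python
def convolve_pmf(p, q):
    """PMF of X + Y for independent discrete X ~ p and Y ~ q: the
    discrete analogue of (f*g)(x) = integral of f(x - y) g(y) dy."""
    out = {}
    for x, px in p.items():
        for y, qy in q.items():
            out[x + y] = out.get(x + y, 0.0) + px * qy
    return out

def n_fold(pmf, n):
    """n-fold convolution: the PMF of a sum of n iid terms."""
    acc = pmf
    for _ in range(n - 1):
        acc = convolve_pmf(acc, pmf)
    return acc
```

Convolving the fair-die PMF with itself reproduces the triangular distribution from the opening example; for instance, two dice sum to 7 with probability $6/36$.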
Basic Properties of the Fourier Transform
There are a few basic properties of the Fourier transform that we will need. In particular, we need to know what the Fourier transform does to scaling, to a Gaussian, and to convolution.
Scaling: for a nonzero real number $\alpha$, if $g(x) = f(\alpha x)$, then $\hat g(\xi) = \frac{1}{|\alpha|}\, \hat f\!\left(\frac{\xi}{\alpha}\right)$.
Gaussian: if $f(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$, then $\hat f(\xi) = \sqrt{2\pi}\, f(\xi) = e^{-\xi^2/2}$.
Convolution: under the Fourier transform, convolution becomes multiplication: $\widehat{(f * g)}(\xi) = \hat f(\xi)\, \hat g(\xi)$.
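The convolution property has an exact finite analogue worth checking by hand: the discrete Fourier transform of a circular convolution equals the pointwise product of the DFTs. A self-contained sketch straight from the definitions, with no FFT library (names are ours):

```python
import cmath

def dft(a):
    """Discrete Fourier transform of the sequence a (O(n^2), by definition)."""
    n = len(a)
    return [
        sum(a[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
        for k in range(n)
    ]

def circular_convolve(a, b):
    """Circular convolution (a*b)[t] = sum over s of a[(t-s) mod n] * b[s]."""
    n = len(a)
    return [sum(a[(t - s) % n] * b[s] for s in range(n)) for t in range(n)]
```

Transforming `circular_convolve(a, b)` gives the same list (up to rounding) as multiplying `dft(a)` and `dft(b)` entrywise, mirroring $\widehat{f * g} = \hat f\, \hat g$.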
Overview, View, Review!
"Tell them what you're going to tell them, tell them, and tell them what you told them." — Paul Halmos [image source: Wikipedia]
An Overview of the Outline of the Proof
Our goal is to outline the steps in showing
$\Pr\!\left(a \le \frac{S_n}{\sqrt{n}} \le b\right) \to \frac{1}{\sqrt{2\pi}} \int_a^b e^{-t^2/2}\, dt$.
1. Write the density of the sum $S_n$ in terms of the density of its i.i.d. terms $X_i$ (by using an $n$-fold convolution) to go from $f$ to $f_{S_n}$.
2. Find the effect of scaling on the density (by using a substitution in the integral) to go from $f_{S_n}$ to $f_{S_n/\sqrt{n}}$.
3. Use the scaling results for the Fourier transform and the density, as well as the convolution property, to go from $f_{S_n/\sqrt{n}}$ to $\hat f_{S_n/\sqrt{n}}$.
4. Expand $\hat f$ around zero to find a useful converging expression.
5. Rewrite that converging expression for $\hat f_{S_n/\sqrt{n}}$ to get convergence to a Gaussian.
6. Take the inverse Fourier transform to arrive at the standard Gaussian density.
Step 1: From $f$ to $f_{S_n}$: $n$-fold Convolution
We show the result for two i.i.d. variables $X_1$ and $X_2$, with identical distributions $F_{X_1} = F_{X_2} =: F$ and densities $f_{X_1} = f_{X_2} =: f$.
$f_{X_1+X_2}(a) = \frac{d}{da} F_{X_1+X_2}(a) = \frac{d}{da} \Pr\{X_1 + X_2 \le a\}$.
$F_{X_1+X_2}(a)$ is given by the integral of $f_{X_1}(x_1)\, f_{X_2}(x_2) = f(x_1)\, f(x_2)$ over $\{(x_1, x_2) : x_1 + x_2 \le a\}$:
$F_{X_1+X_2}(a) = \Pr\{X_1 + X_2 \le a\} = \int_{-\infty}^{\infty} \int_{-\infty}^{a - x_2} f(x_1)\, f(x_2)\, dx_1\, dx_2 = \int F(a - x)\, f(x)\, dx$.
Differentiation gives
$f_{X_1+X_2}(a) = \frac{d}{da} \int F(a - x)\, f(x)\, dx = \int f(a - x)\, f(x)\, dx = (f * f)(a)$.
Step 2: From $f_{S_n}$ to $f_{S_n/\sqrt{n}}$: Effect of Scaling on Density
The Central Limit Theorem involves the probability $\Pr\!\left(a \le \frac{S_n}{\sqrt{n}} \le b\right)$.
Notice that if the density of $S_n$ is $f_{S_n}(t)$, then
$\Pr\!\left(a \le \frac{S_n}{\sqrt{n}} \le b\right) = \Pr\!\left(a\sqrt{n} \le S_n \le b\sqrt{n}\right) = \int_{a\sqrt{n}}^{b\sqrt{n}} f_{S_n}(t)\, dt = \int_a^b \sqrt{n}\, f_{S_n}(\sqrt{n}\, s)\, ds$
by making the substitution $s = t/\sqrt{n}$. This shows that the density of $S_n/\sqrt{n}$ is $\sqrt{n}\, f_{S_n}(\sqrt{n}\, t)$.
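The scaling rule is just a change of variables, and the factor $\sqrt{n}$ is exactly what keeps total probability equal to 1. A numerical sketch on a grid (names are ours; we scale a standard Gaussian as the test density):

```python
import math

def gaussian_density(t):
    """Standard normal density."""
    return math.exp(-t * t / 2.0) / math.sqrt(2.0 * math.pi)

def scaled_density(f, c):
    """If X has density f, then X / c has density t -> c * f(c * t)."""
    return lambda t: c * f(c * t)

def integrate(f, lo, hi, steps=4000):
    """Composite midpoint-rule integral of f over [lo, hi]."""
    h = (hi - lo) / steps
    return sum(f(lo + (i + 0.5) * h) for i in range(steps)) * h
```

Scaling the standard normal by $c = 3$ still integrates to 1, and its second moment becomes $1/9$, consistent with $\mathrm{Var}(X/c) = \mathrm{Var}(X)/c^2$.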
Step 3: From $f_{S_n/\sqrt{n}}$ to $\hat f_{S_n/\sqrt{n}}$
Now we have everything we need to get from the density $f$ of a sequence of i.i.d. random variables to the characteristic function $\hat f_{S_n/\sqrt{n}}(\xi)$ of the corresponding normalized sum $S_n/\sqrt{n}$:
$f_{S_n}(t) = (f * \cdots * f)(t)$, so $\hat f_{S_n}(\xi) = \widehat{(f * \cdots * f)}(\xi) = \big(\hat f(\xi)\big)^n$.
$f_{S_n/\sqrt{n}}(t) = \sqrt{n}\, f_{S_n}(\sqrt{n}\, t)$, so by the scaling property
$\hat f_{S_n/\sqrt{n}}(\xi) = \sqrt{n} \cdot \frac{1}{\sqrt{n}}\, \hat f_{S_n}\!\left(\frac{\xi}{\sqrt{n}}\right) = \hat f_{S_n}\!\left(\frac{\xi}{\sqrt{n}}\right) = \left(\hat f\!\left(\frac{\xi}{\sqrt{n}}\right)\right)^n$.
Step 4: Taylor Expansion of $\hat f$ at 0
The Fourier transform of the density $f$ (identical for all $X_i$) is $\hat f(\xi) = \int e^{i\xi x}\, f(x)\, dx$.
Differentiation under the integral sign can be done, so the Taylor series is
$\hat f(\xi) = \hat f(0) + \hat f'(0)\, \xi + \frac{1}{2}\, \hat f''(0)\, \xi^2 + \epsilon(\xi)\, \xi^2$ as $\xi \to 0$, in which limit $\epsilon(\xi) \to 0$ also.
Observe that
$\hat f(0) = \int f(x)\, dx = 1$,
$\hat f'(0) = i \int x\, f(x)\, dx = 0$ (mean 0),
$\hat f''(0) = -\int x^2 f(x)\, dx = -1$ (variance 1).
Taylor Expansion of $\hat f$ at 0 (cont'd)
So $\hat f(\xi) = 1 - \frac{\xi^2}{2} + \epsilon(\xi)\, \xi^2$ as $\xi \to 0$, which is the same as
$\hat f(\xi) - \left(1 - \frac{\xi^2}{2}\right) = \epsilon(\xi)\, \xi^2 \to 0$ as $\xi \to 0$.
Step 5: Convergence of $\hat f_{S_n/\sqrt{n}}(\xi)$ to $e^{-\xi^2/2}$
Hoping that we may get a similar convergence result for $\hat f_{S_n/\sqrt{n}}$, we use the factorization $a^n - b^n = (a - b) \sum_{k=0}^{n-1} a^k b^{\,n-1-k}$:
$\left(\hat f\!\left(\xi/\sqrt{n}\right)\right)^n - \left(1 - \frac{\xi^2}{2n}\right)^n = \left[\hat f\!\left(\xi/\sqrt{n}\right) - \left(1 - \frac{\xi^2}{2n}\right)\right] \sum_{k=0}^{n-1} \left(\hat f\!\left(\xi/\sqrt{n}\right)\right)^k \left(1 - \frac{\xi^2}{2n}\right)^{n-1-k}$,
so
$\left|\left(\hat f\!\left(\xi/\sqrt{n}\right)\right)^n - \left(1 - \frac{\xi^2}{2n}\right)^n\right| \le \left|\hat f\!\left(\xi/\sqrt{n}\right) - \left(1 - \frac{\xi^2}{2n}\right)\right| \sum_{k=0}^{n-1} \left|\hat f\!\left(\xi/\sqrt{n}\right)\right|^k \left|1 - \frac{\xi^2}{2n}\right|^{n-1-k}$.
Convergence of $\hat f_{S_n/\sqrt{n}}(\xi)$ to $e^{-\xi^2/2}$ (cont'd)
Since $|\hat f(\xi)| \le \|\hat f\|_{L^\infty} \le \|f\|_{L^1} = 1$, and since for $n$ large enough $0 \le 1 - \frac{\xi^2}{2n} \le 1$, each summand above is at most 1, giving
$\left|\left(\hat f\!\left(\xi/\sqrt{n}\right)\right)^n - \left(1 - \frac{\xi^2}{2n}\right)^n\right| \le n \left|\hat f\!\left(\xi/\sqrt{n}\right) - \left(1 - \frac{\xi^2}{2n}\right)\right|$.
It is clear that $\xi/\sqrt{n} \to 0$ as $n \to \infty$, so by the Taylor expansion the right-hand side equals $n\, |\epsilon(\xi/\sqrt{n})|\, \xi^2/n \to 0$ as $n \to \infty$. Since $\left(1 - \frac{\xi^2}{2n}\right)^n \to e^{-\xi^2/2}$, we conclude
$\hat f_{S_n/\sqrt{n}}(\xi) = \left(\hat f\!\left(\xi/\sqrt{n}\right)\right)^n \to e^{-\xi^2/2}$.
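For a concrete distribution this convergence can be watched numerically. For Rademacher ($\pm 1$) terms the characteristic function is $\hat f(\xi) = \cos\xi$, so $\hat f_{S_n/\sqrt{n}}(\xi) = (\cos(\xi/\sqrt{n}))^n$, which should approach $e^{-\xi^2/2}$. A tiny sketch (names are ours):

```python
import math

def char_fn_normalized_sum(xi, n):
    """(phi(xi/sqrt(n)))^n for n iid Rademacher terms, phi(xi) = cos(xi)."""
    return math.cos(xi / math.sqrt(n)) ** n

def gaussian_char_fn(xi):
    """Characteristic function of the standard normal: e^{-xi^2/2}."""
    return math.exp(-xi * xi / 2.0)
```

The gap shrinks roughly like $1/n$, the rate suggested by the Taylor remainder of $\cos$.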
Step 6: Convergence of $f_{S_n/\sqrt{n}}(x)$ to $e^{-x^2/2}/\sqrt{2\pi}$: Inverse Fourier Transform
Taking the inverse Fourier transform, we obtain $f_{S_n/\sqrt{n}}(x) \to \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$ as $n \to \infty$, which is the conclusion of the Central Limit Theorem!
Observe that this is pointwise convergence of densities, which implies convergence in distribution (Scheffé's theorem).
Directions for Generalization
Three general versions of the CLT will be discussed:
- Lyapunov's CLT, which drops the hypothesis of identical distribution in exchange for a moment condition (Lyapunov's condition).
- Lindeberg's CLT, which weakens Lyapunov's condition further while keeping the same weak requirements on the distributions of the random variables.
- The multivariate CLT, which uses the covariance matrix of the random vectors for the generalization.
Lyapunov's CLT
Suppose $X_1, X_2, \ldots, X_n, \ldots$ is a sequence of independent random variables, each with finite expected value $\mu_i$ and variance $\sigma_i^2$ (not necessarily identically distributed). Let $s_n^2 = \sum_{i=1}^n \sigma_i^2$. If for some $\delta > 0$ the following condition (called the Lyapunov condition) holds:
$\lim_{n \to \infty} \frac{1}{s_n^{2+\delta}} \sum_{i=1}^n E\!\left[|X_i - \mu_i|^{2+\delta}\right] = 0$,
then $\frac{1}{s_n} \sum_{i=1}^n (X_i - \mu_i)$ converges in distribution to a standard normal random variable as $n \to \infty$.
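In the i.i.d. case with $\delta = 1$, Lyapunov's quantity reduces to $n\, E|X - \mu|^3 / (\sigma\sqrt{n})^3 = E|X - \mu|^3 / (\sigma^3 \sqrt{n})$, which visibly tends to 0. A tiny sketch of that computation (function name is ours):

```python
import math

def lyapunov_quantity_iid(n, sigma2, abs_moment, delta=1.0):
    """(1 / s_n^{2+delta}) * n * E|X - mu|^{2+delta} for n iid terms,
    where s_n^2 = n * sigma2 and abs_moment = E|X - mu|^{2+delta}."""
    s_n = math.sqrt(n * sigma2)
    return n * abs_moment / s_n ** (2.0 + delta)
```

For Rademacher terms ($\sigma^2 = 1$, $E|X|^3 = 1$) this is exactly $n^{-1/2}$, so the condition holds.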
Lindeberg's CLT
Suppose $X_1, X_2, \ldots, X_n, \ldots$ is a sequence of independent random variables, each with finite expected value $\mu_i$ and variance $\sigma_i^2$ (not necessarily identically distributed). Let $s_n^2 = \sum_{i=1}^n \sigma_i^2$. If for every $\epsilon > 0$ the following condition (called the Lindeberg condition) holds:
$\lim_{n \to \infty} \frac{1}{s_n^2} \sum_{i=1}^n E\!\left[(X_i - \mu_i)^2\, \mathbf{1}_{\{|X_i - \mu_i| > \epsilon s_n\}}\right] = 0$,
then $\frac{1}{s_n} \sum_{i=1}^n (X_i - \mu_i)$ converges in distribution to a standard normal random variable as $n \to \infty$.
Comparison of Finite-Variance Conditions
Lindeberg: $\int_{|x - \mu_i| > \epsilon s_n} (x - \mu_i)^2\, dF_i(x) < \infty$
Classical: $\int_{\mathbb{R}} (x - \mu)^2\, dF(x) < \infty$
Lyapunov: $\int_{\mathbb{R}} |x - \mu_i|^{2+\delta}\, dF_i(x) < \infty$
Observe that, in the classical CLT, $\mu_i = \mu$ and $f_i(x) = f(x)$ for all $i$.
Generalizations in a Nutshell: The CLT is Robust
If one has a lot of small random terms which are mostly independent and each contributes a small fraction of the total sum, then the total sum must be approximately normally distributed.
Multivariate CLT
Suppose $X_1, X_2, \ldots, X_n, \ldots \in \mathbb{R}^d$ is a sequence of i.i.d. random vectors, with finite mean vector $E[X_i] = \mu$ and finite covariance matrix $\Sigma$. Then
$\frac{1}{\sqrt{n}} \left( \sum_{i=1}^n X_i - n\mu \right) \to N_d(0, \Sigma)$
in distribution as $n \to \infty$, where $N_d(0, \Sigma)$ is the multivariate normal distribution with mean vector $0$ and covariance matrix $\Sigma$.
Note: addition is done componentwise.
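The multivariate statement can be checked in two dimensions with correlated coordinates, where a coordinatewise CLT alone would miss the off-diagonal covariance. A simulation sketch (the vector $(U, U+V)$ with $U, V$ independent Uniform$(-1,1)$ is our choice; its covariance matrix is $\begin{pmatrix} 1/3 & 1/3 \\ 1/3 & 2/3 \end{pmatrix}$):

```python
import random

def normalized_vector_sum(n, rng):
    """One draw of (1/sqrt(n)) * sum of (X_i - mu) for the 2-d vector
    X_i = (U, U + V), with U, V independent Uniform(-1, 1); here
    mu = (0, 0) and Cov = [[1/3, 1/3], [1/3, 2/3]]."""
    sx = sy = 0.0
    for _ in range(n):
        u = rng.uniform(-1.0, 1.0)
        v = rng.uniform(-1.0, 1.0)
        sx += u
        sy += u + v
    root_n = n ** 0.5
    return sx / root_n, sy / root_n
```

The sample covariance of many such draws approaches $\Sigma$, including the off-diagonal entry $1/3$, as the multivariate CLT predicts.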
Thank you for your attention!
Figure: Laplace
Outline: More Details
Almost Sure Convergence and Convergence in Probability
Because of their relationship to convergence in distribution, it is useful to review almost sure convergence and convergence in probability. We let $X_1, X_2, \ldots, X_n, \ldots$ be a sequence of random variables defined on the probability space $(\Omega, \mathcal{F}, P)$.
Almost sure convergence (strong convergence): $X_1, X_2, \ldots, X_n, \ldots$ converges almost surely to a random variable $X$ if, for every $\varepsilon > 0$, $P\!\left(\lim_{n \to \infty} |X_n - X| < \varepsilon\right) = 1$.
Convergence in probability (weak convergence): $X_1, X_2, \ldots, X_n, \ldots$ converges in probability to $X$ if, for every $\varepsilon > 0$, $\lim_{n \to \infty} P(|X_n - X| < \varepsilon) = 1$, or equivalently $\lim_{n \to \infty} P(|X_n - X| \ge \varepsilon) = 0$.
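Convergence in probability is easy to see in action via the weak law of large numbers: the probability that a sample mean deviates from its expectation by more than a fixed $\varepsilon$ shrinks as $n$ grows. A simulation sketch (names and parameters are ours):

```python
import random

def deviation_prob(n, eps, trials=2000, seed=3):
    """Estimate P(|Xbar_n - 0.5| >= eps) for Xbar_n the mean of n fair
    coin flips; convergence in probability sends this to 0 as n grows."""
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        mean = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) >= eps:
            bad += 1
    return bad / trials
```

With $\varepsilon = 0.1$, the deviation probability is sizable at $n = 25$ but essentially zero by $n = 400$.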
Notable Relationship between Convergence Concepts
Almost sure convergence $\Rightarrow$ convergence in probability $\Rightarrow$ convergence in distribution.
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is
More informationOverview. CSE 21 Day 5. Image/Coimage. Monotonic Lists. Functions Probabilistic analysis
Day 5 Functions/Probability Overview Functions Probabilistic analysis Neil Rhodes UC San Diego Image/Coimage The image of f is the set of values f actually takes on (a subset of the codomain) The inverse
More informationProbability, CLT, CLT counterexamples, Bayes. The PDF file of this lecture contains a full reference document on probability and random variables.
Lecture 5 A6523 Signal Modeling, Statistical Inference and Data Mining in Astrophysics Spring 2015 http://www.astro.cornell.edu/~cordes/a6523 Probability, CLT, CLT counterexamples, Bayes The PDF file of
More informationLecture 2: Review of Probability
Lecture 2: Review of Probability Zheng Tian Contents 1 Random Variables and Probability Distributions 2 1.1 Defining probabilities and random variables..................... 2 1.2 Probability distributions................................
More informationExponential Distribution and Poisson Process
Exponential Distribution and Poisson Process Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 215 Outline Introduction Exponential
More informationThe random variable 1
The random variable 1 Contents 1. Definition 2. Distribution and density function 3. Specific random variables 4. Functions of one random variable 5. Mean and variance 2 The random variable A random variable
More informationProbability and Distributions
Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated
More information3. Review of Probability and Statistics
3. Review of Probability and Statistics ECE 830, Spring 2014 Probabilistic models will be used throughout the course to represent noise, errors, and uncertainty in signal processing problems. This lecture
More informationLarge sample covariance matrices and the T 2 statistic
Large sample covariance matrices and the T 2 statistic EURANDOM, the Netherlands Joint work with W. Zhou Outline 1 2 Basic setting Let {X ij }, i, j =, be i.i.d. r.v. Write n s j = (X 1j,, X pj ) T and
More informationContinuous distributions
CHAPTER 7 Continuous distributions 7.. Introduction A r.v. X is said to have a continuous distribution if there exists a nonnegative function f such that P(a X b) = ˆ b a f(x)dx for every a and b. distribution.)
More informationProbability A exam solutions
Probability A exam solutions David Rossell i Ribera th January 005 I may have committed a number of errors in writing these solutions, but they should be ne for the most part. Use them at your own risk!
More informationLecture 8: Continuous random variables, expectation and variance
Lecture 8: Continuous random variables, expectation and variance Lejla Batina Institute for Computing and Information Sciences Digital Security Version: autumn 2013 Lejla Batina Version: autumn 2013 Wiskunde
More informationMathematics 426 Robert Gross Homework 9 Answers
Mathematics 4 Robert Gross Homework 9 Answers. Suppose that X is a normal random variable with mean µ and standard deviation σ. Suppose that PX > 9 PX
More informationMonte Carlo Methods for Statistical Inference: Variance Reduction Techniques
Monte Carlo Methods for Statistical Inference: Variance Reduction Techniques Hung Chen hchen@math.ntu.edu.tw Department of Mathematics National Taiwan University 3rd March 2004 Meet at NS 104 On Wednesday
More informationIEOR 3106: Introduction to Operations Research: Stochastic Models. Fall 2011, Professor Whitt. Class Lecture Notes: Thursday, September 15.
IEOR 3106: Introduction to Operations Research: Stochastic Models Fall 2011, Professor Whitt Class Lecture Notes: Thursday, September 15. Random Variables, Conditional Expectation and Transforms 1. Random
More informationThe Central Limit Theorem
The Central Limit Theorem (A rounding-corners overiew of the proof for a.s. convergence assuming i.i.d.r.v. with 2 moments in L 1, provided swaps of lim-ops are legitimate) If {X k } n k=1 are i.i.d.,
More informationReview of Probability. CS1538: Introduction to Simulations
Review of Probability CS1538: Introduction to Simulations Probability and Statistics in Simulation Why do we need probability and statistics in simulation? Needed to validate the simulation model Needed
More information1 Random Variable: Topics
Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?
More informationRandom Variables and Their Distributions
Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital
More informationEconomics 583: Econometric Theory I A Primer on Asymptotics
Economics 583: Econometric Theory I A Primer on Asymptotics Eric Zivot January 14, 2013 The two main concepts in asymptotic theory that we will use are Consistency Asymptotic Normality Intuition consistency:
More information3 Operations on One Random Variable - Expectation
3 Operations on One Random Variable - Expectation 3.0 INTRODUCTION operations on a random variable Most of these operations are based on a single concept expectation. Even a probability of an event can
More informationFundamentals of Digital Commun. Ch. 4: Random Variables and Random Processes
Fundamentals of Digital Commun. Ch. 4: Random Variables and Random Processes Klaus Witrisal witrisal@tugraz.at Signal Processing and Speech Communication Laboratory www.spsc.tugraz.at Graz University of
More information