8.333: Statistical Mechanics I, Problem Set #3 Solutions (Fall)

Probability Theory

1. Characteristic Functions: The characteristic function is defined by
$$ f(k) = \left\langle e^{-ikx} \right\rangle = \int e^{-ikx}\, p(x)\, dx . $$
The nth coefficient of the Taylor series of $f(k)$, expanded around $k = 0$, gives the nth moment of $x$, as
$$ f(k) = \sum_{n=0}^{\infty} \frac{(-ik)^n}{n!} \left\langle x^n \right\rangle . $$

(a) A uniform probability distribution,
$$ p(x) = \begin{cases} \dfrac{1}{2a} & \text{for } -a < x < a, \\ 0 & \text{otherwise}, \end{cases} $$
for which there exist many examples, gives
$$ f(k) = \frac{1}{2a} \int_{-a}^{a} e^{-ikx}\, dx = \frac{1}{2a} \left. \frac{e^{-ikx}}{-ik} \right|_{-a}^{a} = \frac{\sin(ka)}{ka} = \sum_{m=0}^{\infty} \frac{(-1)^m (ak)^{2m}}{(2m+1)!} . $$
Therefore,
$$ m_1 = \langle x \rangle = 0, \qquad m_2 = \langle x^2 \rangle = \frac{a^2}{3} . $$

(b) The Laplace PDF,
$$ p(x) = \frac{1}{2a} \exp\left( -\frac{|x|}{a} \right), $$
for example describing light absorption through a turbid medium, gives
$$ f(k) = \frac{1}{2a} \int_{-\infty}^{\infty} dx\, e^{-ikx - |x|/a} = \frac{1}{2a} \left[ \int_{0}^{\infty} dx\, e^{-ikx - x/a} + \int_{-\infty}^{0} dx\, e^{-ikx + x/a} \right] = \frac{1}{2a} \left[ \frac{1}{ik + 1/a} - \frac{1}{ik - 1/a} \right] = \frac{1}{1 + (ak)^2} = 1 - (ak)^2 + (ak)^4 - \cdots . $$
Therefore,
$$ m_1 = \langle x \rangle = 0, \qquad m_2 = \langle x^2 \rangle = 2 a^2 . $$
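
As a quick numerical sanity check of part (a), the sketch below estimates the characteristic function and the first two moments of the uniform distribution by direct sampling, and compares them with $\sin(ka)/(ka)$, $0$, and $a^2/3$. It assumes NumPy is available; the value of $a$, the sample size, and the test values of $k$ are arbitrary choices.

```python
import numpy as np

# Illustrative parameters (arbitrary choices).
a = 1.5
n_samples = 1_000_000
rng = np.random.default_rng(0)

x = rng.uniform(-a, a, size=n_samples)

# Empirical characteristic function <exp(-ikx)> versus the analytic sin(ka)/(ka).
for k in (0.5, 1.0, 2.0):
    f_emp = np.mean(np.exp(-1j * k * x))
    f_exact = np.sin(k * a) / (k * a)
    print(f"k={k}: empirical f(k) = {f_emp.real:+.4f}{f_emp.imag:+.4f}i, exact = {f_exact:+.4f}")

# First two moments versus m1 = 0 and m2 = a^2/3.
print("m1:", x.mean(), "(exact 0)")
print("m2:", (x**2).mean(), "(exact", a**2 / 3, ")")
```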

(c) The Cauchy, or Lorentz, PDF describes the spectrum of light scattered by diffusive modes, and is given by
$$ p(x) = \frac{a}{\pi \left( x^2 + a^2 \right)} . $$
For this distribution,
$$ f(k) = \int_{-\infty}^{\infty} e^{-ikx}\, \frac{a}{\pi \left( x^2 + a^2 \right)}\, dx = \frac{1}{2\pi i} \int_{-\infty}^{\infty} e^{-ikx} \left[ \frac{1}{x - ia} - \frac{1}{x + ia} \right] dx . $$
The easiest method for evaluating the above integrals is to close the integration contours in the complex plane and evaluate the residues. The vanishing of the integrand at infinity determines whether the contour has to be closed in the upper or lower half of the complex plane, and leads to
$$ f(k) = \begin{cases} \displaystyle -\frac{1}{2\pi i} \oint e^{-ikx}\, \frac{dx}{x + ia} = e^{-ka} & \text{for } k \geq 0, \\[2ex] \displaystyle \frac{1}{2\pi i} \oint e^{-ikx}\, \frac{dx}{x - ia} = e^{ka} & \text{for } k < 0, \end{cases} \qquad \text{i.e.} \qquad f(k) = e^{-|k| a} . $$
Note that $f(k)$ is not an analytic function in this case, and hence does not have a Taylor expansion. The moments have to be determined by another method, e.g. by direct evaluation, as
$$ m_1 = \langle x \rangle = 0, \qquad m_2 = \langle x^2 \rangle = \int_{-\infty}^{\infty} \frac{dx}{\pi}\, \frac{a\, x^2}{x^2 + a^2} \to \infty . $$
The first moment vanishes by symmetry, while the second (and higher) moments diverge, explaining the non-analytic nature of $f(k)$.
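
As an illustration of the divergent moments noted above, the short sketch below (assuming NumPy; the scale $a$ and the sample sizes are arbitrary choices) shows that sample averages of Cauchy variates do not settle down as the sample size grows, in contrast to distributions with finite moments.

```python
import numpy as np

a = 1.0  # arbitrary scale parameter
rng = np.random.default_rng(1)

for n in (10**3, 10**5, 10**7):
    x = a * rng.standard_cauchy(size=n)
    # The sample mean keeps fluctuating and the sample second moment grows
    # without bound, reflecting the divergence of <x^2> and higher moments.
    print(f"n={n:>9d}: sample mean = {x.mean():+8.3f}, sample <x^2> = {np.mean(x**2):12.1f}")
```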

(d) The Rayleigh distribution,
$$ p(x) = \frac{x}{a^2} \exp\left( -\frac{x^2}{2 a^2} \right) \qquad \text{for } x \geq 0, $$
can be used for the length of a random walk in two dimensions. Its characteristic function is
$$ f(k) = \int_{0}^{\infty} e^{-ikx}\, \frac{x}{a^2}\, e^{-x^2 / 2 a^2}\, dx = \int_{0}^{\infty} \left[ \cos(kx) - i \sin(kx) \right] \frac{x}{a^2}\, e^{-x^2 / 2 a^2}\, dx . $$
The integrals are not simple, but can be evaluated as
$$ \int_{0}^{\infty} \cos(kx)\, \frac{x}{a^2}\, e^{-x^2 / 2 a^2}\, dx = \sum_{n=0}^{\infty} (-1)^n \frac{n!}{(2n)!} \left( 2 a^2 k^2 \right)^n, $$
and
$$ \int_{0}^{\infty} \sin(kx)\, \frac{x}{a^2}\, e^{-x^2 / 2 a^2}\, dx = \sqrt{\frac{\pi}{2}}\, k a\, e^{-k^2 a^2 / 2}, $$
resulting in
$$ f(k) = \sum_{n=0}^{\infty} (-1)^n \frac{n!}{(2n)!} \left( 2 a^2 k^2 \right)^n - i \sqrt{\frac{\pi}{2}}\, k a\, e^{-k^2 a^2 / 2} . $$
The moments can also be calculated directly, from
$$ m_1 = \langle x \rangle = \int_{0}^{\infty} \frac{x^2}{a^2}\, e^{-x^2 / 2 a^2}\, dx = \sqrt{\frac{\pi}{2}}\, a, $$
and
$$ m_2 = \langle x^2 \rangle = \int_{0}^{\infty} \frac{x^3}{a^2}\, e^{-x^2 / 2 a^2}\, dx = 2 a^2 \int_{0}^{\infty} y\, e^{-y}\, dy = 2 a^2 . $$

(e) It is difficult to calculate the characteristic function for the Maxwell distribution,
$$ p(x) = \sqrt{\frac{2}{\pi}}\, \frac{x^2}{a^3} \exp\left( -\frac{x^2}{2 a^2} \right), $$
say describing the speed of a gas particle. However, we can directly evaluate the mean and variance, as
$$ m_1 = \langle x \rangle = \sqrt{\frac{2}{\pi}} \int_{0}^{\infty} \frac{x^3}{a^3}\, e^{-x^2 / 2 a^2}\, dx = \sqrt{\frac{2}{\pi}}\, 2 a \int_{0}^{\infty} y\, e^{-y}\, dy = \sqrt{\frac{8}{\pi}}\, a, $$
and
$$ m_2 = \langle x^2 \rangle = \sqrt{\frac{2}{\pi}} \int_{0}^{\infty} \frac{x^4}{a^3}\, e^{-x^2 / 2 a^2}\, dx = 3 a^2 . $$
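
The Rayleigh and Maxwell moments quoted above are easy to confirm by numerical quadrature. A minimal sketch, assuming SciPy is available (the value of $a$ is an arbitrary choice):

```python
import numpy as np
from scipy.integrate import quad

a = 0.7  # arbitrary scale parameter

rayleigh = lambda x: (x / a**2) * np.exp(-x**2 / (2 * a**2))
maxwell = lambda x: np.sqrt(2 / np.pi) * (x**2 / a**3) * np.exp(-x**2 / (2 * a**2))

for name, p, m1_exact, m2_exact in [
    ("Rayleigh", rayleigh, np.sqrt(np.pi / 2) * a, 2 * a**2),
    ("Maxwell ", maxwell, np.sqrt(8 / np.pi) * a, 3 * a**2),
]:
    m1 = quad(lambda x: x * p(x), 0, np.inf)[0]
    m2 = quad(lambda x: x**2 * p(x), 0, np.inf)[0]
    print(f"{name}: m1 = {m1:.6f} (exact {m1_exact:.6f}), m2 = {m2:.6f} (exact {m2_exact:.6f})")
```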

2. Directed Random Walk: At each step, a directed random walk can move along angles $\theta$ and $\phi$ with probabilities
$$ p(\theta) = \frac{2}{\pi} \cos^2\!\left( \frac{\theta}{2} \right) \qquad \text{and} \qquad p(\phi) = \frac{1}{2\pi}, $$
where the solid-angle factor of $\sin\theta$ is already included in the definition of $p(\theta)$:
$$ \int_{0}^{\pi} p(\theta)\, d\theta = \int_{0}^{\pi} \frac{2}{\pi} \cos^2\!\left( \frac{\theta}{2} \right) d\theta = \frac{1}{\pi} \int_{0}^{\pi} \left( \cos\theta + 1 \right) d\theta = 1 . $$

(a) From symmetry arguments, $\langle x \rangle = \langle y \rangle = 0$, while along the z-axis,
$$ \langle z \rangle = \sum_{i} \langle z_i \rangle = N \langle z_i \rangle = N a \langle \cos\theta \rangle = \frac{N a}{2} . $$
The last equality follows from
$$ \langle \cos\theta \rangle = \int_{0}^{\pi} p(\theta) \cos\theta\, d\theta = \frac{1}{\pi} \int_{0}^{\pi} \cos\theta \left( \cos\theta + 1 \right) d\theta = \frac{1}{2} . $$
The second moment of $z$ is given by
$$ \langle z^2 \rangle = \sum_{i,j} \langle z_i z_j \rangle = \sum_{i \neq j} \langle z_i \rangle \langle z_j \rangle + \sum_{i} \langle z_i^2 \rangle = N(N-1) \langle z_i \rangle^2 + N \langle z_i^2 \rangle . $$
Noting that
$$ \langle z_i^2 \rangle = \frac{a^2}{\pi} \int_{0}^{\pi} \cos^2\theta \left( \cos\theta + 1 \right) d\theta = \frac{a^2}{2}, $$
we find
$$ \langle z^2 \rangle = N(N-1) \frac{a^2}{4} + N \frac{a^2}{2} = N(N+1) \frac{a^2}{4} . $$
The second moments in the x and y directions are equal, and given by
$$ \langle x^2 \rangle = \sum_{i,j} \langle x_i x_j \rangle = \sum_{i \neq j} \langle x_i \rangle \langle x_j \rangle + \sum_{i} \langle x_i^2 \rangle = N \langle x_i^2 \rangle . $$

Using the result
$$ \langle x_i^2 \rangle = a^2 \left\langle \sin^2\theta \cos^2\phi \right\rangle = \frac{a^2}{2\pi^2} \int_{0}^{2\pi} d\phi\, \cos^2\phi \int_{0}^{\pi} d\theta\, \sin^2\theta \left( \cos\theta + 1 \right) = \frac{a^2}{4}, $$
we obtain
$$ \langle x^2 \rangle = \langle y^2 \rangle = \frac{N a^2}{4} . $$
While the variables $x$, $y$, and $z$ are not independent, because of the constraint of unit length, simple symmetry considerations suffice to show that the three covariances are in fact zero, i.e.
$$ \langle x y \rangle = \langle x z \rangle = \langle y z \rangle = 0 . $$

(b) From the central limit theorem, the probability density should be Gaussian. However, for correlated random variables we may expect cross terms that describe their covariance. Since we showed above that the covariances between $x$, $y$, and $z$ are all zero, we can treat them as three independent Gaussian variables, and write
$$ p(x, y, z) \propto \exp\left( -\frac{x^2}{2\sigma_x^2} - \frac{y^2}{2\sigma_y^2} - \frac{\left( z - \langle z \rangle \right)^2}{2\sigma_z^2} \right) . $$
There will be correlations between $x$, $y$, and $z$ appearing in higher cumulants, but all such cumulants become irrelevant in the $N \to \infty$ limit. Using the moments
$$ \langle x \rangle = \langle y \rangle = 0, \qquad \langle z \rangle = \frac{N a}{2}, $$
$$ \sigma_x^2 = \langle x^2 \rangle = \sigma_y^2 = \langle y^2 \rangle = \frac{N a^2}{4}, $$
and
$$ \sigma_z^2 = \langle z^2 \rangle - \langle z \rangle^2 = N(N+1) \frac{a^2}{4} - \frac{N^2 a^2}{4} = \frac{N a^2}{4}, $$
we obtain
$$ p(x, y, z) = \left( \frac{2}{\pi N a^2} \right)^{3/2} \exp\left( -\frac{x^2 + y^2 + \left( z - N a / 2 \right)^2}{N a^2 / 2} \right) . $$
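
The moments of the directed walk can also be checked with a small Monte Carlo simulation. The sketch below assumes NumPy; the step length $a$, the number of steps $N$, and the number of trial walks are arbitrary choices, and $\theta$ is drawn from $p(\theta) = (1 + \cos\theta)/\pi$ by simple rejection sampling.

```python
import numpy as np

rng = np.random.default_rng(2)
N, a, trials = 50, 1.0, 50_000  # arbitrary choices

def sample_theta(size):
    """Rejection-sample theta from p(theta) = (1 + cos(theta)) / pi on [0, pi]."""
    out = np.empty(0)
    while out.size < size:
        cand = rng.uniform(0, np.pi, size)
        keep = rng.uniform(0, 1, size) < (1 + np.cos(cand)) / 2  # acceptance probability
        out = np.concatenate([out, cand[keep]])
    return out[:size]

theta = sample_theta(N * trials).reshape(trials, N)
phi = rng.uniform(0, 2 * np.pi, size=(trials, N))

# Sum the N steps of each walk.
x = (a * np.sin(theta) * np.cos(phi)).sum(axis=1)
y = (a * np.sin(theta) * np.sin(phi)).sum(axis=1)
z = (a * np.cos(theta)).sum(axis=1)

print("<z>    :", z.mean(), "  vs N a / 2   =", N * a / 2)
print("var(z) :", z.var(), "  vs N a^2 / 4 =", N * a**2 / 4)
print("<x^2>  :", (x**2).mean(), "  vs N a^2 / 4 =", N * a**2 / 4)
print("<y^2>  :", (y**2).mean(), "  vs N a^2 / 4 =", N * a**2 / 4)
```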

3. Tchebycheff's Inequality: By definition, for a system with a PDF $p(x)$ and average $\lambda$, the variance is
$$ \sigma^2 = \int \left( x - \lambda \right)^2 p(x)\, dx . $$
Let us break the integral into two parts, as
$$ \sigma^2 = \int_{|x - \lambda| \geq n\sigma} \left( x - \lambda \right)^2 p(x)\, dx + \int_{|x - \lambda| < n\sigma} \left( x - \lambda \right)^2 p(x)\, dx, $$
resulting in
$$ \int_{|x - \lambda| \geq n\sigma} \left( x - \lambda \right)^2 p(x)\, dx = \sigma^2 - \int_{|x - \lambda| < n\sigma} \left( x - \lambda \right)^2 p(x)\, dx \leq \sigma^2 . $$
Now, since
$$ \int_{|x - \lambda| \geq n\sigma} \left( x - \lambda \right)^2 p(x)\, dx \geq (n\sigma)^2 \int_{|x - \lambda| \geq n\sigma} p(x)\, dx, $$
we obtain
$$ (n\sigma)^2 \int_{|x - \lambda| \geq n\sigma} p(x)\, dx \leq \sigma^2, $$
and therefore
$$ P\left( |x - \lambda| \geq n\sigma \right) = \int_{|x - \lambda| \geq n\sigma} p(x)\, dx \leq \frac{1}{n^2} . $$

4. Optimal Selections: (a) The probability that the maximum of $n$ random numbers falls between $x$ and $x + dx$ is equal to the probability that one outcome is in this interval, while all the others are smaller than $x$, i.e.
$$ p_n(x) = p\left( r_1 = x,\ r_2 < x,\ r_3 < x,\ \ldots,\ r_n < x \right) \times n, $$
where the second factor corresponds to the number of ways of choosing which $r_\alpha = x$. As these events are independent,
$$ p_n(x) = n\, p(r_1 = x)\, p(r_2 < x)\, p(r_3 < x) \cdots p(r_n < x) = n\, p(r = x) \left[ p(r < x) \right]^{n-1} . $$
The probability of $r < x$ is just a cumulative probability function, and
$$ p_n(x) = n\, p(x) \left[ \int_{0}^{x} p(r)\, dr \right]^{n-1} . $$
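
The general formula just derived can be checked by simulation for any convenient parent distribution. A minimal sketch, assuming NumPy and choosing (arbitrarily) unit-mean exponential variates with $n = 5$, compares a histogram of maxima with $n\, p(x) \left[ F(x) \right]^{n-1}$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials = 5, 500_000  # arbitrary choices

# Maximum of n exponential(1) variates in each trial.
maxima = rng.exponential(size=(trials, n)).max(axis=1)

# Compare the empirical density with p_n(x) = n p(x) [F(x)]^(n-1),
# where p(x) = exp(-x) and F(x) = 1 - exp(-x).
bins = np.linspace(0, 8, 41)
hist, edges = np.histogram(maxima, bins=bins, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
p_n = n * np.exp(-centers) * (1 - np.exp(-centers)) ** (n - 1)

for c, h, p in zip(centers[::5], hist[::5], p_n[::5]):
    print(f"x = {c:4.1f}: empirical {h:.4f}, formula {p:.4f}")
```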

(b) If each $r_\alpha$ is uniformly distributed between 0 and 1,
$$ p(r) = 1, \qquad \int_{0}^{1} p(r)\, dr = \int_{0}^{1} dr = 1 . $$
With this PDF, we find
$$ p_n(x) = n\, p(x) \left[ \int_{0}^{x} p(r)\, dr \right]^{n-1} = n\, x^{n-1}, $$
and the mean is now given by
$$ \langle x \rangle = \int_{0}^{1} n\, x^{n}\, dx = \frac{n}{n+1} . $$
The second moment of the maximum is
$$ \langle x^2 \rangle = \int_{0}^{1} n\, x^{n+1}\, dx = \frac{n}{n+2}, $$
resulting in a variance
$$ \sigma^2 = \langle x^2 \rangle - \langle x \rangle^2 = \frac{n}{n+2} - \left( \frac{n}{n+1} \right)^2 = \frac{n}{(n+1)^2 (n+2)} . $$
Note that for large $n$ the mean approaches the limiting value of unity, while the variance vanishes as $1/n^2$. There is too little space at the top of the distribution for a wide variance.

5. Information: (a) For an unbiased probability estimation, we need to maximize the entropy, subject to the two constraints of normalization and of a given average speed, $\langle |v| \rangle = c$. Using Lagrange multipliers $\alpha$ and $\beta$ to impose these constraints, we need to maximize
$$ S = -\langle \ln p \rangle = -\int_{-\infty}^{\infty} p(v) \ln p(v)\, dv + \alpha \left( 1 - \int_{-\infty}^{\infty} p(v)\, dv \right) + \beta \left( c - \int_{-\infty}^{\infty} p(v)\, |v|\, dv \right) . $$
Extremizing the above expression yields
$$ \frac{\partial S}{\partial p(v)} = -\ln p(v) - 1 - \alpha - \beta |v| = 0, $$
which is solved for
$$ \ln p(v) = -1 - \alpha - \beta |v|, \qquad \text{or} \qquad p(v) = C e^{-\beta |v|}, \quad \text{with} \quad C = e^{-1-\alpha} . $$

The constraints can now be used to fix the parameters $C$ and $\beta$:
$$ 1 = \int_{-\infty}^{\infty} p(v)\, dv = \int_{-\infty}^{\infty} C e^{-\beta |v|}\, dv = 2 C \int_{0}^{\infty} e^{-\beta v}\, dv = 2 C \left[ -\frac{e^{-\beta v}}{\beta} \right]_{0}^{\infty} = \frac{2 C}{\beta}, $$
which implies $C = \beta / 2$. From the second constraint, we have
$$ c = \int_{-\infty}^{\infty} C e^{-\beta |v|}\, |v|\, dv = \beta \int_{0}^{\infty} e^{-\beta v}\, v\, dv, $$
which, when integrated by parts, yields
$$ c = \beta \left[ \left. -\frac{v\, e^{-\beta v}}{\beta} \right|_{0}^{\infty} + \frac{1}{\beta} \int_{0}^{\infty} e^{-\beta v}\, dv \right] = \left[ -\frac{e^{-\beta v}}{\beta} \right]_{0}^{\infty} = \frac{1}{\beta}, $$
or $\beta = 1/c$. The unbiased PDF is then given by
$$ p(v) = C e^{-\beta |v|} = \frac{1}{2c} \exp\left( -\frac{|v|}{c} \right) . $$
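
As a quick consistency check of part (a), one can verify numerically that this PDF is normalized and reproduces the imposed mean speed. A sketch, assuming SciPy (the value of $c$ is an arbitrary choice):

```python
import numpy as np
from scipy.integrate import quad

c = 2.0  # arbitrary mean-speed constraint
p = lambda v: np.exp(-abs(v) / c) / (2 * c)

# Use the symmetry of p(v) about v = 0 and integrate over [0, infinity).
norm = 2 * quad(p, 0, np.inf)[0]                         # should equal 1
mean_speed = 2 * quad(lambda v: v * p(v), 0, np.inf)[0]  # should equal c

print("normalization:", norm)
print("<|v|> =", mean_speed, "(constraint c =", c, ")")
```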

(b) When the second constraint is on the average kinetic energy, $\langle m v^2 / 2 \rangle = m c^2 / 2$, we have
$$ S = -\int_{-\infty}^{\infty} p(v) \ln p(v)\, dv + \alpha \left( 1 - \int_{-\infty}^{\infty} p(v)\, dv \right) + \beta \left( \frac{m c^2}{2} - \int_{-\infty}^{\infty} p(v)\, \frac{m v^2}{2}\, dv \right) . $$
The corresponding extremization,
$$ \frac{\partial S}{\partial p(v)} = -\ln p(v) - 1 - \alpha - \frac{\beta m v^2}{2} = 0, $$
results in
$$ p(v) = C \exp\left( -\frac{\beta m v^2}{2} \right) . $$
The normalization constraint implies
$$ 1 = \int_{-\infty}^{\infty} p(v)\, dv = C \int_{-\infty}^{\infty} e^{-\beta m v^2 / 2}\, dv = C \sqrt{\frac{2\pi}{\beta m}}, $$
or $C = \sqrt{\beta m / 2\pi}$. The second constraint,
$$ \frac{m c^2}{2} = \left\langle \frac{m v^2}{2} \right\rangle = \int_{-\infty}^{\infty} p(v)\, \frac{m v^2}{2}\, dv = \frac{m}{2} \sqrt{\frac{\beta m}{2\pi}} \int_{-\infty}^{\infty} e^{-\beta m v^2 / 2}\, v^2\, dv = \frac{1}{2\beta}, $$
then gives $\beta = 1 / (m c^2)$, for a full PDF of
$$ p(v) = C \exp\left( -\frac{\beta m v^2}{2} \right) = \frac{1}{\sqrt{2\pi c^2}} \exp\left( -\frac{v^2}{2 c^2} \right) . $$

(c) The entropy of the first PDF is given by
$$ S_1 = -\langle \ln p \rangle = -\int_{-\infty}^{\infty} \frac{1}{2c}\, e^{-|v|/c} \left[ -\ln(2c) - \frac{|v|}{c} \right] dv = \ln(2c) + \frac{\langle |v| \rangle}{c} = \ln(2c) + 1 = 1 + \ln 2 + \ln c . $$
For the second distribution, we obtain
$$ S_2 = -\langle \ln p \rangle = -\int_{-\infty}^{\infty} \frac{e^{-v^2 / 2 c^2}}{\sqrt{2\pi c^2}} \left[ -\frac{1}{2} \ln\left( 2\pi c^2 \right) - \frac{v^2}{2 c^2} \right] dv = \frac{1}{2} \ln\left( 2\pi c^2 \right) + \frac{\langle v^2 \rangle}{2 c^2} = \frac{1}{2} \ln\left( 2\pi c^2 \right) + \frac{1}{2} = \frac{1}{2} \left( 1 + \ln 2\pi \right) + \ln c . $$
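
These two entropies are simple to confirm by numerical integration. A sketch, again assuming SciPy, with an arbitrary choice of $c$ (the integration range is truncated where the integrands are negligible):

```python
import numpy as np
from scipy.integrate import quad

c = 2.0  # arbitrary constraint scale, the same in both PDFs

p1 = lambda v: np.exp(-abs(v) / c) / (2 * c)                           # fixed mean speed
p2 = lambda v: np.exp(-v**2 / (2 * c**2)) / np.sqrt(2 * np.pi * c**2)  # fixed kinetic energy

def entropy(p):
    # S = -int p ln p dv; use symmetry about v = 0 and a finite cutoff.
    return -2 * quad(lambda v: p(v) * np.log(p(v)), 0, 50)[0]

print("S1:", entropy(p1), "vs 1 + ln(2c)             =", 1 + np.log(2 * c))
print("S2:", entropy(p2), "vs (1 + ln(2 pi c^2)) / 2 =", 0.5 * (1 + np.log(2 * np.pi * c**2)))
```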

For a discrete probability distribution, the information content is
$$ I_\alpha = \log_2 M - \frac{S_\alpha}{\ln 2}, $$
where $M$ denotes the number of possible outcomes. While $M$, and also the proper measure of probability, are not well defined for a continuous PDF, the ambiguities disappear when we consider the difference
$$ I_2 - I_1 = \frac{-S_2 + S_1}{\ln 2} = \frac{S_1 - S_2}{\ln 2} = \frac{1 - \ln(\pi/2)}{2 \ln 2} \approx 0.3956 . $$
Hence the constraint of constant energy provides approximately 0.3956 more bits of information. This is partly due to the larger variance of the distribution with constant speed.

6. Benford's Law: Let us consider the observation that the probability distribution for first integers is unchanged under multiplication by any, i.e. a random, number. Presumably, we can repeat such multiplications many times, and it is thus suggestive that we should consider the properties of the product of random numbers. (Why this should be a good model for stock prices is not entirely clear, but it seems to be as good an explanation as anything else!) Consider the product
$$ x = \prod_{i=1}^{N} r_i, $$
where the $r_i$ are positive random variables taken from some reasonably well-behaved probability distribution. The random variable
$$ \ell \equiv \ln x = \sum_{i=1}^{N} \ln r_i $$
is the sum of many random contributions, and according to the central limit theorem should have a Gaussian distribution in the limit of large $N$, i.e.
$$ \lim_{N \to \infty} p(\ell) = \frac{1}{\sqrt{2\pi N \sigma^2}} \exp\left( -\frac{\left( \ell - N \bar{\ell} \right)^2}{2 N \sigma^2} \right), $$
where $\bar{\ell}$ and $\sigma^2$ are the mean and variance of $\ln r$, respectively. The product $x$ is distributed according to the log-normal distribution
$$ p(x) = p(\ell) \left| \frac{d\ell}{dx} \right| = \frac{1}{x}\, \frac{1}{\sqrt{2\pi N \sigma^2}} \exp\left( -\frac{\left( \ln x - N \bar{\ell} \right)^2}{2 N \sigma^2} \right) . $$
The probability that the first integer of $x$ in a decimal representation is $i$ is now obtained approximately as follows:
$$ p_i = \sum_{q} \int_{i \cdot 10^{q}}^{(i+1) \cdot 10^{q}} dx\, p(x) . $$

The integral considers cases in which $x$ is a number of magnitude $10^{q}$, i.e. has $q+1$ digits before the decimal point. Since the number $x$ is quite widely distributed, we then have to sum over the possible magnitudes $q$. (The range of the sum actually need not be specified!) The next stage is to change variables from $x$ to $\ell = \ln x$, leading to
$$ p_i = \sum_{q} \int_{q \ln 10 + \ln i}^{q \ln 10 + \ln(i+1)} d\ell\, p(\ell) = \sum_{q} \int_{q \ln 10 + \ln i}^{q \ln 10 + \ln(i+1)} d\ell\, \frac{1}{\sqrt{2\pi N \sigma^2}} \exp\left( -\frac{\left( \ell - N \bar{\ell} \right)^2}{2 N \sigma^2} \right) . $$
We shall now make the approximation that over the range of integration, from $q \ln 10 + \ln i$ to $q \ln 10 + \ln(i+1)$, the integrand is approximately constant. (The approximation works best for $q \ln 10 \approx N \bar{\ell}$, where the integrand is largest.) This leads to
$$ p_i \approx \sum_{q} \frac{1}{\sqrt{2\pi N \sigma^2}} \exp\left( -\frac{\left( q \ln 10 - N \bar{\ell} \right)^2}{2 N \sigma^2} \right) \left[ \ln(i+1) - \ln i \right] \propto \ln\left( \frac{i+1}{i} \right), $$
where we have ignored the constants of proportionality, which come from the sum over $q$. We thus find that the distribution of the first digit is not uniform, and the properly normalized proportions of $\ln[(i+1)/i]$ indeed reproduce the probabilities
$$ p_1, \ldots, p_9 = 0.301,\ 0.176,\ 0.125,\ 0.097,\ 0.079,\ 0.067,\ 0.058,\ 0.051,\ 0.046, $$
in accordance with Benford's law. For further information, check http://www.treasure-troves.com/math/benfordslaw.html.
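
The argument can be illustrated directly by a Monte Carlo experiment: multiply many random factors, read off the leading digit, and compare the observed frequencies with $\log_{10}[(i+1)/i]$. The sketch below assumes NumPy; the factor distribution (uniform on [0.5, 4]), the number of factors $N$, and the number of products are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(4)
N, n_products = 100, 50_000  # arbitrary choices

# Work with l = log10(x) = sum of log10(r_i) to avoid overflowing the product itself.
log10_x = np.log10(rng.uniform(0.5, 4.0, size=(n_products, N))).sum(axis=1)

# Leading digit of x = floor(10 ** (fractional part of log10(x))).
leading = np.floor(10.0 ** (log10_x % 1.0)).astype(int)

counts = np.bincount(leading, minlength=10)[1:10]
for i, freq in enumerate(counts / counts.sum(), start=1):
    print(f"digit {i}: observed {freq:.3f}, Benford {np.log10(1 + 1 / i):.3f}")
```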