THE STRUCTURE OF MULTIVARIATE POISSON DISTRIBUTION
K. KAWAMURA
KODAI MATH. J.
2 (1979), 337–345

THE STRUCTURE OF MULTIVARIATE POISSON DISTRIBUTION

BY KAZUTOMO KAWAMURA

Summary. In this paper we derive a multivariate Poisson distribution and discuss its structure.

Notations and definitions.

$n$: a positive integer.
$X = (X_1, \dots, X_n)$: an $n$-dimensional random vector.
$i = (i_1, i_2, \dots, i_n)$: an $n$-dimensional vector with components $0$ or $1$.
$k = (k_1, k_2, \dots, k_n)$: an $n$-dimensional vector with nonnegative integer components.
$x = (x_1, x_2, \dots, x_n)$: an observation of $X$.
$p(x, \lambda) = e^{-\lambda}\lambda^x/x!$: the Poisson density with parameter $\lambda$.

Main results.

1. Multivariate Bernoulli distribution $B(1, p_i)$.

The multivariate Bernoulli distribution is defined by
$$P(X = i) = p_i,$$
where $p_i \ge 0$ and $\sum_i p_i = 1$, the sum running over all $2^n$ vectors $i$ with $0,1$ components. The moment generating function (m.g.f.) is
$$g(s) = \sum_i p_i s^i, \qquad s^i = s_1^{i_1} s_2^{i_2} \cdots s_n^{i_n}.$$
Each marginal of this multivariate Bernoulli distribution is again a (degenerate) multivariate Bernoulli distribution. The covariance matrix of $B(1, p_i)$ is given by
$$\mathrm{Cov}(X_j, X_k) = \sum_{i:\, i_j = 1,\, i_k = 1} p_i \;-\; \Bigl(\sum_{i:\, i_j = 1} p_i\Bigr)\Bigl(\sum_{i:\, i_k = 1} p_i\Bigr).$$

Received May 18,
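The covariance formula above can be checked numerically by enumeration. The sketch below uses a hypothetical $n = 3$ example (the weights, and hence the $p_i$, are arbitrary illustration values, not from the paper) and verifies that the diagonal of the covariance matrix reduces to the familiar Bernoulli variance $q_j(1 - q_j)$ with $q_j = P(X_j = 1)$.

```python
from itertools import product

# Hypothetical example (n = 3): the probabilities p_i are arbitrary
# nonnegative weights normalized to sum to 1 -- not values from the paper.
n = 3
vectors = list(product((0, 1), repeat=n))   # all 2^n index vectors i
weights = [1, 2, 1, 3, 2, 1, 4, 2]
p = {i: w / sum(weights) for i, w in zip(vectors, weights)}

def cov(j, k):
    """Cov(X_j, X_k) = sum_{i_j=i_k=1} p_i - (sum_{i_j=1} p_i)(sum_{i_k=1} p_i)."""
    both = sum(p[i] for i in vectors if i[j] == 1 and i[k] == 1)
    qj = sum(p[i] for i in vectors if i[j] == 1)
    qk = sum(p[i] for i in vectors if i[k] == 1)
    return both - qj * qk

# Each marginal X_j is Bernoulli with success probability q_j, so the
# diagonal entries must reduce to the univariate variance q_j (1 - q_j).
for j in range(n):
    qj = sum(p[i] for i in vectors if i[j] == 1)
    assert abs(cov(j, j) - qj * (1 - qj)) < 1e-12
```

The same enumeration extends to any $n$, at the cost of the $2^n$ terms in each sum.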
Proof.
$$\mathrm{Cov}(X_j, X_k) = \sum_i i_j i_k p_i - \sum_i i_j p_i \sum_i i_k p_i = \sum_{i_j = 1,\, i_k = 1} p_i - \sum_{i_j = 1} p_i \sum_{i_k = 1} p_i,$$
since each component $i_j$, $i_k$ takes only the values $0$ and $1$.

2. Multivariate binomial distribution $B(N, p_i)$.

The multivariate binomial distribution is defined by
$$P(X = k) = \sum_{\sum_i a_i i = k} \frac{N!}{\prod_i a_i!} \prod_i p_i^{a_i},$$
the sum running over nonnegative integers $a_i$, indexed by the $0,1$ vectors $i$, with $\sum_i a_i = N$. This distribution is the $N$-fold convolution of $B(1, p_i)$, so its m.g.f. is
$$g(s)^N = \Bigl(\sum_i p_i s^i\Bigr)^N.$$
Each marginal of this multivariate binomial distribution is again a (degenerate) multivariate binomial distribution. The covariance matrix of $B(N, p_i)$ is given by
$$\mathrm{Cov}(X_j, X_k) = N\Bigl(\sum_{i_j = 1,\, i_k = 1} p_i - \sum_{i_j = 1} p_i \sum_{i_k = 1} p_i\Bigr).$$

3. Multivariate Poisson distribution.

The multivariate Poisson distribution is the limiting distribution of $B(N, p_i)$ as $N \to \infty$ under the condition $N p_i = \lambda_i$ for every $i \ne 0$, where each $\lambda_i$ is a fixed nonnegative parameter.

THEOREM 1. If a random vector $X$ has the distribution $B(N, p_i)$ with $N p_i = \lambda_i$ $(i \ne 0)$, then
$$\lim_{N \to \infty} P(X = k) = \sum_{\sum_{i \ne 0} a_i i = k} \prod_{i \ne 0} p(a_i, \lambda_i),$$
where $p(a, \lambda)$ is the univariate Poisson density.

Proof. Under the stated condition,
$$P(X = k) = \sum_{\sum_{i \ne 0} a_i i = k} \frac{N!}{a_0!\, \prod_{i \ne 0} a_i!}\, p_0^{a_0} \prod_{i \ne 0} p_i^{a_i}, \qquad a_0 = N - \sum_{i \ne 0} a_i, \quad p_0 = 1 - \sum_{i \ne 0} \frac{\lambda_i}{N},$$
and the limit of each term is
$$\lim_{N \to \infty} \frac{N!}{a_0!\, \prod_{i \ne 0} a_i!}\Bigl(1 - \sum_{i \ne 0} \frac{\lambda_i}{N}\Bigr)^{a_0} \prod_{i \ne 0} \Bigl(\frac{\lambda_i}{N}\Bigr)^{a_i} = \prod_{i \ne 0} e^{-\lambda_i}\frac{\lambda_i^{a_i}}{a_i!} = \prod_{i \ne 0} p(a_i, \lambda_i),$$
since $N!/(a_0!\, N^{\sum_{i \ne 0} a_i}) \to 1$ and $(1 - \sum_{i \ne 0} \lambda_i/N)^{a_0} \to e^{-\sum_{i \ne 0} \lambda_i}$. Summing over all admissible $(a_i)$ gives the assertion.

THEOREM 2. The moment generating function of the limiting distribution is given by
$$h(s) = \exp\Bigl\{-\sum_{i \ne 0} \lambda_i + \sum_{i \ne 0} \lambda_i s^i\Bigr\}.$$

Proof.
$$h(s) = \lim_{N \to \infty} g(s)^N = \lim_{N \to \infty}\Bigl(1 - \sum_{i \ne 0} \frac{\lambda_i}{N} + \sum_{i \ne 0} \frac{\lambda_i}{N}\, s^i\Bigr)^N = \exp\Bigl\{-\sum_{i \ne 0} \lambda_i + \sum_{i \ne 0} \lambda_i s^i\Bigr\},$$
where $s^i = s_1^{i_1} s_2^{i_2} \cdots s_n^{i_n}$.
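The limit in Theorems 1–2 can be observed numerically by comparing the binomial m.g.f. $g(s)^N$ with $p_i = \lambda_i/N$ against the limiting m.g.f. $h(s)$. The sketch below uses a bivariate case with hypothetical parameters $\lambda_{10}, \lambda_{01}, \lambda_{11}$ (illustration values only).

```python
import math

# Bivariate case (n = 2) with hypothetical parameters -- not from the paper.
lam = {(1, 0): 0.5, (0, 1): 0.3, (1, 1): 0.2}

def mgf_binomial(s, N):
    """m.g.f. g(s)^N of B(N, p_i) with p_i = lam_i / N and p_0 = 1 - sum p_i."""
    s1, s2 = s
    g = 1.0 - sum(lam.values()) / N          # term for i = (0,0): s^i = 1
    for (i1, i2), l in lam.items():
        g += (l / N) * s1**i1 * s2**i2       # term lam_i/N * s^i
    return g**N

def mgf_poisson(s):
    """Limiting m.g.f. h(s) = exp{-sum lam_i + sum lam_i s^i} of Theorem 2."""
    s1, s2 = s
    return math.exp(-sum(lam.values())
                    + sum(l * s1**i1 * s2**i2 for (i1, i2), l in lam.items()))

# As N grows, the binomial m.g.f. approaches the Poisson m.g.f. pointwise.
s = (1.3, 0.7)
err_small = abs(mgf_binomial(s, 10) - mgf_poisson(s))
err_large = abs(mgf_binomial(s, 10_000) - mgf_poisson(s))
assert err_large < err_small and err_large < 1e-6
```

The error at a fixed point $s$ decays like $O(1/N)$, the usual rate for $(1 + x/N)^N \to e^x$.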
THEOREM 3. If a random vector $X$ obeys the Poisson law, then we have the unique decomposition
$$X_j = \sum_{i:\, i_j = 1} X_i, \qquad j = 1, 2, \dots, n,$$
where the $X_i$ $(i \ne 0)$ are mutually independent univariate Poisson variables with parameters $\lambda_i$.

Proof. This is a direct consequence of the convolution-type definition of the distribution. A formal proof runs as follows. In the bivariate case $n = 2$, if $X = (X_1, X_2)$ has the Poisson law, the m.g.f. $h(s)$ becomes
$$h(s) = \exp\{-(\lambda_{10} + \lambda_{01} + \lambda_{11}) + \lambda_{10} s_1 + \lambda_{01} s_2 + \lambda_{11} s_1 s_2\}
= \exp(-\lambda_{10} + \lambda_{10} s_1)\,\exp(-\lambda_{01} + \lambda_{01} s_2)\,\exp(-\lambda_{11} + \lambda_{11} s_1 s_2).$$
Putting $s_2 = 1$, we obtain the m.g.f. of $X_1$,
$$\exp(-\lambda_{10} + \lambda_{10} s_1)\,\exp(-\lambda_{11} + \lambda_{11} s_1),$$
which is of convolution type; putting $s_1 = 1$, we obtain the m.g.f. of $X_2$,
$$\exp(-\lambda_{01} + \lambda_{01} s_2)\,\exp(-\lambda_{11} + \lambda_{11} s_2),$$
likewise of convolution type. Hence we have the decomposition
$$X_1 = X_{10} + X_{11}, \qquad X_2 = X_{01} + X_{11}.$$
The common component is unique: if two candidates satisfied $X_{11}' \ne X_{11}''$ with positive probability, this would contradict the factorization of the bivariate m.g.f. $h$; therefore $X_{11}' = X_{11}'' = X_{11}$ with probability one. In this decomposition $X_{10}$, $X_{01}$ and $X_{11}$ are mutually independent Poisson variables with parameters $\lambda_{10}$, $\lambda_{01}$ and $\lambda_{11}$ respectively, as was to be proved.

THEOREM 4. The covariance matrix of the multivariate Poisson distribution is given by
$$\mathrm{Var}(X_j) = \sum_{i:\, i_j = 1} \lambda_i, \qquad \mathrm{Cov}(X_j, X_k) = \sum_{i:\, i_j = 1,\, i_k = 1} \lambda_i \quad (j \ne k).$$

Proof. This follows directly from the m.g.f. of the distribution (Theorem 2), or from the decomposition of Theorem 3. In general an m.g.f. satisfies
$$h(s) = \sum_k s^k P(X = k), \qquad \frac{\partial^2 h(s)}{\partial s_j \partial s_k} = \sum_k k_j k_k\, s_1^{k_1} \cdots s_j^{k_j - 1} \cdots s_k^{k_k - 1} \cdots s_n^{k_n}\, P(X = k).$$
Using the form of $h(s)$ from Theorem 2,
$$\frac{\partial^2 h(s)}{\partial s_j \partial s_k}\bigg|_{s_1 = \cdots = s_n = 1} = \Bigl(\sum_{i_j = 1} \lambda_i\Bigr)\Bigl(\sum_{i_k = 1} \lambda_i\Bigr) + \sum_{i_j = 1,\, i_k = 1} \lambda_i,$$
so that
$$E(X_j X_k) = \Bigl(\sum_{i_j = 1} \lambda_i\Bigr)\Bigl(\sum_{i_k = 1} \lambda_i\Bigr) + \sum_{i_j = 1,\, i_k = 1} \lambda_i.$$
Since $E(X_j) = \mathrm{Var}(X_j) = \sum_{i_j = 1} \lambda_i$ and $E(X_k) = \mathrm{Var}(X_k) = \sum_{i_k = 1} \lambda_i$, we finally have
$$\mathrm{Cov}(X_j, X_k) = E(X_j X_k) - E(X_j)E(X_k) = \sum_{i_j = 1,\, i_k = 1} \lambda_i.$$
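Theorems 3–4 suggest a direct way to simulate the multivariate Poisson law and check its covariance: build $X$ from independent univariate Poisson components. The Monte-Carlo sketch below does this for the bivariate case, with hypothetical parameter values $\lambda_{10}, \lambda_{01}, \lambda_{11}$ chosen only for the demonstration.

```python
import math
import random

# Hypothetical parameters for the bivariate decomposition of Theorem 3:
# X1 = Y10 + Y11, X2 = Y01 + Y11, with Y's independent Poisson variables.
lam10, lam01, lam11 = 1.2, 0.8, 0.7
rng = random.Random(0)

def poisson(lam):
    """Sample a Poisson(lam) variate (Knuth's multiplication method)."""
    limit, k, prod = math.exp(-lam), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

samples = []
for _ in range(200_000):
    y10, y01, y11 = poisson(lam10), poisson(lam01), poisson(lam11)
    samples.append((y10 + y11, y01 + y11))

m1 = sum(x1 for x1, _ in samples) / len(samples)
m2 = sum(x2 for _, x2 in samples) / len(samples)
cov12 = sum((x1 - m1) * (x2 - m2) for x1, x2 in samples) / len(samples)
var1 = sum((x1 - m1) ** 2 for x1, _ in samples) / len(samples)

# Theorem 4 predicts Cov(X1, X2) = lam11 and Var(X1) = lam10 + lam11.
assert abs(cov12 - lam11) < 0.05
assert abs(var1 - (lam10 + lam11)) < 0.05
```

Note that $\mathrm{Cov}(X_1, X_2) = \lambda_{11} \ge 0$: this construction can only represent nonnegative correlation, which is intrinsic to the convolution-type multivariate Poisson law.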
THEOREM 5. If a random vector $X$ obeys the Poisson law, then every marginal distribution is again a (degenerate) multivariate Poisson distribution.

Proof. Since $X$ has the m.g.f. $h(s)$, the vector $X^{(j)} = (X_1, \dots, X_{j-1}, X_{j+1}, \dots, X_n)$, $j = 1, 2, \dots, n$, has the m.g.f. $h(s)\big|_{s_j = 1}$. Explicitly,
$$h(s)\big|_{s_j = 1} = \exp\Bigl\{-\sum_{i \ne 0} \lambda_i + \sum_{i \ne 0} \lambda_i s^i\Bigr\}\bigg|_{s_j = 1} = \exp\Bigl\{-\sum_{i' \ne 0} \lambda'_{i'} + \sum_{i' \ne 0} \lambda'_{i'}\,(s')^{i'}\Bigr\},$$
where $i'$ and $s'$ denote $i$ and $s$ with the $j$-th component deleted, and
$$\lambda'_{i'} = \sum_{i_j = 0,\,1} \lambda_{(i', i_j)}$$
(the contribution of the vector $i$ with $i' = 0$, $i_j = 1$ cancels between the two sums). This is the m.g.f. of an $(n-1)$-dimensional Poisson distribution, so $X^{(j)}$ again has a (degenerate) Poisson distribution. Similarly, setting $s_l = 1$ for every index $l$ outside $\{j_1, j_2, \dots, j_k\}$ shows that the m.g.f. of the sub-vector $(X_{j_1}, \dots, X_{j_k})$ is again of the form $\exp\{-\sum \lambda' + \sum \lambda'(s')^{i'}\}$. Therefore the random vector $X^{(j_1, \dots, j_k)}$ has a degenerate Poisson distribution, as was to be proved.

COROLLARY 1. The marginal distribution of $X_j$ is Poisson with parameter $\sum_{i:\, i_j = 1} \lambda_i$.

COROLLARY 2. If $\mathrm{Cov}(X_j, X_k) = 0$ $(j \ne k)$, then $X_j$ and $X_k$ are mutually independent random variables.

THEOREM 6. If $X_1, \dots, X_N$ are mutually independent random vectors with the same multivariate Poisson distribution, then the sum $\sum_{j=1}^N X_j$ has a multivariate Poisson distribution.

Proof. The m.g.f. of the sum vector is given by
$$h(s)^N = \exp\Bigl[N\Bigl\{-\sum_{i \ne 0} \lambda_i + \sum_{i \ne 0} \lambda_i s^i\Bigr\}\Bigr] = \exp\Bigl\{-\sum_{i \ne 0} N\lambda_i + \sum_{i \ne 0} N\lambda_i s^i\Bigr\}.$$
This means that the sum vector also has a multivariate Poisson distribution, with parameters $N\lambda_i$ $(i \ne 0)$.

Estimation of covariance matrix.

We assume that $X_1, \dots, X_N$ are mutually independent multivariate Poisson random vectors with unknown parameters $\lambda_i$. Given such a sequence, we estimate the mean vector and the covariance matrix in this section.

Write the sequence as $X_k = (X_{1k}, \dots, X_{nk})$, $k = 1, 2, \dots, N$. The worked example of this section is an $n = 8$-dimensional Poisson sample. (The data table and the estimated mean values, standard deviations, and sample mean and standard deviation of the sum are tabulated in the original; the figures are not recoverable in this copy.) We use the estimators
$$\bar{X}_i = \frac{1}{N}\sum_{k=1}^N X_{ik}, \qquad S_{ij} = \frac{1}{N}\sum_{k=1}^N (X_{ik} - \bar{X}_i)(X_{jk} - \bar{X}_j),$$
for which clearly $S_{ij} = S_{ji}$. The estimated covariance matrix is $S = [S_{ij}]$.
The diagonal entries $S_{ii}$ $(i = 1, 2, \dots, n)$ can be refined by using the estimated mean vector
$$\bar{X}_i = \frac{1}{N}\sum_{k=1}^N X_{ik}.$$
Negative values among the sample covariances are not natural, because every parameter $\sum_{i_j = 1,\, i_k = 1} \lambda_i$ estimated by $S_{jk}$ must be nonnegative. We therefore refine the estimator $S$ to $S^+ = [S^+_{ij}]$, where $S^+_{ij} = S_{ij}$ if $S_{ij} \ge 0$ and $S^+_{ij} = 0$ if $S_{ij} < 0$. A further refined estimator is $\tilde{S} = [\tilde{S}_{ij}]$, defined by
$$\tilde{S}_{ij} = \begin{cases} \bar{X}_i & \text{if } i = j, \\ S^+_{ij} & \text{if } i \ne j, \end{cases}$$
because the parameter estimated by $S_{ii}$ is the variance of $X_i$, which for a Poisson marginal equals its mean. Clearly $S^+_{ii} = S_{ii} \ge 0$, and both $S^+$ and $\tilde{S}$ are symmetric, since $S_{ij} = S_{ji}$.
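The estimators of this section can be sketched in a few lines. The bivariate data below are hypothetical (the paper's 8-dimensional data are not reproduced here): we form the sample means $\bar{X}_i$, the sample covariance matrix $S$, the clipped matrix $S^+$, and the refined matrix $\tilde{S}$ whose diagonal is the mean vector.

```python
# Hypothetical bivariate sample of size N = 8 -- not the paper's data.
data = [(2, 1), (0, 1), (3, 2), (1, 0), (2, 3), (4, 2), (1, 1), (2, 2)]
N = len(data)
n = 2

# Sample means and sample covariance matrix S = [S_ij].
mean = [sum(x[i] for x in data) / N for i in range(n)]
S = [[sum((x[i] - mean[i]) * (x[j] - mean[j]) for x in data) / N
      for j in range(n)] for i in range(n)]

# S+: negative entries replaced by 0, since each S_ij estimates a sum of
# nonnegative parameters lambda_i.
S_plus = [[max(S[i][j], 0.0) for j in range(n)] for i in range(n)]

# S~: diagonal replaced by the sample mean, because a Poisson marginal has
# variance equal to its mean.
S_tilde = [[mean[i] if i == j else S_plus[i][j] for j in range(n)]
           for i in range(n)]

# Symmetry is preserved by both refinements.
assert all(S_plus[i][j] == S_plus[j][i] for i in range(n) for j in range(n))
assert all(S_tilde[i][j] == S_tilde[j][i] for i in range(n) for j in range(n))
```

For real data one would also want the parameters $\lambda_i$ themselves; the paper's companion work treats maximum likelihood estimation in the bivariate case.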
Conclusion of this section.

1. The unknown mean values of $(X_1, \dots, X_n)$,
$$E X_1 = \sum_{i:\, i_1 = 1} \lambda_i, \quad \dots, \quad E X_n = \sum_{i:\, i_n = 1} \lambda_i,$$
are to be estimated by $\bar{X}_1, \dots, \bar{X}_n$, where $\bar{X}_i = \frac{1}{N}\sum_{k=1}^N X_{ik}$ $(i = 1, 2, \dots, n)$.

2. The unknown covariance matrix of $(X_1, \dots, X_n)$, $[\mathrm{Cov}(X_j, X_k)]$ with $\mathrm{Cov}(X_j, X_k) = \sum_{i:\, i_j = 1,\, i_k = 1} \lambda_i$, will be estimated by the sample covariance matrix $S$ or by $S^+$.

3. A more refined estimate of the covariance matrix is given by $\tilde{S}$.

DEPARTMENT OF MATHEMATICS
TOKYO INSTITUTE OF TECHNOLOGY
More informationIII - MULTIVARIATE RANDOM VARIABLES
Computational Methods and advanced Statistics Tools III - MULTIVARIATE RANDOM VARIABLES A random vector, or multivariate random variable, is a vector of n scalar random variables. The random vector is
More informationSTAT/MATH 395 PROBABILITY II
STAT/MATH 395 PROBABILITY II Chapter 6 : Moment Functions Néhémy Lim 1 1 Department of Statistics, University of Washington, USA Winter Quarter 2016 of Common Distributions Outline 1 2 3 of Common Distributions
More informationSummary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016
8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying
More informationANOVA: Analysis of Variance - Part I
ANOVA: Analysis of Variance - Part I The purpose of these notes is to discuss the theory behind the analysis of variance. It is a summary of the definitions and results presented in class with a few exercises.
More informationMarkov Chains CK eqns Classes Hitting times Rec./trans. Strong Markov Stat. distr. Reversibility * Markov Chains
Markov Chains A random process X is a family {X t : t T } of random variables indexed by some set T. When T = {0, 1, 2,... } one speaks about a discrete-time process, for T = R or T = [0, ) one has a continuous-time
More informationP (x). all other X j =x j. If X is a continuous random vector (see p.172), then the marginal distributions of X i are: f(x)dx 1 dx n
JOINT DENSITIES - RANDOM VECTORS - REVIEW Joint densities describe probability distributions of a random vector X: an n-dimensional vector of random variables, ie, X = (X 1,, X n ), where all X is are
More informationEstimation of arrival and service rates for M/M/c queue system
Estimation of arrival and service rates for M/M/c queue system Katarína Starinská starinskak@gmail.com Charles University Faculty of Mathematics and Physics Department of Probability and Mathematical Statistics
More information