Formulas for probability theory and linear models SF2941
These pages (+ Appendix 2 of Gut) are permitted as assistance at the exam.

11 May 2008

Contents: Selected formulae of probability · Bivariate probability · Transforms · Multivariate normal distribution · Convergence · Poisson process · Series Expansions and Integrals

1 Probability

1.1 Two inequalities

$$A \subseteq B \implies P(A) \le P(B).$$
$$P(A \cap B) \ge P(A) + P(B) - 1.$$

1.2 Change of variable in a probability density

If $X$ has the probability density $f_X(x)$, $Y = AX + b$, and $A$ is invertible, then $Y$ has the probability density
$$f_Y(y) = \frac{1}{|\det A|}\, f_X\!\left(A^{-1}(y - b)\right).$$
2 Continuous bivariate distributions

2.1 Bivariate densities

Definitions. The bivariate vector $(X, Y)^T$ has a continuous joint distribution with density $f_{X,Y}(x,y)$ if
$$P(X \le x, Y \le y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f_{X,Y}(u,v)\,du\,dv,$$
where
$$f_{X,Y}(x,y) \ge 0, \qquad \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} f_{X,Y}(x,y)\,dx\,dy = 1.$$

Marginal distributions:
$$f_X(x) = \int_{-\infty}^{+\infty} f_{X,Y}(x,y)\,dy, \qquad f_Y(y) = \int_{-\infty}^{+\infty} f_{X,Y}(x,y)\,dx.$$

Distribution function:
$$F_X(x) = P(X \le x) = \int_{-\infty}^{x} f_X(u)\,du, \qquad P(a < X \le b) = F_X(b) - F_X(a).$$

Conditional densities (of $X$ given $Y = y$ if $f_Y(y) > 0$, and of $Y$ given $X = x$ if $f_X(x) > 0$):
$$f_{X\mid Y=y}(x) := \frac{f_{X,Y}(x,y)}{f_Y(y)}, \qquad f_{Y\mid X=x}(y) := \frac{f_{X,Y}(x,y)}{f_X(x)}.$$
Bayes' formula:
$$f_{X\mid Y=y}(x) = \frac{f_{Y\mid X=x}(y)\, f_X(x)}{f_Y(y)}, \qquad f_Y(y) = \int_{-\infty}^{+\infty} f_{Y\mid X=x}(y)\, f_X(x)\,dx.$$
Here $f_{X\mid Y=y}(x)$ is the a posteriori density for $X$ and $f_X(x)$ is the a priori density for $X$.

Independence. $X$ and $Y$ are independent iff $f_{X,Y}(x,y) = f_X(x)\, f_Y(y)$ for every $(x, y)$.

Conditional density of $X$ given an event $B$:
$$f_{X\mid B}(x) = \begin{cases} \dfrac{f_X(x)}{P(B)} & x \in B \\ 0 & \text{elsewhere.} \end{cases}$$

Normal distribution. If $X$ has the density $f_X(x; \mu, \sigma)$ defined by
$$f_X(x; \mu, \sigma) := \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}},$$
then $X \in N(\mu, \sigma^2)$. $X \in N(0, 1)$ is standard normal with density $\varphi(x) = f_X(x; 0, 1)$. The cumulative distribution function of $X \in N(0, 1)$ is, for $x > 0$,
$$\Phi(x) = \int_{-\infty}^{x} \varphi(t)\,dt = \frac{1}{2} + \int_{0}^{x} \varphi(t)\,dt,$$
and $\Phi(-x) = 1 - \Phi(x)$.
2.1.5 Numerical computation of Φ(x)

Approximate values of the cumulative distribution function of $X \in N(0, 1)$, $\Phi(x)$, can be calculated for $x > 0$ by $\Phi(x) = 1 - Q(x)$,
$$Q(x) = \int_{x}^{\infty} \varphi(t)\,dt,$$
where we use the following approximation¹:
$$Q(x) \approx \frac{1}{\left(1 - \frac{1}{\pi}\right)x + \frac{1}{\pi}\sqrt{x^2 + 2\pi}} \cdot \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}.$$

2.2 Mean and variance

The expectations or means $E(X)$, $E(Y)$ are defined (if they exist) by
$$E(X) = \int_{-\infty}^{+\infty} x f_X(x)\,dx, \qquad E(Y) = \int_{-\infty}^{+\infty} y f_Y(y)\,dy,$$
respectively. Variances $\mathrm{Var}(X)$, $\mathrm{Var}(Y)$ are defined as
$$\mathrm{Var}(X) = \int_{-\infty}^{+\infty} (x - E(X))^2 f_X(x)\,dx, \qquad \mathrm{Var}(Y) = \int_{-\infty}^{+\infty} (y - E(Y))^2 f_Y(y)\,dy,$$
respectively. We have $\mathrm{Var}(X) = E(X^2) - (E(X))^2$. A function $g(X)$ of a random variable has expectation
$$E(g(X)) = \int_{-\infty}^{+\infty} g(x) f_X(x)\,dx.$$

¹ P.O. Börjesson and C.E.W. Sundberg: Simple Approximations of the Error Function Q(x) for Communication Applications. IEEE Transactions on Communications, March 1979, pp.
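The accuracy of the approximation in 2.1.5 can be checked against the exact Q-function, which Python's standard library exposes through the complementary error function ($Q(x) = \tfrac{1}{2}\,\mathrm{erfc}(x/\sqrt{2})$). A minimal sketch, not part of the formula sheet:

```python
import math

def q_exact(x):
    # Q(x) = 1 - Phi(x), via the complementary error function
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def q_approx(x):
    # the approximation quoted in 2.1.5, valid for x > 0
    phi = math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
    denom = (1.0 - 1.0 / math.pi) * x + math.sqrt(x * x + 2.0 * math.pi) / math.pi
    return phi / denom

for x in (0.5, 1.0, 2.0, 3.0):
    print(x, q_exact(x), q_approx(x))
```

For $x$ between roughly 0.5 and 3 the relative error is on the order of a percent, shrinking as $x$ grows.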
2.3 Chebyshev's inequality

$$P\left(|X - E(X)| > \varepsilon\right) \le \frac{\mathrm{Var}(X)}{\varepsilon^2}.$$

2.4 Conditional expectations

The conditional expectation of $X$ given $Y = y$ is
$$E(X \mid Y = y) := \int_{-\infty}^{+\infty} x f_{X\mid Y=y}(x)\,dx.$$
Via $y \mapsto E(X \mid Y = y)$ this can be seen as a function of $Y$. We have
$$E(X) = E(E(X \mid Y)), \qquad \mathrm{Var}(X) = \mathrm{Var}(E(X \mid Y)) + E(\mathrm{Var}(X \mid Y)),$$
$$E\left[(Y - g(X))^2\right] = E\left[\mathrm{Var}[Y \mid X]\right] + E\left[\left(E[Y \mid X] - g(X)\right)^2\right].$$

2.5 Covariance

$$\mathrm{Cov}(X, Y) := E(XY) - E(X)E(Y) = E\left([X - E(X)][Y - E(Y)]\right) = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} (x - E(X))(y - E(Y))\, f_{X,Y}(x,y)\,dx\,dy.$$
We have
$$\mathrm{Var}\left(\sum_{i=1}^{n} a_i X_i\right) = \sum_{i=1}^{n} a_i^2 \mathrm{Var}(X_i) + 2\sum_{i=1}^{n-1}\sum_{j=i+1}^{n} a_i a_j \mathrm{Cov}(X_i, X_j),$$
$$\mathrm{Cov}\left(\sum_{i=1}^{n} a_i X_i, \sum_{j=1}^{m} b_j Y_j\right) = \sum_{i=1}^{n}\sum_{j=1}^{m} a_i b_j \mathrm{Cov}(X_i, Y_j).$$

2.6 Coefficient of correlation

The coefficient of correlation between $X$ and $Y$ is defined as
$$\rho := \rho_{X,Y} := \frac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}}.$$
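The variance decomposition $\mathrm{Var}(X) = \mathrm{Var}(E(X \mid Y)) + E(\mathrm{Var}(X \mid Y))$ of 2.4 can be verified numerically on a discrete example. The joint table below is hypothetical, chosen only for illustration; this is a sketch, not part of the sheet:

```python
# Numerical check of Var(X) = Var(E(X|Y)) + E(Var(X|Y)) on a small
# hypothetical joint pmf p(x, y), stored as a dict.
p = {(0, 0): 0.1, (1, 0): 0.2, (0, 1): 0.3, (2, 1): 0.4}

def mean_var(pairs):
    # pairs: list of (value, probability) with probabilities summing to 1
    m = sum(v * w for v, w in pairs)
    return m, sum((v - m) ** 2 * w for v, w in pairs)

# marginal of Y and conditional distributions of X given Y = y
p_y = {}
for (x, y), w in p.items():
    p_y[y] = p_y.get(y, 0.0) + w
cond = {y: [(x, w / p_y[y]) for (x, yy), w in p.items() if yy == y] for y in p_y}

# left-hand side: Var(X) from the marginal of X
p_x = {}
for (x, y), w in p.items():
    p_x[x] = p_x.get(x, 0.0) + w
_, var_x = mean_var(list(p_x.items()))

# right-hand side: Var(E(X|Y)) + E(Var(X|Y))
cond_stats = {y: mean_var(cond[y]) for y in p_y}
e_var = sum(p_y[y] * cond_stats[y][1] for y in p_y)
_, var_e = mean_var([(cond_stats[y][0], p_y[y]) for y in p_y])
```

Here `var_x` agrees with `var_e + e_var`, as the theorem asserts.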
3 Best linear prediction

The $\alpha$ and $\beta$ that minimize $E\left[(Y - (\alpha + \beta X))^2\right]$ are given by
$$\alpha = \mu_Y - \frac{\sigma_{XY}}{\sigma_X^2}\,\mu_X = \mu_Y - \rho\,\frac{\sigma_Y}{\sigma_X}\,\mu_X, \qquad \beta = \frac{\sigma_{XY}}{\sigma_X^2} = \rho\,\frac{\sigma_Y}{\sigma_X},$$
where $\mu_Y = E[Y]$, $\mu_X = E[X]$, $\sigma_Y^2 = \mathrm{Var}[Y]$, $\sigma_X^2 = \mathrm{Var}[X]$, $\sigma_{XY} = \mathrm{Cov}(X, Y)$, and $\rho = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}$.

4 Covariance matrix

4.1 Definition

$$C_X := E\left[(X - \mu_X)(X - \mu_X)^T\right],$$
where the element $(i, j)$,
$$C_X(i, j) = E\left[(X_i - \mu_i)(X_j - \mu_j)\right],$$
is the covariance between $X_i$ and $X_j$. The covariance matrix is nonnegative definite, i.e., for all $x$ we have $x^T C_X x \ge 0$; hence $\det C_X \ge 0$. The covariance matrix is symmetric: $C_X = C_X^T$.
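The formulas of section 3 can be applied in plug-in form, replacing population moments by sample moments. The data below are hypothetical; a sketch, not part of the sheet:

```python
# Plug-in version of the best linear predictor of section 3: population
# moments replaced by sample moments over hypothetical data (xs, ys).
def best_linear_predictor(xs, ys):
    n = len(xs)
    mu_x = sum(xs) / n
    mu_y = sum(ys) / n
    var_x = sum((x - mu_x) ** 2 for x in xs) / n
    cov_xy = sum((x - mu_x) * (y - mu_y) for x, y in zip(xs, ys)) / n
    beta = cov_xy / var_x          # beta = sigma_XY / sigma_X^2
    alpha = mu_y - beta * mu_x     # alpha = mu_Y - beta * mu_X
    return alpha, beta

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.0]
alpha, beta = best_linear_predictor(xs, ys)
residuals = [y - (alpha + beta * x) for x, y in zip(xs, ys)]
```

The residuals sum to zero and are orthogonal to the $x$-values, which is exactly the first-order condition characterizing the minimizing line.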
Covariance Matrix

The covariance matrix of a bivariate random variable $X = (X_1, X_2)^T$ is
$$C_X = \begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix},$$
where $\rho$ is the coefficient of correlation of $X_1$ and $X_2$, and $\sigma_1^2 = \mathrm{Var}(X_1)$, $\sigma_2^2 = \mathrm{Var}(X_2)$. $C_X$ is invertible iff $\rho^2 \neq 1$; then the inverse is
$$C_X^{-1} = \frac{1}{\sigma_1^2\sigma_2^2(1 - \rho^2)} \begin{pmatrix} \sigma_2^2 & -\rho\sigma_1\sigma_2 \\ -\rho\sigma_1\sigma_2 & \sigma_1^2 \end{pmatrix}.$$

5 Discrete Random Variables

$X$ is a (discrete) random variable that assumes values in $\mathcal{X}$ and $Y$ is a (discrete) random variable that assumes values in $\mathcal{Y}$.

Remark 5.1 These are measurable maps $X(\omega)$, $\omega \in \Omega$, from a basic probability space $(\Omega, \mathcal{F}, P)$ (= outcomes, a sigma field of subsets of $\Omega$, and a probability measure $P$ on $\mathcal{F}$) to $\mathcal{X}$.

$\mathcal{X}$ and $\mathcal{Y}$ are two discrete state spaces, whose generic elements are called values or instantiations and denoted by $x_i$ and $y_j$, respectively:
$$\mathcal{X} = \{x_1, \ldots, x_L\}, \qquad \mathcal{Y} = \{y_1, \ldots, y_J\},$$
$$|\mathcal{X}| := (\text{the number of elements in } \mathcal{X}) = L, \qquad |\mathcal{Y}| = J.$$
Unless otherwise stated the alphabets considered here are finite.

5.1 Joint Probability Distributions

A two-dimensional joint (simultaneous) probability distribution is a probability defined on $\mathcal{X} \times \mathcal{Y}$,
$$p(x_i, y_j) := P(X = x_i, Y = y_j). \tag{5.1}$$
Hence $0 \le p(x_i, y_j)$ and $\sum_{i=1}^{L}\sum_{j=1}^{J} p(x_i, y_j) = 1$.

Marginal distribution for $X$:
$$p(x_i) = \sum_{j=1}^{J} p(x_i, y_j). \tag{5.2}$$
Marginal distribution for $Y$:
$$p(y_j) = \sum_{i=1}^{L} p(x_i, y_j). \tag{5.3}$$
These notions can be extended to define the joint (simultaneous) probability distribution and the marginal distributions of $n$ random variables.

5.2 Conditional Probability Distributions

The conditional probability for $X = x_i$ given $Y = y_j$ is
$$p(x_i \mid y_j) := \frac{p(x_i, y_j)}{p(y_j)}. \tag{5.4}$$
The conditional probability for $Y = y_j$ given $X = x_i$ is
$$p(y_j \mid x_i) := \frac{p(x_i, y_j)}{p(x_i)}. \tag{5.5}$$
Here we assume $p(y_j) > 0$ and $p(x_i) > 0$. If for example $p(x_i) = 0$, we can make the definition of $p(y_j \mid x_i)$ arbitrary through $p(x_i)\, p(y_j \mid x_i) = p(x_i, y_j)$. In other words,
$$p(y_j \mid x_i) = \frac{\text{prob. of the event } \{X = x_i, Y = y_j\}}{\text{prob. of the event } \{X = x_i\}}.$$
Hence
$$\sum_{i=1}^{L} p(x_i \mid y_j) = 1.$$
Next,
$$P_X(A) := \sum_{x_i \in A} p(x_i) \tag{5.6}$$
is the probability of the event that $X$ assumes a value in $A$, a subset of $\mathcal{X}$. From (5.6) one easily finds the complement rule
$$P_X(A^c) = 1 - P_X(A), \tag{5.7}$$
where $A^c$ is the complement of $A$, i.e., those outcomes which do not lie in $A$. Also
$$P_X(A \cup B) = P_X(A) + P_X(B) - P_X(A \cap B) \tag{5.8}$$
is immediate.
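The marginal and conditional formulas (5.2)-(5.5) translate directly into code when the joint pmf is stored as a table. The joint table below is a hypothetical example; a sketch, not part of the sheet:

```python
# Marginals (5.2)-(5.3) and conditionals (5.4) computed from a joint pmf
# p(x_i, y_j) stored as a dict keyed by (x, y).
p = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

def marginal(p, axis):
    # axis = 0 keeps x (sums out y); axis = 1 keeps y (sums out x)
    m = {}
    for key, w in p.items():
        m[key[axis]] = m.get(key[axis], 0.0) + w
    return m

def conditional_x_given_y(p, y):
    # p(x | y) = p(x, y) / p(y), assuming p(y) > 0
    p_y = marginal(p, 1)[y]
    return {x: w / p_y for (x, yy), w in p.items() if yy == y}
```

Each conditional distribution sums to 1, in line with $\sum_i p(x_i \mid y_j) = 1$.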
5.3 Conditional Probability Given an Event

The conditional probability for $X = x_i$ given $X \in A$ is denoted by $P_X(x_i \mid A)$ and given by
$$P_X(x_i \mid A) = \begin{cases} \dfrac{P_X(x_i)}{P_X(A)} & \text{if } x_i \in A \\ 0 & \text{otherwise.} \end{cases} \tag{5.9}$$

5.4 Independence

$X$ and $Y$ are independent random variables if and only if
$$p(x_i, y_j) = p(x_i)\, p(y_j) \tag{5.10}$$
for all pairs $(x_i, y_j)$ in $\mathcal{X} \times \mathcal{Y}$. In other words, all events $\{X = x_i\}$ and $\{Y = y_j\}$ are to be independent.

We say that $X_1, X_2, \ldots, X_n$ are independent random variables if and only if the joint distribution
$$p_{X_1, X_2, \ldots, X_n}(x_{i_1}, x_{i_2}, \ldots, x_{i_n}) = P(X_1 = x_{i_1}, X_2 = x_{i_2}, \ldots, X_n = x_{i_n}) \tag{5.11}$$
equals
$$p_{X_1, X_2, \ldots, X_n}(x_{i_1}, x_{i_2}, \ldots, x_{i_n}) = p_{X_1}(x_{i_1})\, p_{X_2}(x_{i_2}) \cdots p_{X_n}(x_{i_n}) \tag{5.12}$$
for every $(x_{i_1}, x_{i_2}, \ldots, x_{i_n}) \in \mathcal{X}^n$. We are here assuming for simplicity that $X_1, X_2, \ldots, X_n$ take values in the same alphabet.

5.5 A Chain Rule

Let $Z$ be a (discrete) random variable that assumes values in $\mathcal{Z} = \{z_k\}_{k=1}^{K}$. If $p(z_k) > 0$,
$$p(x_i, y_j \mid z_k) = \frac{p(x_i, y_j, z_k)}{p(z_k)}.$$
Then we obtain as an identity
$$p(x_i, y_j \mid z_k) = \frac{p(x_i, y_j, z_k)}{p(y_j, z_k)} \cdot \frac{p(y_j, z_k)}{p(z_k)},$$
and again by definition of conditional probability the right hand side is equal to $p(x_i \mid y_j, z_k)\, p(y_j \mid z_k)$. In other words,
$$p_{X,Y\mid Z}(x_i, y_j \mid z_k) = p(x_i \mid y_j, z_k)\, p(y_j \mid z_k). \tag{5.13}$$
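The chain rule (5.13) can be traced step by step on a three-variable pmf, mirroring the derivation above. The joint table is a hypothetical example; a sketch, not part of the sheet:

```python
# Check of the chain rule (5.13), p(x, y | z) = p(x | y, z) p(y | z),
# on a hypothetical joint pmf over (x, y, z).
p3 = {(0, 0, 0): 0.10, (0, 1, 0): 0.20, (1, 0, 0): 0.10,
      (0, 0, 1): 0.15, (1, 0, 1): 0.20, (1, 1, 1): 0.25}

def marg(p, keep):
    # marginal pmf over the coordinate positions listed in `keep`
    m = {}
    for key, w in p.items():
        sub = tuple(key[i] for i in keep)
        m[sub] = m.get(sub, 0.0) + w
    return m

p_z = marg(p3, (2,))
p_yz = marg(p3, (1, 2))

x, y, z = 0, 1, 0
lhs = p3[(x, y, z)] / p_z[(z,)]                                    # p(x, y | z)
rhs = (p3[(x, y, z)] / p_yz[(y, z)]) * (p_yz[(y, z)] / p_z[(z,)])  # p(x|y,z) p(y|z)
```

The factor $p(y_j, z_k)$ cancels between the two conditional probabilities, which is exactly the identity used in the derivation.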
5.6 Conditional Independence

The random variables $X$ and $Y$ are called conditionally independent given $Z$ if
$$p(x_i, y_j \mid z_k) = p(x_i \mid z_k)\, p(y_j \mid z_k) \tag{5.14}$$
for all triples $(z_k, x_i, y_j) \in \mathcal{Z} \times \mathcal{X} \times \mathcal{Y}$ (cf. (5.13)).

6 Miscellaneous

6.1 A Marginalization Formula

Let $Y$ be discrete and $X$ be continuous, and let their joint distribution be
$$P(Y = k, X \le x) = \int_{-\infty}^{x} P(Y = k \mid X = u)\, f_X(u)\,du.$$
Then
$$P(Y = k) = \int_{-\infty}^{+\infty} P(Y = k \mid X = x)\, f_X(x)\,dx.$$

6.2 Factorial Moments

$X$ is an integer-valued discrete r.v.;
$$\mu_{[r]} \stackrel{\text{def}}{=} E\left[X(X-1)\cdots(X-r+1)\right] = \sum_{x \,\text{integer}} x(x-1)\cdots(x-r+1)\, f_X(x)$$
is called the $r$:th factorial moment.

6.3 Binomial Moments

$X$ is an integer-valued discrete r.v.;
$$E\binom{X}{r} = E\left[X(X-1)\cdots(X-r+1)\right]/r!$$
is called the binomial moment.
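The factorial and binomial moments of 6.2-6.3 can be computed directly from a pmf and checked against a known closed form. For $X \in \mathrm{Bin}(n, p)$ the second factorial moment is $n(n-1)p^2$; the sketch below (not part of the sheet) verifies this for $n = 4$, $p = 1/2$:

```python
import math

# Factorial moments (6.2) and binomial moments (6.3) computed directly
# from a pmf, here the Bin(4, 0.5) pmf built from binomial coefficients.
n, p = 4, 0.5
pmf = {k: math.comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)}

def factorial_moment(pmf, r):
    # mu_[r] = E[X(X-1)...(X-r+1)]
    total = 0.0
    for x, px in pmf.items():
        term = 1.0
        for i in range(r):
            term *= x - i
        total += term * px
    return total

def binomial_moment(pmf, r):
    # E[binom(X, r)] = mu_[r] / r!
    return factorial_moment(pmf, r) / math.factorial(r)
```

Here $\mu_{[2]} = 4 \cdot 3 \cdot (1/2)^2 = 3$ and the binomial moment is $3/2! = 1.5$.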
7 Transforms

7.1 Probability Generating Function

Definition. Let $X$ have values $k = 0, 1, 2, \ldots$. Then
$$g_X(t) = E\left(t^X\right) = \sum_{k=0}^{\infty} t^k f_X(k)$$
is called the probability generating function.

Prob. Gen. Fnct: Properties.
$$\frac{d}{dt} g_X(1) = \sum_{k=1}^{\infty} k t^{k-1} f_X(k)\Big|_{t=1} = E[X],$$
$$\frac{1}{n!}\frac{d^n}{dt^n} g_X(0) = P(X = n),$$
$$\mu_{[r]} = E\left[X(X-1)\cdots(X-r+1)\right] = \frac{d^r}{dt^r} g_X(1).$$
If $Z = X + Y$ with $X$ and $Y$ nonnegative integer valued and independent, then
$$g_Z(t) = E\left(t^Z\right) = E\left(t^{X+Y}\right) = E\left(t^X\right) E\left(t^Y\right) = g_X(t)\, g_Y(t).$$
7.1.4 Prob. Gen. Fnct: Examples

$$X \in \mathrm{Be}(p): \quad g_X(t) = 1 - p + pt,$$
$$Y \in \mathrm{Bin}(n, p): \quad g_Y(t) = (1 - p + pt)^n,$$
$$Z \in \mathrm{Po}(\lambda): \quad g_Z(t) = e^{\lambda(t - 1)}.$$

7.2 Moment Generating Functions

Definition. The moment generating function is, for some $h > 0$,
$$\psi_X(t) \stackrel{\text{def}}{=} E\left[e^{tX}\right], \quad |t| < h,$$
$$\psi_X(t) = \begin{cases} \sum_{x_i} e^{t x_i} f_X(x_i) & X \text{ discrete} \\ \int_{-\infty}^{+\infty} e^{tx} f_X(x)\,dx & X \text{ continuous.} \end{cases}$$

Moment Gen. Fnctn: Properties.
$$\frac{d}{dt}\psi_X(0) = E[X], \qquad \psi_X(0) = 1, \qquad \frac{d^k}{dt^k}\psi_X(0) = E\left[X^k\right].$$
If $S_n = X_1 + X_2 + \cdots + X_n$ with the $X_i$ independent and identically distributed, then
$$\psi_{S_n}(t) = E\left(e^{tS_n}\right) = E\left(e^{t(X_1 + X_2 + \cdots + X_n)}\right) = E\left(e^{tX_1}\right) E\left(e^{tX_2}\right) \cdots E\left(e^{tX_n}\right) = \psi_{X_1}(t)\,\psi_{X_2}(t) \cdots \psi_{X_n}(t) = \left(\psi_X(t)\right)^n.$$
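Two pgf facts from 7.1 lend themselves to a quick numerical check: the pgf of $\mathrm{Bin}(n, p)$ is the pgf of $\mathrm{Be}(p)$ raised to the $n$th power (a sum of $n$ i.i.d. Bernoullis), and $g_X'(1) = E[X]$, here estimated for $\mathrm{Po}(\lambda)$ by a central finite difference at $t = 1$. A sketch, not part of the sheet:

```python
import math

# pgfs of the three examples in 7.1.4
def pgf_bernoulli(t, p):
    return 1 - p + p * t

def pgf_binomial(t, n, p):
    return (1 - p + p * t) ** n

def pgf_poisson(t, lam):
    return math.exp(lam * (t - 1))

# g_X'(1) = E[X]: central finite difference of the Poisson pgf at t = 1,
# which should be close to lam
lam, h = 2.5, 1e-6
mean_est = (pgf_poisson(1 + h, lam) - pgf_poisson(1 - h, lam)) / (2 * h)
```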
7.2.3 Moment Gen. Fnctn: Examples

$$X \in N(\mu, \sigma^2): \quad \psi_X(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2},$$
$$Y \in \mathrm{Exp}(a): \quad \psi_Y(t) = \frac{1}{1 - at}, \quad at < 1.$$

Characteristic function. The characteristic function
$$\varphi_X(t) = E\left[e^{itX}\right]$$
exists for all $t$.

7.3 Moment generating function, characteristic function of a vector random variable

The moment generating function of $X$ ($n \times 1$ vector) is defined as
$$\psi_X(t) \stackrel{\text{def}}{=} E e^{t^T X} = E e^{t_1 X_1 + t_2 X_2 + \cdots + t_n X_n}.$$
The characteristic function of $X$ is defined as
$$\varphi_X(t) \stackrel{\text{def}}{=} E e^{i t^T X} = E e^{i(t_1 X_1 + t_2 X_2 + \cdots + t_n X_n)}.$$

8 Multivariate normal distribution

An $n \times 1$ random vector $X$ has a normal distribution iff for every $n \times 1$ vector $a$ the one-dimensional random variable $a^T X$ has a normal distribution. When the vector $X = (X_1, X_2, \ldots, X_n)^T$ has a multivariate normal distribution we write
$$X \in N(\mu, C). \tag{8.1}$$
The moment generating function is
$$\psi_X(s) = e^{s^T \mu + \frac{1}{2} s^T C s}. \tag{8.2}$$
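For $n = 1$ the moment generating function (8.2) reduces to the $N(\mu, \sigma^2)$ example of 7.2.3, and its derivatives at 0 give the moments (see 7.2). The sketch below (not part of the sheet) checks this with finite differences:

```python
import math

# Finite-difference check that the derivatives of psi_X at 0 give the
# moments, using psi_X(t) = exp(mu t + sigma^2 t^2 / 2) for X in N(mu, sigma^2)
# from 7.2.3 (equivalently, (8.2) specialized to dimension 1).
mu, sigma = 1.5, 2.0

def psi(t):
    return math.exp(mu * t + 0.5 * sigma ** 2 * t ** 2)

h = 1e-5
first = (psi(h) - psi(-h)) / (2 * h)                 # approx E[X] = mu
second = (psi(h) - 2 * psi(0.0) + psi(-h)) / h ** 2  # approx E[X^2] = mu^2 + sigma^2
```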
The characteristic function of $X \in N(\mu, C)$ is
$$\varphi_X(t) = E e^{i t^T X} = e^{i t^T \mu - \frac{1}{2} t^T C t}. \tag{8.3}$$
Let $X \in N(\mu, C)$ and
$$Y = a + BX$$
for an arbitrary $m \times n$ matrix $B$ and an arbitrary $m \times 1$ vector $a$. Then
$$Y \in N\left(a + B\mu,\; BCB^T\right). \tag{8.4}$$

8.1 χ²-distribution for quadratic form of normal r.v.

Let $X \in N(\mu, C)$ with $C$ invertible. Then
$$(X - \mu)^T C^{-1} (X - \mu) \in \chi^2(n),$$
where $n$ is the dimension of $X$.

8.2 Four product rule

If $(X_1, X_2, X_3, X_4)^T \in N(0, C)$, then
$$E[X_1 X_2 X_3 X_4] = E[X_1 X_2]\, E[X_3 X_4] + E[X_1 X_3]\, E[X_2 X_4] + E[X_1 X_4]\, E[X_2 X_3].$$

8.3 Conditional distributions for bivariate normal random variables

Let $(X, Y)^T \in N(\mu, C)$. The conditional distribution for $Y$ given $X = x$ is normal,
$$N\left(\mu_Y + \rho\,\frac{\sigma_Y}{\sigma_X}(x - \mu_X),\; \sigma_Y^2(1 - \rho^2)\right),$$
where $\mu_Y = E(Y)$, $\mu_X = E(X)$, $\sigma_Y = \sqrt{\mathrm{Var}(Y)}$, $\sigma_X = \sqrt{\mathrm{Var}(X)}$ and
$$\rho = \frac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}}.$$
Let $Z_1$ and $Z_2$ be independent $N(0, 1)$, and set
$$\mu = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \qquad B = \begin{pmatrix} \sigma_1 & 0 \\ \rho\sigma_2 & \sigma_2\sqrt{1 - \rho^2} \end{pmatrix}.$$
If
$$\begin{pmatrix} X \\ Y \end{pmatrix} = \mu + B \begin{pmatrix} Z_1 \\ Z_2 \end{pmatrix},$$
then
$$\begin{pmatrix} X \\ Y \end{pmatrix} \in N(\mu, C)$$
with
$$C = \begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix}.$$

9 Poisson process

$X(t)$ = number of occurrences of some event in $(0, t]$.

Definition 9.1 $\{X(t) \mid t \ge 0\}$ is a Poisson process with parameter $\lambda > 0$ if

1) $X(0) = 0$.
2) The increments $X(t_k) - X(t_{k-1})$ are independent stochastic variables, $1 \le k \le n$, $0 \le t_0 \le t_1 \le t_2 \le \ldots \le t_{n-1} \le t_n$, and all $n$.
3) $X(t) - X(s) \in \mathrm{Po}(\lambda(t - s))$, $0 \le s < t$.

Let $T_k$ = the time of occurrence of the $k$th event, $T_0 = 0$. We have
$$\{T_k \le t\} = \{X(t) \ge k\}.$$
$\tau_k = T_k - T_{k-1}$ is the $k$th interoccurrence time. $\tau_1, \tau_2, \ldots, \tau_k, \ldots$ are independent and identically distributed, $\tau_i \in \mathrm{Exp}\left(\frac{1}{\lambda}\right)$.

10 Convergence

10.1 Definitions

We say that $X_n \xrightarrow{P} X$, as $n \to \infty$,
if for all $\varepsilon > 0$
$$P\left(|X_n - X| > \varepsilon\right) \to 0, \quad \text{as } n \to \infty.$$
We say that $X_n \xrightarrow{q} X$ if
$$E|X_n - X|^2 \to 0, \quad \text{as } n \to \infty.$$
We say that $X_n \xrightarrow{d} X$, as $n \to \infty$, if
$$F_{X_n}(x) \to F_X(x), \quad \text{as } n \to \infty,$$
for all $x$ where $F_X(x)$ is continuous.

10.2 Relations between convergences

As $n \to \infty$, with $c$ a constant:
$$X_n \xrightarrow{q} X \implies X_n \xrightarrow{P} X, \qquad X_n \xrightarrow{P} X \implies X_n \xrightarrow{d} X, \qquad X_n \xrightarrow{P} c \iff X_n \xrightarrow{d} c.$$
If $\varphi_{X_n}(t)$ are the characteristic functions of $X_n$, then
$$X_n \xrightarrow{d} X \implies \varphi_{X_n}(t) \to \varphi_X(t).$$
If $\varphi_X(t)$ is a characteristic function continuous at $t = 0$, then
$$\varphi_{X_n}(t) \to \varphi_X(t) \implies X_n \xrightarrow{d} X.$$
10.3 Law of Large Numbers

$X_1, X_2, \ldots$ are independent, identically distributed (i.i.d.) random variables with finite expectation $\mu$. We set $S_n = X_1 + X_2 + \cdots + X_n$, $n \ge 1$. Then
$$\frac{S_n}{n} \xrightarrow{P} \mu, \quad \text{as } n \to \infty.$$

10.4 Central Limit Theorem

$X_1, X_2, \ldots$ are independent, identically distributed (i.i.d.) random variables with finite expectation $\mu$ and finite variance $\sigma^2$. We set $S_n = X_1 + X_2 + \cdots + X_n$, $n \ge 1$. Then
$$\frac{S_n - n\mu}{\sigma\sqrt{n}} \xrightarrow{d} N(0, 1), \quad \text{as } n \to \infty.$$

11 Series Expansions and Integrals

11.1 Exponential Function

$$e^x = \sum_{k=0}^{\infty} \frac{x^k}{k!}, \quad -\infty < x < \infty.$$
$$c_n \to c \implies \left(1 + \frac{c_n}{n}\right)^n \to e^c.$$

11.2 Geometric Series

$$\frac{1}{1 - x} = \sum_{k=0}^{\infty} x^k, \quad |x| < 1,$$
$$\frac{1}{(1 - x)^2} = \sum_{k=0}^{\infty} k x^{k-1}, \quad |x| < 1,$$
$$\sum_{k=0}^{n} x^k = \frac{1 - x^{n+1}}{1 - x}, \quad x \neq 1.$$
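The Law of Large Numbers of 10.3 can be illustrated by simulation: the sample mean of many i.i.d. $\mathrm{Exp}(1)$ variables ($\mu = 1$) lands close to $\mu$. The sketch below (seeded for reproducibility, not part of the sheet):

```python
import random

# Monte Carlo illustration of the Law of Large Numbers (10.3): the sample
# mean of n i.i.d. Exp(1) variables (mu = 1) is close to mu for large n.
random.seed(12345)
n = 100_000
sample_mean = sum(random.expovariate(1.0) for _ in range(n)) / n
```

By the Central Limit Theorem of 10.4 the deviation from $\mu$ is of order $\sigma/\sqrt{n} \approx 0.003$ here, so a tolerance of a few hundredths is comfortably wide.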
11.3 Logarithm function

$$\ln(1 - x) = -\sum_{k=1}^{\infty} \frac{x^k}{k}, \quad -1 \le x < 1.$$

11.4 Euler Gamma Function

$$\Gamma(t) = \int_{0}^{\infty} x^{t-1} e^{-x}\,dx, \quad t > 0,$$
$$\Gamma\left(\frac{1}{2}\right) = \sqrt{\pi}, \qquad \Gamma(n) = (n - 1)!, \quad n \text{ a positive integer},$$
$$\int_{0}^{\infty} x^{t} e^{-\lambda x}\,dx = \frac{\Gamma(t + 1)}{\lambda^{t+1}}, \quad \lambda > 0, \; t > -1.$$

11.5 A formula (with a probabilistic proof)

$$\int_{t}^{\infty} \frac{1}{\Gamma(k)}\, \lambda^k x^{k-1} e^{-\lambda x}\,dx = \sum_{j=0}^{k-1} e^{-\lambda t}\, \frac{(\lambda t)^j}{j!}.$$
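The identity in 11.5 says that the tail probability of a $\mathrm{Gamma}(k, \lambda)$ variable (the time $T_k$ of the $k$th event of a Poisson process) equals $P(X(t) \le k - 1)$ for a $\mathrm{Po}(\lambda t)$ count, cf. section 9. It can be confirmed numerically; in this sketch (not part of the sheet) the improper integral is truncated at $x = 20$, where the remaining tail is negligible, and evaluated by composite Simpson's rule:

```python
import math

# Numerical check of the identity in 11.5 for lam = 2, k = 3, t = 1.
lam, k, t = 2.0, 3, 1.0

def integrand(x):
    # Gamma(k, lam) density
    return lam ** k * x ** (k - 1) * math.exp(-lam * x) / math.gamma(k)

# composite Simpson's rule on [t, 20]; n must be even
a, b, n = t, 20.0, 20000
h = (b - a) / n
s = integrand(a) + integrand(b)
for i in range(1, n):
    s += (4 if i % 2 else 2) * integrand(a + i * h)
lhs = s * h / 3

# right-hand side: Poisson cdf sum over j = 0, ..., k-1
rhs = sum(math.exp(-lam * t) * (lam * t) ** j / math.factorial(j) for j in range(k))
```

For these parameters both sides equal $5e^{-2} \approx 0.6767$.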
More informationIntroduction to Computational Finance and Financial Econometrics Probability Review - Part 2
You can t see this text! Introduction to Computational Finance and Financial Econometrics Probability Review - Part 2 Eric Zivot Spring 2015 Eric Zivot (Copyright 2015) Probability Review - Part 2 1 /
More informationLet X and Y denote two random variables. The joint distribution of these random
EE385 Class Notes 9/7/0 John Stensby Chapter 3: Multiple Random Variables Let X and Y denote two random variables. The joint distribution of these random variables is defined as F XY(x,y) = [X x,y y] P.
More informationt x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.
Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae
More informationIntroduction to Normal Distribution
Introduction to Normal Distribution Nathaniel E. Helwig Assistant Professor of Psychology and Statistics University of Minnesota (Twin Cities) Updated 17-Jan-2017 Nathaniel E. Helwig (U of Minnesota) Introduction
More informationMoments. Raw moment: February 25, 2014 Normalized / Standardized moment:
Moments Lecture 10: Central Limit Theorem and CDFs Sta230 / Mth 230 Colin Rundel Raw moment: Central moment: µ n = EX n ) µ n = E[X µ) 2 ] February 25, 2014 Normalized / Standardized moment: µ n σ n Sta230
More informationStat 5101 Notes: Brand Name Distributions
Stat 5101 Notes: Brand Name Distributions Charles J. Geyer September 5, 2012 Contents 1 Discrete Uniform Distribution 2 2 General Discrete Uniform Distribution 2 3 Uniform Distribution 3 4 General Uniform
More informationReview: mostly probability and some statistics
Review: mostly probability and some statistics C2 1 Content robability (should know already) Axioms and properties Conditional probability and independence Law of Total probability and Bayes theorem Random
More information4. CONTINUOUS RANDOM VARIABLES
IA Probability Lent Term 4 CONTINUOUS RANDOM VARIABLES 4 Introduction Up to now we have restricted consideration to sample spaces Ω which are finite, or countable; we will now relax that assumption We
More informationLecture 11. Multivariate Normal theory
10. Lecture 11. Multivariate Normal theory Lecture 11. Multivariate Normal theory 1 (1 1) 11. Multivariate Normal theory 11.1. Properties of means and covariances of vectors Properties of means and covariances
More informationELEMENTS OF PROBABILITY THEORY
ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable
More informationReview of probability
Review of probability Computer Sciences 760 Spring 2014 http://pages.cs.wisc.edu/~dpage/cs760/ Goals for the lecture you should understand the following concepts definition of probability random variables
More informationMAS223 Statistical Inference and Modelling Exercises
MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,
More informationLycka till!
Avd. Matematisk statistik TENTAMEN I SF294 SANNOLIKHETSTEORI/EXAM IN SF294 PROBABILITY THE- ORY MONDAY THE 14 T H OF JANUARY 28 14. p.m. 19. p.m. Examinator : Timo Koski, tel. 79 71 34, e-post: timo@math.kth.se
More informationECON 5350 Class Notes Review of Probability and Distribution Theory
ECON 535 Class Notes Review of Probability and Distribution Theory 1 Random Variables Definition. Let c represent an element of the sample space C of a random eperiment, c C. A random variable is a one-to-one
More informationPractice Examination # 3
Practice Examination # 3 Sta 23: Probability December 13, 212 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use a single
More informationFinal Exam # 3. Sta 230: Probability. December 16, 2012
Final Exam # 3 Sta 230: Probability December 16, 2012 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use the extra sheets
More informationFundamentals of Digital Commun. Ch. 4: Random Variables and Random Processes
Fundamentals of Digital Commun. Ch. 4: Random Variables and Random Processes Klaus Witrisal witrisal@tugraz.at Signal Processing and Speech Communication Laboratory www.spsc.tugraz.at Graz University of
More informationStatistics 351 Probability I Fall 2006 (200630) Final Exam Solutions. θ α β Γ(α)Γ(β) (uv)α 1 (v uv) β 1 exp v }
Statistics 35 Probability I Fall 6 (63 Final Exam Solutions Instructor: Michael Kozdron (a Solving for X and Y gives X UV and Y V UV, so that the Jacobian of this transformation is x x u v J y y v u v
More informationLecture 6: Special probability distributions. Summarizing probability distributions. Let X be a random variable with probability distribution
Econ 514: Probability and Statistics Lecture 6: Special probability distributions Summarizing probability distributions Let X be a random variable with probability distribution P X. We consider two types
More informationBMIR Lecture Series on Probability and Statistics Fall 2015 Discrete RVs
Lecture #7 BMIR Lecture Series on Probability and Statistics Fall 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University 7.1 Function of Single Variable Theorem
More informationTom Salisbury
MATH 2030 3.00MW Elementary Probability Course Notes Part V: Independence of Random Variables, Law of Large Numbers, Central Limit Theorem, Poisson distribution Geometric & Exponential distributions Tom
More informationChapter 4 : Expectation and Moments
ECE5: Analysis of Random Signals Fall 06 Chapter 4 : Expectation and Moments Dr. Salim El Rouayheb Scribe: Serge Kas Hanna, Lu Liu Expected Value of a Random Variable Definition. The expected or average
More informationLecture 25: Review. Statistics 104. April 23, Colin Rundel
Lecture 25: Review Statistics 104 Colin Rundel April 23, 2012 Joint CDF F (x, y) = P [X x, Y y] = P [(X, Y ) lies south-west of the point (x, y)] Y (x,y) X Statistics 104 (Colin Rundel) Lecture 25 April
More informationFundamental Tools - Probability Theory II
Fundamental Tools - Probability Theory II MSc Financial Mathematics The University of Warwick September 29, 2015 MSc Financial Mathematics Fundamental Tools - Probability Theory II 1 / 22 Measurable random
More information