Multivariate random variables
1 Multivariate random variables
DS-GA 1002 Statistical and Mathematical Models
Carlos Fernandez-Granda
2 Joint distributions
Tool to characterize several uncertain numerical quantities of interest within the same probabilistic model
We can group the variables into random vectors
X = [X_1, X_2, ..., X_n]ᵀ
3 Discrete random variables · Continuous random variables · Joint distributions of discrete and continuous random variables
4 Joint probability mass function
The joint pmf of X and Y is defined as
p_{X,Y}(x, y) := P(X = x, Y = y)
It is the probability of X and Y being equal to x and y respectively
By the definition of a probability measure,
p_{X,Y}(x, y) ≥ 0 for any x ∈ R_X, y ∈ R_Y
Σ_{x ∈ R_X} Σ_{y ∈ R_Y} p_{X,Y}(x, y) = 1
5 Joint probability mass function
The joint pmf of a discrete random vector X is
p_X(x) := P(X_1 = x_1, X_2 = x_2, ..., X_n = x_n)
It is the probability of X being equal to x
By the definition of a probability measure,
p_X(x) ≥ 0
Σ_{x_1 ∈ R_1} Σ_{x_2 ∈ R_2} ... Σ_{x_n ∈ R_n} p_X(x) = 1
6 Joint probability mass function
By the Law of Total Probability, for any set S ⊆ R_X × R_Y,
P((X, Y) ∈ S) = P(∪_{(x,y) ∈ S} {X = x, Y = y}) (union of disjoint events)
= Σ_{(x,y) ∈ S} P(X = x, Y = y)
= Σ_{(x,y) ∈ S} p_{X,Y}(x, y)
Similarly, for any discrete set S ⊆ Rⁿ,
P(X ∈ S) = Σ_{x ∈ S} p_X(x)
7 Marginalization
To compute the marginal pmf of X from the joint pmf p_{X,Y}:
p_X(x) = P(X = x)
= P(∪_{y ∈ R_Y} {X = x, Y = y}) (union of disjoint events)
= Σ_{y ∈ R_Y} P(X = x, Y = y)
= Σ_{y ∈ R_Y} p_{X,Y}(x, y)
This is called marginalizing over Y
8 Marginalization
Marginal pmf of a subvector X_I, I ⊆ {1, 2, ..., n}:
p_{X_I}(x_I) = Σ_{x_{j_1} ∈ R_{j_1}} Σ_{x_{j_2} ∈ R_{j_2}} ... Σ_{x_{j_{n−m}} ∈ R_{j_{n−m}}} p_X(x)
where {j_1, j_2, ..., j_{n−m}} := {1, 2, ..., n} \ I
9 Conditional probability mass function
The conditional pmf of Y given X is
p_{Y|X}(y|x) = P(Y = y | X = x) = p_{X,Y}(x, y) / p_X(x), as long as p_X(x) > 0
It is a valid pmf parametrized by x
Chain rule for discrete random variables:
p_{X,Y}(x, y) = p_X(x) p_{Y|X}(y|x)
10 Conditional probability mass function
The conditional pmf of a random subvector X_I, I ⊆ {1, 2, ..., n}, given another subvector X_J is
p_{X_I|X_J}(x_I|x_J) := p_X(x) / p_{X_J}(x_J)
Chain rule for discrete random vectors:
p_X(x) = p_{X_1}(x_1) p_{X_2|X_1}(x_2|x_1) ... p_{X_n|X_1,...,X_{n−1}}(x_n|x_1, ..., x_{n−1})
= Π_{i=1}^n p_{X_i|X_{{1,...,i−1}}}(x_i|x_{{1,...,i−1}})
Any order works!
11 Example: Flights and rain (continued)
Probabilistic model for late arrivals at an airport:
P(late, no rain) = 2/20, P(on time, no rain) = 14/20,
P(late, rain) = 3/20, P(on time, rain) = 1/20
L = 1 if the plane is late, 0 otherwise
R = 1 if it rains, 0 otherwise
12 Example: Flights and rain (continued)
Joint pmf, marginals, and conditionals (filled in from the probabilities above):
p_{L,R}(0, 0) = 14/20, p_{L,R}(0, 1) = 1/20, p_{L,R}(1, 0) = 2/20, p_{L,R}(1, 1) = 3/20
p_L(0) = 15/20, p_L(1) = 5/20; p_R(0) = 16/20, p_R(1) = 4/20
p_{L|R}(·|0) = (14/16, 2/16), p_{L|R}(·|1) = (1/4, 3/4)
p_{R|L}(·|0) = (14/15, 1/15), p_{R|L}(·|1) = (2/5, 3/5)
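The marginals and conditional pmfs above can be checked with a short script. This is a sketch; the dictionary encoding of the joint pmf is our own choice, not part of the slides.

```python
from fractions import Fraction

# Joint pmf of L (late) and R (rain) from the flights-and-rain example
p_LR = {(0, 0): Fraction(14, 20), (0, 1): Fraction(1, 20),
        (1, 0): Fraction(2, 20),  (1, 1): Fraction(3, 20)}

p_L = {l: p_LR[(l, 0)] + p_LR[(l, 1)] for l in (0, 1)}  # marginalize over R
p_R = {r: p_LR[(0, r)] + p_LR[(1, r)] for r in (0, 1)}  # marginalize over L

# Conditional pmf p_{L|R}(l|r) = p_{L,R}(l, r) / p_R(r)
p_L_given_R = {(l, r): p_LR[(l, r)] / p_R[r] for l in (0, 1) for r in (0, 1)}

print(p_L_given_R[(1, 1)])  # probability of being late given rain: 3/4
```

Using exact fractions avoids floating-point round-off in the sanity checks.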
13 Independence of discrete random variables
X and Y are independent if and only if
p_{X,Y}(x, y) = p_X(x) p_Y(y) for all x ∈ R_X, y ∈ R_Y
Equivalently,
p_{X|Y}(x|y) = p_X(x), p_{Y|X}(y|x) = p_Y(y) for all x ∈ R_X, y ∈ R_Y
14 Mutually independent random variables
The n entries X_1, X_2, ..., X_n of a random vector X are mutually independent if and only if
p_X(x) = Π_{i=1}^n p_{X_i}(x_i)
15 Conditionally mutually independent random variables
The entries of a subvector X_I, I ⊆ {1, 2, ..., n}, are conditionally mutually independent given another subvector X_J, J ⊆ {1, 2, ..., n}, if and only if
p_{X_I|X_J}(x_I|x_J) = Π_{i ∈ I} p_{X_i|X_J}(x_i|x_J)
16 Pairwise independence
X_1 and X_2 are outcomes of independent unbiased coin flips
X_3 = 1 if X_1 = X_2, 0 if X_1 ≠ X_2
Are X_1, X_2 and X_3 independent?
17-19 Pairwise independence
X_1 and X_2 are independent by assumption
The pmf of X_3 is
p_{X_3}(1) = p_{X_1,X_2}(1, 1) + p_{X_1,X_2}(0, 0) = 1/2
p_{X_3}(0) = p_{X_1,X_2}(0, 1) + p_{X_1,X_2}(1, 0) = 1/2
20-27 Pairwise independence
Are X_1 and X_3 independent?
p_{X_1,X_3}(0, 0) = p_{X_1,X_2}(0, 1) = 1/4 = p_{X_1}(0) p_{X_3}(0)
p_{X_1,X_3}(1, 0) = p_{X_1,X_2}(1, 0) = 1/4 = p_{X_1}(1) p_{X_3}(0)
p_{X_1,X_3}(0, 1) = p_{X_1,X_2}(0, 0) = 1/4 = p_{X_1}(0) p_{X_3}(1)
p_{X_1,X_3}(1, 1) = p_{X_1,X_2}(1, 1) = 1/4 = p_{X_1}(1) p_{X_3}(1)
Yes
28-31 Pairwise independence
X_1, X_2 and X_3 are pairwise independent
Are X_1, X_2 and X_3 mutually independent?
p_{X_1,X_2,X_3}(1, 1, 1) = P(X_1 = 1, X_2 = 1) = 1/4
p_{X_1}(1) p_{X_2}(1) p_{X_3}(1) = 1/8
No!
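The coin-flip construction above can be verified exhaustively by enumerating the joint pmf. A sketch in stdlib Python (the helper names are our own):

```python
from fractions import Fraction
from itertools import product

# X1, X2: independent fair coin flips; X3 = 1 if X1 == X2 else 0.
# Enumerate the joint pmf of (X1, X2, X3); impossible outcomes get mass 0.
half = Fraction(1, 2)
p = {}
for x1, x2 in product((0, 1), repeat=2):
    x3 = 1 if x1 == x2 else 0
    p[(x1, x2, x3)] = half * half
p_full = {k: p.get(k, Fraction(0)) for k in product((0, 1), repeat=3)}

def marg(idx):
    """Marginal pmf of the variable at position idx."""
    out = {0: Fraction(0), 1: Fraction(0)}
    for k, v in p_full.items():
        out[k[idx]] += v
    return out

def marg2(i, j):
    """Marginal pmf of the pair of variables at positions (i, j)."""
    out = {}
    for k, v in p_full.items():
        key = (k[i], k[j])
        out[key] = out.get(key, Fraction(0)) + v
    return out

m = [marg(i) for i in range(3)]
# Pairwise independence holds for every pair...
pairwise = all(marg2(i, j)[(a, b)] == m[i][a] * m[j][b]
               for i, j in [(0, 1), (0, 2), (1, 2)]
               for a, b in product((0, 1), repeat=2))
# ...but mutual independence fails: p(1,1,1) = 1/4, product of marginals = 1/8
mutual = p_full[(1, 1, 1)] == m[0][1] * m[1][1] * m[2][1]
print(pairwise, mutual)  # True False
```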
32 Discrete random variables · Continuous random variables · Joint distributions of discrete and continuous random variables
33 Continuous random variables
We consider events that are unions of Cartesian products of intervals (Borel sets)
The joint cumulative distribution function (cdf) of X and Y is
F_{X,Y}(x, y) := P(X ≤ x, Y ≤ y)
In words, the probability of X and Y being smaller than x and y respectively
The joint cdf of a random vector X is
F_X(x) := P(X_1 ≤ x_1, X_2 ≤ x_2, ..., X_n ≤ x_n)
34 Joint cumulative distribution function
Every joint cdf satisfies
lim_{x → −∞} F_{X,Y}(x, y) = 0, lim_{y → −∞} F_{X,Y}(x, y) = 0, lim_{x,y → ∞} F_{X,Y}(x, y) = 1
F_{X,Y}(x_1, y_1) ≤ F_{X,Y}(x_2, y_2) if x_2 ≥ x_1, y_2 ≥ y_1 (nondecreasing)
35 Joint cumulative distribution function
For any two-dimensional interval, by inclusion-exclusion,
P(x_1 < X ≤ x_2, y_1 < Y ≤ y_2)
= P(X ≤ x_2, Y ≤ y_2) − P(X ≤ x_1, Y ≤ y_2) − P(X ≤ x_2, Y ≤ y_1) + P(X ≤ x_1, Y ≤ y_1)
= F_{X,Y}(x_2, y_2) − F_{X,Y}(x_1, y_2) − F_{X,Y}(x_2, y_1) + F_{X,Y}(x_1, y_1)
The joint cdf completely characterizes the distribution of the random variables / random vector
36 Joint probability density function
If the joint cdf is differentiable,
f_{X,Y}(x, y) := ∂²F_{X,Y}(x, y) / ∂x ∂y
f_X(x) := ∂ⁿF_X(x) / ∂x_1 ∂x_2 ... ∂x_n
37 Joint probability density function
The probability of (X, Y) ∈ (x, x + Δ_x) × (y, y + Δ_y) for small Δ_x, Δ_y ≥ 0 is approximately f_{X,Y}(x, y) Δ_x Δ_y
It is a density, not a probability measure!
From the monotonicity of the joint cdf,
f_{X,Y}(x, y) ≥ 0, f_X(x) ≥ 0
38 Joint probability density function
For any Borel set S ⊆ R²,
P((X, Y) ∈ S) = ∫∫_S f_{X,Y}(x, y) dx dy
In particular,
∫_{x=−∞}^{∞} ∫_{y=−∞}^{∞} f_{X,Y}(x, y) dx dy = 1
39 Joint probability density function
For any Borel set S ⊆ Rⁿ,
P(X ∈ S) = ∫_S f_X(x) dx
In particular,
∫_{Rⁿ} f_X(x) dx = 1
40 Example: Triangle lake
[Figure: diagram of a triangular lake with labeled points A-F]
41 Example: Triangle lake
F_X(x) =
0, if x_1 < 0 or x_2 < 0
2x_1x_2, if x_1 ≥ 0, x_2 ≥ 0, x_1 + x_2 ≤ 1
2x_1 + 2x_2 − x_1² − x_2² − 1, if x_1 ≤ 1, x_2 ≤ 1, x_1 + x_2 ≥ 1
2x_2 − x_2², if x_1 ≥ 1, 0 ≤ x_2 ≤ 1
2x_1 − x_1², if 0 ≤ x_1 ≤ 1, x_2 ≥ 1
1, if x_1 ≥ 1, x_2 ≥ 1
42 Marginalization
We can compute the marginal cdf from the joint cdf or from the joint pdf:
F_X(x) = P(X ≤ x) = lim_{y→∞} F_{X,Y}(x, y)
F_X(x) = P(X ≤ x) = ∫_{u=−∞}^{x} ∫_{y=−∞}^{∞} f_{X,Y}(u, y) dy du
Differentiating, we obtain
f_X(x) = ∫_{y=−∞}^{∞} f_{X,Y}(x, y) dy
43 Marginalization
Marginal pdf of a subvector X_I, I := {i_1, i_2, ..., i_m}:
f_{X_I}(x_I) = ∫_{x_{j_1}} ∫_{x_{j_2}} ... ∫_{x_{j_{n−m}}} f_X(x) dx_{j_1} dx_{j_2} ... dx_{j_{n−m}}
where {j_1, j_2, ..., j_{n−m}} := {1, 2, ..., n} \ I
44 Example: Triangle lake (continued)
Marginal cdf of X_1:
F_{X_1}(x_1) = lim_{x_2→∞} F_X(x) =
0, if x_1 < 0
2x_1 − x_1², if 0 ≤ x_1 ≤ 1
1, if x_1 ≥ 1
Marginal pdf of X_1:
f_{X_1}(x_1) = dF_{X_1}(x_1)/dx_1 = 2(1 − x_1) if 0 ≤ x_1 ≤ 1, 0 otherwise
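The marginalization integral above can be sketched numerically: integrate the joint pdf of the triangle-lake example over x_2 and compare with 2(1 − x_1). The midpoint-rule helper and sample points are our own choices, not from the slides.

```python
# Triangle lake: the joint pdf is f_X(x1, x2) = 2 on the triangle
# {x1 >= 0, x2 >= 0, x1 + x2 <= 1}. Marginalizing over x2 numerically
# should recover f_X1(x1) = 2 (1 - x1).

def f_joint(x1, x2):
    return 2.0 if x1 >= 0 and x2 >= 0 and x1 + x2 <= 1 else 0.0

def marginal_pdf(x1, n=100_000):
    """Midpoint-rule approximation of the integral of f_joint over x2 in [0, 1]."""
    h = 1.0 / n
    return sum(f_joint(x1, (k + 0.5) * h) for k in range(n)) * h

for x1 in (0.25, 0.5, 0.75):
    print(f"{x1}: numeric {marginal_pdf(x1):.4f}, exact {2 * (1 - x1):.4f}")
```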
45 Joint conditional cdf and pdf given an event
If we know that (X, Y) ∈ S for a Borel set S ⊆ R²,
F_{X,Y|(X,Y)∈S}(x, y) := P(X ≤ x, Y ≤ y | (X, Y) ∈ S)
= P(X ≤ x, Y ≤ y, (X, Y) ∈ S) / P((X, Y) ∈ S)
= ∫_{u≤x, v≤y, (u,v)∈S} f_{X,Y}(u, v) du dv / ∫_{(u,v)∈S} f_{X,Y}(u, v) du dv
f_{X,Y|(X,Y)∈S}(x, y) := ∂²F_{X,Y|(X,Y)∈S}(x, y) / ∂x ∂y
46 Conditional cdf and pdf Distribution of Y given X = x? The event has zero probability!
47 Conditional cdf and pdf
Distribution of Y given X = x? The event has zero probability! Define
f_{Y|X}(y|x) := f_{X,Y}(x, y) / f_X(x), if f_X(x) > 0
F_{Y|X}(y|x) := ∫_{u=−∞}^{y} f_{Y|X}(u|x) du
Chain rule for continuous random variables:
f_{X,Y}(x, y) = f_X(x) f_{Y|X}(y|x)
48 Conditional cdf and pdf
f_X(x) = lim_{Δ_x→0} P(x ≤ X ≤ x + Δ_x) / Δ_x
f_{X,Y}(x, y) = lim_{Δ_x→0} (1/Δ_x) ∂P(x ≤ X ≤ x + Δ_x, Y ≤ y)/∂y
49 Conditional cdf and pdf
F_{Y|X}(y|x) = lim_{Δ_x→0} P(Y ≤ y | x ≤ X ≤ x + Δ_x)
= lim_{Δ_x→0} P(x ≤ X ≤ x + Δ_x, Y ≤ y) / P(x ≤ X ≤ x + Δ_x)
= lim_{Δ_x→0} (1 / P(x ≤ X ≤ x + Δ_x)) ∫_{u=−∞}^{y} ∂P(x ≤ X ≤ x + Δ_x, Y ≤ u)/∂u du
= ∫_{u=−∞}^{y} lim_{Δ_x→0} (1 / P(x ≤ X ≤ x + Δ_x)) ∂P(x ≤ X ≤ x + Δ_x, Y ≤ u)/∂u du
= ∫_{u=−∞}^{y} f_{Y|X}(u|x) du
50 Conditional pdf of a random subvector
The conditional pdf of a random subvector X_I, I ⊆ {1, 2, ..., n}, given the subvector X_{{1,...,n}\I} is
f_{X_I | X_{{1,...,n}\I}}(x_I | x_{{1,...,n}\I}) := f_X(x) / f_{X_{{1,...,n}\I}}(x_{{1,...,n}\I})
Chain rule for continuous random vectors:
f_X(x) = f_{X_1}(x_1) f_{X_2|X_1}(x_2|x_1) ... f_{X_n|X_1,...,X_{n−1}}(x_n|x_1, ..., x_{n−1})
= Π_{i=1}^n f_{X_i | X_{{1,...,i−1}}}(x_i | x_{{1,...,i−1}})
Any order works!
51-55 Example: Triangle lake (continued)
Conditioned on {X_1 = 0.75}, what are the pdf and cdf of X_2?
f_{X_2|X_1}(x_2|x_1) = f_X(x) / f_{X_1}(x_1) = 1 / (1 − x_1), for 0 ≤ x_2 ≤ 1 − x_1
F_{X_2|X_1}(x_2|x_1) = ∫_{u=−∞}^{x_2} f_{X_2|X_1}(u|x_1) du = x_2 / (1 − x_1)
56 Example: Desert
Car traveling through the desert
Time until the car breaks down: T
State of the motor: M
State of the road: R
Model:
M uniform between 0 (no problem) and 1 (very bad)
R uniform between 0 (no problem) and 1 (very bad)
M and R independent
T exponential with parameter M + R
57-61 Example: Desert
Joint pdf?
f_{M,R,T}(m, r, t) = f_M(m) f_{R|M}(r|m) f_{T|M,R}(t|m, r)
= f_M(m) f_R(r) f_{T|M,R}(t|m, r) (by independence)
= (m + r) e^{−(m+r)t} for t ≥ 0, 0 ≤ m ≤ 1, 0 ≤ r ≤ 1, and 0 otherwise
62-63 Example: Desert
Car breaks down after 15 min (0.25 h): T = 0.25
Road seems OK: R = 0.2
What was the state of the motor M?
f_{M|R,T}(m|r, t) = f_{M,R,T}(m, r, t) / f_{R,T}(r, t)
64-68 Example: Desert
f_{R,T}(r, t) = ∫_{m=0}^{1} f_{M,R,T}(m, r, t) dm
= e^{−tr} ( ∫_{m=0}^{1} m e^{−tm} dm + r ∫_{m=0}^{1} e^{−tm} dm )
= e^{−tr} ( (1 − (1 + t) e^{−t}) / t² + r (1 − e^{−t}) / t )
= (e^{−tr} / t²) (1 + tr − e^{−t} (1 + t + tr)) for t ≥ 0, 0 ≤ r ≤ 1
69-70 Example: Desert
f_{M|R,T}(m|r, t) = f_{M,R,T}(m, r, t) / f_{R,T}(r, t)
= (m + r) e^{−(m+r)t} / ( (e^{−tr} / t²) (1 + tr − e^{−t}(1 + t + tr)) )
= (m + r) t² e^{−tm} / (1 + tr − e^{−t}(1 + t + tr))
f_{M|R,T}(m|0.2, 0.25) = (m + 0.2) e^{−0.25m} · 0.25² / (1.05 − 1.3 e^{−0.25})
≈ 1.66 (m + 0.2) e^{−0.25m} for 0 ≤ m ≤ 1
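The normalizing constant and the fact that the conditional pdf integrates to one can be sanity-checked numerically. A sketch, assuming the formula for f_{M|R,T} derived above; the midpoint-rule integration is our own choice.

```python
import math

# Check that f_{M|R,T}(m | 0.2, 0.25) = c (m + 0.2) e^{-0.25 m} on [0, 1]
# integrates to one, where c = t^2 / (1 + t r - e^{-t} (1 + t + t r)).
t, r = 0.25, 0.2
c = t ** 2 / (1 + t * r - math.exp(-t) * (1 + t + t * r))

def posterior(m):
    return c * (m + r) * math.exp(-t * m)

# midpoint-rule integration over m in [0, 1]
n = 10_000
h = 1.0 / n
total = sum(posterior((k + 0.5) * h) for k in range(n)) * h
print(round(c, 2), round(total, 4))  # c is about 1.66, total is about 1.0
```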
71 State of the car
[Figure: plot of f_{M|R,T}(m|0.2, 0.25) as a function of m]
72 Independent continuous random variables
Two random variables X and Y are independent if and only if
F_{X,Y}(x, y) = F_X(x) F_Y(y) for all (x, y) ∈ R²
Equivalently,
F_{X|Y}(x|y) = F_X(x), F_{Y|X}(y|x) = F_Y(y) for all (x, y) ∈ R²
73 Independent continuous random variables
Two random variables X and Y with joint pdf f_{X,Y} are independent if and only if
f_{X,Y}(x, y) = f_X(x) f_Y(y) for all (x, y) ∈ R²
Equivalently,
f_{X|Y}(x|y) = f_X(x), f_{Y|X}(y|x) = f_Y(y) for all (x, y) ∈ R²
74 Mutually independent continuous random variables
The entries of a random vector X are mutually independent if and only if
F_X(x) = Π_{i=1}^n F_{X_i}(x_i)
Equivalently,
f_X(x) = Π_{i=1}^n f_{X_i}(x_i)
75 Mutually conditionally independent random variables
The entries of a subvector X_I, I ⊆ {1, 2, ..., n}, are mutually conditionally independent given another subvector X_J, J ⊆ {1, 2, ..., n}, if and only if
F_{X_I|X_J}(x_I|x_J) = Π_{i ∈ I} F_{X_i|X_J}(x_i|x_J)
Equivalently,
f_{X_I|X_J}(x_I|x_J) = Π_{i ∈ I} f_{X_i|X_J}(x_i|x_J)
76 Functions of random variables
U = g(X, Y) and V = h(X, Y)
F_{U,V}(u, v) = P(U ≤ u, V ≤ v) = P(g(X, Y) ≤ u, h(X, Y) ≤ v)
= ∫_{{(x,y) : g(x,y) ≤ u, h(x,y) ≤ v}} f_{X,Y}(x, y) dx dy
77-82 Sum of independent random variables
X and Y are independent random variables; what is the pdf of Z = X + Y?
F_Z(z) = P(X + Y ≤ z)
= ∫_{y=−∞}^{∞} ∫_{x=−∞}^{z−y} f_X(x) f_Y(y) dx dy
= ∫_{y=−∞}^{∞} F_X(z − y) f_Y(y) dy
f_Z(z) = d/dz lim_{u→∞} ∫_{y=−u}^{u} F_X(z − y) f_Y(y) dy
= ∫_{y=−∞}^{∞} f_X(z − y) f_Y(y) dy
Convolution of the individual pdfs
83 Example: Coffee beans
Company buys coffee beans from two local producers
Beans from Colombia: C tons/year
Beans from Vietnam: V tons/year
Model:
C uniform between 0 and 1
V uniform between 0 and 2
C and V independent
What is the distribution of the total amount of beans B = C + V?
84-87 Example: Coffee beans
f_B(b) = ∫_{u=−∞}^{∞} f_C(b − u) f_V(u) du
= (1/2) ∫_{u=0}^{2} f_C(b − u) du
= (1/2) ∫_{u=0}^{b} du = b/2, if 0 ≤ b ≤ 1
= (1/2) ∫_{u=b−1}^{b} du = 1/2, if 1 ≤ b ≤ 2
= (1/2) ∫_{u=b−1}^{2} du = (3 − b)/2, if 2 ≤ b ≤ 3
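The piecewise formula for f_B can be checked by carrying out the convolution integral numerically. A stdlib-only sketch; the midpoint-rule helper is our own choice.

```python
# Numerically convolve the uniform pdfs of C ~ U[0, 1] and V ~ U[0, 2]
# and compare with the piecewise formula for f_B.

def f_C(x):
    return 1.0 if 0 <= x <= 1 else 0.0

def f_V(x):
    return 0.5 if 0 <= x <= 2 else 0.0

def f_B_numeric(b, n=20_000):
    """Midpoint-rule approximation of the convolution integral over u in [0, 2]."""
    h = 2.0 / n
    return sum(f_C(b - (k + 0.5) * h) * f_V((k + 0.5) * h) for k in range(n)) * h

def f_B_exact(b):
    if 0 <= b <= 1:
        return b / 2
    if 1 <= b <= 2:
        return 0.5
    if 2 <= b <= 3:
        return (3 - b) / 2
    return 0.0

for b in (0.5, 1.5, 2.5):
    print(b, round(f_B_numeric(b), 4), f_B_exact(b))
```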
88 Example: Coffee beans
[Figure: plots of f_C, f_V, and the trapezoidal pdf f_B]
89 Gaussian random vector
A Gaussian random vector X has a joint pdf of the form
f_X(x) = (1 / √((2π)ⁿ |Σ|)) exp( −(1/2) (x − µ)ᵀ Σ⁻¹ (x − µ) )
where the mean µ ∈ Rⁿ and the covariance matrix Σ is a symmetric positive definite matrix
90 Linear transformation of Gaussian random vectors
X is a Gaussian random vector of dimension n with mean µ and covariance matrix Σ
For any matrix A ∈ R^{m×n} and b ∈ R^m,
Y = AX + b
is Gaussian with mean Aµ + b and covariance matrix AΣAᵀ
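The mean and covariance propagation rule can be illustrated with plain-Python matrix arithmetic. The specific numbers for µ, Σ, A and b below are made up for illustration.

```python
# Propagate the mean and covariance of a 2-dimensional Gaussian through
# Y = A X + b: mean becomes A mu + b, covariance becomes A Sigma A^T.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

mu = [[1.0], [2.0]]               # mean as a column vector (hypothetical)
Sigma = [[2.0, 0.5], [0.5, 1.0]]  # symmetric positive definite (hypothetical)
A = [[1.0, 1.0], [0.0, 2.0]]
b = [[0.0], [1.0]]

Amu = matmul(A, mu)
mu_Y = [[Amu[i][0] + b[i][0]] for i in range(2)]   # A mu + b
Sigma_Y = matmul(matmul(A, Sigma), transpose(A))   # A Sigma A^T

print(mu_Y)     # [[3.0], [5.0]]
print(Sigma_Y)  # [[4.0, 3.0], [3.0, 4.0]]
```

Note that Sigma_Y comes out symmetric, as a covariance matrix must be.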
91 Marginal distributions are Gaussian
For a Gaussian random vector
Z := [X; Y], with mean µ := [µ_X; µ_Y]
and covariance matrix
Σ_Z = [Σ_X, Σ_{XY}; Σ_{XY}ᵀ, Σ_Y]
X is a Gaussian random vector with mean µ_X and covariance matrix Σ_X
92 Marginal distributions are Gaussian
[Figure: joint pdf f_{X,Y} with its marginal pdfs f_X and f_Y]
93 Discrete random variables · Continuous random variables · Joint distributions of discrete and continuous random variables
94 Discrete and continuous random variables
How do we model the relation between a continuous random variable C and a discrete random variable D?
Conditional cdf and pdf of C given D:
F_{C|D}(c|d) := P(C ≤ c | D = d)
f_{C|D}(c|d) := dF_{C|D}(c|d)/dc
By the Law of Total Probability,
F_C(c) = Σ_{d ∈ R_D} p_D(d) F_{C|D}(c|d)
f_C(c) = Σ_{d ∈ R_D} p_D(d) f_{C|D}(c|d)
95 Mixture models
Data are drawn from a continuous distribution whose parameters are chosen from a discrete set
Important example: Gaussian mixture models
96 Grizzlies in Yellowstone
Model for the weight of grizzly bears in Yellowstone:
Males: Gaussian with µ := 240 kg and σ := 40 kg
Females: Gaussian with µ := 140 kg and σ := 20 kg
There are about the same number of females and males
97-100 Grizzlies in Yellowstone
The distribution of the weight of all bears W can be modeled as a Gaussian mixture with two random variables: S (sex) and W (weight)
f_W(w) = Σ_{s=0}^{1} p_S(s) f_{W|S}(w|s)
= (1/(2√(2π))) ( (1/40) e^{−(w−240)²/(2·40²)} + (1/20) e^{−(w−140)²/(2·20²)} )
101 Grizzlies in Yellowstone
[Figure: conditional pdfs f_{W|S}(·|0) and f_{W|S}(·|1) and the mixture pdf f_W]
102-104 Continuous and discrete random variables
Conditional pmf of D given C? The event {C = c} has zero probability
p_{D|C}(d|c) := lim_{Δ→0} P(D = d, c ≤ C ≤ c + Δ) / P(c ≤ C ≤ c + Δ)
By the Law of Total Probability and a limit argument,
p_D(d) = ∫_{c=−∞}^{∞} f_C(c) p_{D|C}(d|c) dc
105-106 Bayesian coin flip
Bayesian methods often endow parameters of discrete distributions with a continuous marginal distribution
You suspect a coin is biased
You are uncertain about the bias, so you model it as a random variable B with pdf
f_B(b) = 2b for b ∈ [0, 1]
What is the probability of heads?
107 Bayesian coin flip
[Figure: plot of the pdf f_B]
108-111 Bayesian coin flip
p_X(1) = ∫_{b=−∞}^{∞} f_B(b) p_{X|B}(1|b) db
= ∫_{b=0}^{1} 2b² db
= 2/3
112 Chain rule for continuous and discrete random variables
p_D(d) f_{C|D}(c|d) = p_D(d) lim_{Δ→0} P(c ≤ C ≤ c + Δ | D = d) / Δ
= lim_{Δ→0} P(D = d, c ≤ C ≤ c + Δ) / Δ
= lim_{Δ→0} (P(c ≤ C ≤ c + Δ) / Δ) · (P(D = d, c ≤ C ≤ c + Δ) / P(c ≤ C ≤ c + Δ))
= f_C(c) p_{D|C}(d|c)
113-116 Grizzlies in Yellowstone
You spot a grizzly that is about 180 kg
What is the probability that it is male?
p_{S|W}(0|180) = p_S(0) f_{W|S}(180|0) / f_W(180)
= (1/40) e^{−(180−240)²/3200} / ( (1/40) e^{−(180−240)²/3200} + (1/20) e^{−(180−140)²/800} )
= 0.545
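The posterior probability above can be reproduced in a few lines. A sketch; the helper function name is our own.

```python
import math

# Posterior probability that a 180 kg grizzly is male, under the mixture
# model: males N(240, 40^2), females N(140, 20^2), equal prior for each sex.

def gaussian_pdf(w, mu, sigma):
    return math.exp(-(w - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

w = 180
male = 0.5 * gaussian_pdf(w, 240, 40)
female = 0.5 * gaussian_pdf(w, 140, 20)
p_male = male / (male + female)  # Bayes' rule for mixed discrete/continuous
print(round(p_male, 3))  # 0.545
```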
117-120 Bayesian coin flip
The coin flip is tails. What is the distribution of the bias now?
f_{B|X}(b|0) = f_B(b) p_{X|B}(0|b) / p_X(0)
= 2b(1 − b) / (1/3)
= 6b(1 − b)
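Both the evidence p_X(0) = 1/3 and the normalization of the posterior 6b(1 − b) can be verified numerically. A sketch using midpoint-rule integration (our own choice of method and step size).

```python
# Bayesian coin flip check: prior f_B(b) = 2b on [0, 1], likelihood of
# tails (1 - b). The evidence p_X(0) should be 1/3 and the posterior
# f_{B|X}(b|0) = f_B(b)(1 - b)/p_X(0) should integrate to one.

def prior(b):
    return 2 * b if 0 <= b <= 1 else 0.0

n = 100_000
h = 1.0 / n
mid = [(k + 0.5) * h for k in range(n)]

p_tails = sum(prior(b) * (1 - b) for b in mid) * h               # about 1/3
posterior_mass = sum(prior(b) * (1 - b) / p_tails for b in mid) * h  # about 1
print(round(p_tails, 4), round(posterior_mass, 4))
```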
121 Bayesian coin flip
[Figure: prior pdf f_B and posterior pdf f_{B|X}(·|0)]
More informationStatistics: Learning models from data
DS-GA 1002 Lecture notes 5 October 19, 2015 Statistics: Learning models from data Learning models from data that are assumed to be generated probabilistically from a certain unknown distribution is a crucial
More informationMASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES
MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES Contents 1. Continuous random variables 2. Examples 3. Expected values 4. Joint distributions
More informationPreliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com
1 School of Oriental and African Studies September 2015 Department of Economics Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com Gujarati D. Basic Econometrics, Appendix
More informationRandom Variables and Their Distributions
Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital
More informationProbability, CLT, CLT counterexamples, Bayes. The PDF file of this lecture contains a full reference document on probability and random variables.
Lecture 5 A6523 Signal Modeling, Statistical Inference and Data Mining in Astrophysics Spring 2015 http://www.astro.cornell.edu/~cordes/a6523 Probability, CLT, CLT counterexamples, Bayes The PDF file of
More informationDISCRETE RANDOM VARIABLES: PMF s & CDF s [DEVORE 3.2]
DISCRETE RANDOM VARIABLES: PMF s & CDF s [DEVORE 3.2] PROBABILITY MASS FUNCTION (PMF) DEFINITION): Let X be a discrete random variable. Then, its pmf, denoted as p X(k), is defined as follows: p X(k) :=
More informationLearning from Sensor Data: Set II. Behnaam Aazhang J.S. Abercombie Professor Electrical and Computer Engineering Rice University
Learning from Sensor Data: Set II Behnaam Aazhang J.S. Abercombie Professor Electrical and Computer Engineering Rice University 1 6. Data Representation The approach for learning from data Probabilistic
More informationProblem Y is an exponential random variable with parameter λ = 0.2. Given the event A = {Y < 2},
ECE32 Spring 25 HW Solutions April 6, 25 Solutions to HW Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics where
More informationECON Fundamentals of Probability
ECON 351 - Fundamentals of Probability Maggie Jones 1 / 32 Random Variables A random variable is one that takes on numerical values, i.e. numerical summary of a random outcome e.g., prices, total GDP,
More informationUNIT Define joint distribution and joint probability density function for the two random variables X and Y.
UNIT 4 1. Define joint distribution and joint probability density function for the two random variables X and Y. Let and represent the probability distribution functions of two random variables X and Y
More information2 Functions of random variables
2 Functions of random variables A basic statistical model for sample data is a collection of random variables X 1,..., X n. The data are summarised in terms of certain sample statistics, calculated as
More informationExercises with solutions (Set D)
Exercises with solutions Set D. A fair die is rolled at the same time as a fair coin is tossed. Let A be the number on the upper surface of the die and let B describe the outcome of the coin toss, where
More informationStatistical Methods in Particle Physics
Statistical Methods in Particle Physics Lecture 3 October 29, 2012 Silvia Masciocchi, GSI Darmstadt s.masciocchi@gsi.de Winter Semester 2012 / 13 Outline Reminder: Probability density function Cumulative
More informationProbability Background
CS76 Spring 0 Advanced Machine Learning robability Background Lecturer: Xiaojin Zhu jerryzhu@cs.wisc.edu robability Meure A sample space Ω is the set of all possible outcomes. Elements ω Ω are called sample
More informationLecture 1: Basics of Probability
Lecture 1: Basics of Probability (Luise-Vitetta, Chapter 8) Why probability in data science? Data acquisition is noisy Sampling/quantization external factors: If you record your voice saying machine learning
More informationStat 366 A1 (Fall 2006) Midterm Solutions (October 23) page 1
Stat 366 A1 Fall 6) Midterm Solutions October 3) page 1 1. The opening prices per share Y 1 and Y measured in dollars) of two similar stocks are independent random variables, each with a density function
More informationReview (Probability & Linear Algebra)
Review (Probability & Linear Algebra) CE-725 : Statistical Pattern Recognition Sharif University of Technology Spring 2013 M. Soleymani Outline Axioms of probability theory Conditional probability, Joint
More informationReview (probability, linear algebra) CE-717 : Machine Learning Sharif University of Technology
Review (probability, linear algebra) CE-717 : Machine Learning Sharif University of Technology M. Soleymani Fall 2012 Some slides have been adopted from Prof. H.R. Rabiee s and also Prof. R. Gutierrez-Osuna
More informationLecture 1: August 28
36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random
More informationProbability Review. Gonzalo Mateos
Probability Review Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ September 11, 2018 Introduction
More informationRandom Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R
In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample
More informationChapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued
Chapter 3 sections Chapter 3 - continued 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions
More informationMore on Distribution Function
More on Distribution Function The distribution of a random variable X can be determined directly from its cumulative distribution function F X. Theorem: Let X be any random variable, with cumulative distribution
More informationECE Lecture #10 Overview
ECE 450 - Lecture #0 Overview Introduction to Random Vectors CDF, PDF Mean Vector, Covariance Matrix Jointly Gaussian RV s: vector form of pdf Introduction to Random (or Stochastic) Processes Definitions
More informationLecture 3: Random variables, distributions, and transformations
Lecture 3: Random variables, distributions, and transformations Definition 1.4.1. A random variable X is a function from S into a subset of R such that for any Borel set B R {X B} = {ω S : X(ω) B} is an
More informationLecture Notes 5 Convergence and Limit Theorems. Convergence with Probability 1. Convergence in Mean Square. Convergence in Probability, WLLN
Lecture Notes 5 Convergence and Limit Theorems Motivation Convergence with Probability Convergence in Mean Square Convergence in Probability, WLLN Convergence in Distribution, CLT EE 278: Convergence and
More informationIntroduction to Machine Learning
Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola Computer Science & Engineering State University of New York at Buffalo Buffalo, NY, USA chandola@buffalo.edu Chandola@UB
More informationPreliminary statistics
1 Preliminary statistics The solution of a geophysical inverse problem can be obtained by a combination of information from observed data, the theoretical relation between data and earth parameters (models),
More informationLecture Notes 3 Multiple Random Variables. Joint, Marginal, and Conditional pmfs. Bayes Rule and Independence for pmfs
Lecture Notes 3 Multiple Random Variables Joint, Marginal, and Conditional pmfs Bayes Rule and Independence for pmfs Joint, Marginal, and Conditional pdfs Bayes Rule and Independence for pdfs Functions
More informationProbability Review. Yutian Li. January 18, Stanford University. Yutian Li (Stanford University) Probability Review January 18, / 27
Probability Review Yutian Li Stanford University January 18, 2018 Yutian Li (Stanford University) Probability Review January 18, 2018 1 / 27 Outline 1 Elements of probability 2 Random variables 3 Multiple
More informationCS 361: Probability & Statistics
February 12, 2018 CS 361: Probability & Statistics Random Variables Monty hall problem Recall the setup, there are 3 doors, behind two of them are indistinguishable goats, behind one is a car. You pick
More informationconditional cdf, conditional pdf, total probability theorem?
6 Multiple Random Variables 6.0 INTRODUCTION scalar vs. random variable cdf, pdf transformation of a random variable conditional cdf, conditional pdf, total probability theorem expectation of a random
More informationChapter 4. Multivariate Distributions. Obviously, the marginal distributions may be obtained easily from the joint distribution:
4.1 Bivariate Distributions. Chapter 4. Multivariate Distributions For a pair r.v.s (X,Y ), the Joint CDF is defined as F X,Y (x, y ) = P (X x,y y ). Obviously, the marginal distributions may be obtained
More information2. Variance and Covariance: We will now derive some classic properties of variance and covariance. Assume real-valued random variables X and Y.
CS450 Final Review Problems Fall 08 Solutions or worked answers provided Problems -6 are based on the midterm review Identical problems are marked recap] Please consult previous recitations and textbook
More informationProbability theory. References:
Reasoning Under Uncertainty References: Probability theory Mathematical methods in artificial intelligence, Bender, Chapter 7. Expert systems: Principles and programming, g, Giarratano and Riley, pag.
More informationStatistics and Econometrics I
Statistics and Econometrics I Random Variables Shiu-Sheng Chen Department of Economics National Taiwan University October 5, 2016 Shiu-Sheng Chen (NTU Econ) Statistics and Econometrics I October 5, 2016
More informationThis exam is closed book and closed notes. (You will have access to a copy of the Table of Common Distributions given in the back of the text.
TEST #3 STA 5326 December 4, 214 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. (You will have access to
More informationIntroduction to Probability Theory
Introduction to Probability Theory Ping Yu Department of Economics University of Hong Kong Ping Yu (HKU) Probability 1 / 39 Foundations 1 Foundations 2 Random Variables 3 Expectation 4 Multivariate Random
More informationMAS223 Statistical Inference and Modelling Exercises
MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,
More informationMATH 3510: PROBABILITY AND STATS July 1, 2011 FINAL EXAM
MATH 3510: PROBABILITY AND STATS July 1, 2011 FINAL EXAM YOUR NAME: KEY: Answers in blue Show all your work. Answers out of the blue and without any supporting work may receive no credit even if they are
More informationwhere r n = dn+1 x(t)
Random Variables Overview Probability Random variables Transforms of pdfs Moments and cumulants Useful distributions Random vectors Linear transformations of random vectors The multivariate normal distribution
More informationL2: Review of probability and statistics
Probability L2: Review of probability and statistics Definition of probability Axioms and properties Conditional probability Bayes theorem Random variables Definition of a random variable Cumulative distribution
More informationP (x). all other X j =x j. If X is a continuous random vector (see p.172), then the marginal distributions of X i are: f(x)dx 1 dx n
JOINT DENSITIES - RANDOM VECTORS - REVIEW Joint densities describe probability distributions of a random vector X: an n-dimensional vector of random variables, ie, X = (X 1,, X n ), where all X is are
More informationReview of probability. Nuno Vasconcelos UCSD
Review of probability Nuno Vasconcelos UCSD robability probability is the language to deal with processes that are non-deterministic examples: if I flip a coin 00 times how many can I expect to see heads?
More informationPROBABILITY THEORY REVIEW
PROBABILITY THEORY REVIEW CMPUT 466/551 Martha White Fall, 2017 REMINDERS Assignment 1 is due on September 28 Thought questions 1 are due on September 21 Chapters 1-4, about 40 pages If you are printing,
More informationAppendix A : Introduction to Probability and stochastic processes
A-1 Mathematical methods in communication July 5th, 2009 Appendix A : Introduction to Probability and stochastic processes Lecturer: Haim Permuter Scribe: Shai Shapira and Uri Livnat The probability of
More informationStochastic processes Lecture 1: Multiple Random Variables Ch. 5
Stochastic processes Lecture : Multiple Random Variables Ch. 5 Dr. Ir. Richard C. Hendriks 26/04/8 Delft University of Technology Challenge the future Organization Plenary Lectures Book: R.D. Yates and
More informationMAT 271E Probability and Statistics
MAT 7E Probability and Statistics Spring 6 Instructor : Class Meets : Office Hours : Textbook : İlker Bayram EEB 3 ibayram@itu.edu.tr 3.3 6.3, Wednesday EEB 6.., Monday D. B. Bertsekas, J. N. Tsitsiklis,
More informationIntroduction to Probability and Statistics (Continued)
Introduction to Probability and Statistics (Continued) Prof. icholas Zabaras Center for Informatics and Computational Science https://cics.nd.edu/ University of otre Dame otre Dame, Indiana, USA Email:
More information1 Joint and marginal distributions
DECEMBER 7, 204 LECTURE 2 JOINT (BIVARIATE) DISTRIBUTIONS, MARGINAL DISTRIBUTIONS, INDEPENDENCE So far we have considered one random variable at a time. However, in economics we are typically interested
More informationECE 650 Lecture 4. Intro to Estimation Theory Random Vectors. ECE 650 D. Van Alphen 1
EE 650 Lecture 4 Intro to Estimation Theory Random Vectors EE 650 D. Van Alphen 1 Lecture Overview: Random Variables & Estimation Theory Functions of RV s (5.9) Introduction to Estimation Theory MMSE Estimation
More information