Expectation. DS GA 1002 Statistical and Mathematical Models. Carlos Fernandez-Granda


1 Expectation DS GA 1002 Statistical and Mathematical Models Carlos Fernandez-Granda

2 Aim Describe random variables with a few numbers: mean, variance, covariance

3 Expectation operator Mean and variance Covariance Conditional expectation

4 Discrete random variables. Average of the values of a function weighted by the pmf:
$$E(g(X)) = \sum_{x \in R} g(x)\, p_X(x)$$
$$E(g(X,Y)) = \sum_{x \in R_X} \sum_{y \in R_Y} g(x,y)\, p_{X,Y}(x,y)$$
$$E\big(g(\vec{X})\big) = \sum_{\vec{x}} g(\vec{x})\, p_{\vec{X}}(\vec{x}), \qquad \vec{x} = (x_1, x_2, \ldots, x_n)$$

5 Continuous random variables. Average of the values of a function weighted by the pdf:
$$E(g(X)) = \int_{x=-\infty}^{\infty} g(x)\, f_X(x)\, dx$$
$$E(g(X,Y)) = \int_{x=-\infty}^{\infty} \int_{y=-\infty}^{\infty} g(x,y)\, f_{X,Y}(x,y)\, dx\, dy$$
$$E\big(g(\vec{X})\big) = \int_{x_1=-\infty}^{\infty} \int_{x_2=-\infty}^{\infty} \cdots \int_{x_n=-\infty}^{\infty} g(\vec{x})\, f_{\vec{X}}(\vec{x})\, dx_1\, dx_2 \cdots dx_n$$

6 Discrete and continuous random variables
$$E(g(C,D)) = \sum_{d \in R_D} \int_{c=-\infty}^{\infty} g(c,d)\, f_C(c)\, p_{D|C}(d|c)\, dc = \sum_{d \in R_D} \int_{c=-\infty}^{\infty} g(c,d)\, p_D(d)\, f_{C|D}(c|d)\, dc$$

7 St Petersburg paradox. A casino offers you a game: flip an unbiased coin until it lands on heads. You get $2^k$ dollars, where k = number of flips. Expected gain?

8 St Petersburg paradox
$$E(\text{Gain}) = \sum_{k=1}^{\infty} 2^k \cdot \frac{1}{2^k}$$

9 St Petersburg paradox
$$E(\text{Gain}) = \sum_{k=1}^{\infty} 2^k \cdot \frac{1}{2^k} = \infty$$
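Not part of the original slides: a minimal Monte Carlo sketch (assuming NumPy) illustrating why the expected gain diverges. The running sample mean of simulated payoffs keeps drifting upward as more games are played.

```python
import numpy as np

rng = np.random.default_rng(0)

def play_st_petersburg(n_games):
    """Simulate payoffs of 2**k dollars, where k is the number of flips until heads."""
    # Number of flips until the first heads is geometric with p = 1/2.
    k = rng.geometric(0.5, size=n_games)
    return 2.0 ** k

for n in [10**3, 10**5, 10**7]:
    print(n, play_st_petersburg(n).mean())  # sample mean grows (slowly) without bound
```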

10 Linearity of expectation. For any constants a and b and any functions $g_1$ and $g_2$
$$E\big(a\, g_1(X,Y) + b\, g_2(X,Y)\big) = a\, E\big(g_1(X,Y)\big) + b\, E\big(g_2(X,Y)\big)$$
Follows from linearity of sums and integrals

11 Example: Coffee beans. Company buys coffee beans from two local producers.
Beans from Colombia: C tons/year. Beans from Vietnam: V tons/year.
Model: C uniform between 0 and 1; V uniform between 0 and 2; C and V independent.
What is the expected total amount of beans B?

12 Example: Coffee beans
$$E(C + V) = ?$$

13 Example: Coffee beans
$$E(C + V) = E(C) + E(V)$$

14 Example: Coffee beans
$$E(C + V) = E(C) + E(V) = 0.5 + 1 = 1.5 \text{ tons}$$

15 Example: Coffee beans
$$E(C + V) = E(C) + E(V) = 0.5 + 1 = 1.5 \text{ tons}$$
Holds even if C and V are not independent
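As an illustrative check (not in the slides), the sketch below simulates C and V and verifies that the sample mean of C + V is close to E(C) + E(V) = 1.5, assuming NumPy.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6
c = rng.uniform(0, 1, n)   # Colombian beans, uniform on [0, 1]
v = rng.uniform(0, 2, n)   # Vietnamese beans, uniform on [0, 2]

print(np.mean(c + v))           # close to 1.5
print(np.mean(c) + np.mean(v))  # same value, by linearity of the (sample) mean
```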

16 Independence. If X, Y are independent then
$$E\big(g(X)\, h(Y)\big) = E\big(g(X)\big)\, E\big(h(Y)\big)$$

17 Independence
$$E\big(g(X)\, h(Y)\big) = \int_{x=-\infty}^{\infty} \int_{y=-\infty}^{\infty} g(x)\, h(y)\, f_{X,Y}(x,y)\, dx\, dy$$

18 Independence
$$E\big(g(X)\, h(Y)\big) = \int_{x=-\infty}^{\infty} \int_{y=-\infty}^{\infty} g(x)\, h(y)\, f_{X,Y}(x,y)\, dx\, dy = \int_{x=-\infty}^{\infty} \int_{y=-\infty}^{\infty} g(x)\, h(y)\, f_X(x)\, f_Y(y)\, dx\, dy$$

19 Independence
$$E\big(g(X)\, h(Y)\big) = \int_{x=-\infty}^{\infty} \int_{y=-\infty}^{\infty} g(x)\, h(y)\, f_{X,Y}(x,y)\, dx\, dy = \int_{x=-\infty}^{\infty} \int_{y=-\infty}^{\infty} g(x)\, h(y)\, f_X(x)\, f_Y(y)\, dx\, dy = E\big(g(X)\big)\, E\big(h(Y)\big)$$

20 Expectation operator Mean and variance Covariance Conditional expectation

21 Mean. The mean or first moment of X is E(X). It's the center of mass of the distribution.

22 Bernoulli
$$E(X) = 0 \cdot p_X(0) + 1 \cdot p_X(1) = p$$

23 Binomial. A binomial is a sum of n Bernoulli random variables: $X = \sum_{i=1}^{n} B_i$

24 Binomial. A binomial is a sum of n Bernoulli random variables: $X = \sum_{i=1}^{n} B_i$
$$E(X) = E\left(\sum_{i=1}^{n} B_i\right)$$

25 Binomial. A binomial is a sum of n Bernoulli random variables: $X = \sum_{i=1}^{n} B_i$
$$E(X) = E\left(\sum_{i=1}^{n} B_i\right) = \sum_{i=1}^{n} E(B_i)$$

26 Binomial. A binomial is a sum of n Bernoulli random variables: $X = \sum_{i=1}^{n} B_i$
$$E(X) = E\left(\sum_{i=1}^{n} B_i\right) = \sum_{i=1}^{n} E(B_i) = np$$
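A quick simulation (not in the slides, assuming NumPy) that builds a binomial as a sum of Bernoulli variables and checks that the sample mean is close to np.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 20, 0.3
trials = 10**5

# Each row holds n Bernoulli(p) variables; summing each row gives a binomial sample.
bernoullis = rng.random((trials, n)) < p
x = bernoullis.sum(axis=1)

print(x.mean(), n * p)  # both close to 6
```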

27 Mean of important random variables

Random variable   Parameters   Mean
Bernoulli         p            p
Geometric         p            1/p
Binomial          n, p         np
Poisson           λ            λ
Uniform           a, b         (a + b)/2
Exponential       λ            1/λ
Gaussian          µ, σ         µ

28 Cauchy random variable [plot of the pdf]
$$f_X(x) = \frac{1}{\pi(1 + x^2)}$$

29 Cauchy random variable
$$E(X) = \int_{-\infty}^{\infty} \frac{x}{\pi(1 + x^2)}\, dx = \int_{0}^{\infty} \frac{x}{\pi(1 + x^2)}\, dx - \int_{0}^{\infty} \frac{x}{\pi(1 + x^2)}\, dx$$

30 Cauchy random variable
$$E(X) = \int_{-\infty}^{\infty} \frac{x}{\pi(1 + x^2)}\, dx = \int_{0}^{\infty} \frac{x}{\pi(1 + x^2)}\, dx - \int_{0}^{\infty} \frac{x}{\pi(1 + x^2)}\, dx$$
$$\int_{0}^{\infty} \frac{x}{\pi(1 + x^2)}\, dx = \int_{0}^{\infty} \frac{1}{2\pi(1 + t)}\, dt = \lim_{t \to \infty} \frac{\log(1 + t)}{2\pi}$$

31 Cauchy random variable
$$E(X) = \int_{-\infty}^{\infty} \frac{x}{\pi(1 + x^2)}\, dx = \int_{0}^{\infty} \frac{x}{\pi(1 + x^2)}\, dx - \int_{0}^{\infty} \frac{x}{\pi(1 + x^2)}\, dx$$
$$\int_{0}^{\infty} \frac{x}{\pi(1 + x^2)}\, dx = \int_{0}^{\infty} \frac{1}{2\pi(1 + t)}\, dt = \lim_{t \to \infty} \frac{\log(1 + t)}{2\pi} = \infty$$
so the mean of the Cauchy random variable is not well defined.
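Because the mean is undefined, the running sample mean of Cauchy samples never settles down. A small sketch (assuming NumPy) showing the erratic behavior:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_cauchy(10**6)

# Running mean: occasional huge samples keep knocking it away from any fixed value.
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)
print(running_mean[[10**2 - 1, 10**4 - 1, 10**6 - 1]])  # does not converge
```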

32 Mean of a random vector. Vector formed by the means of its components:
$$E(\vec{X}) := \begin{pmatrix} E(X_1) \\ E(X_2) \\ \vdots \\ E(X_n) \end{pmatrix}$$
By linearity of expectation, for any matrix $A \in \mathbb{R}^{m \times n}$ and $\vec{b} \in \mathbb{R}^m$
$$E\big(A\vec{X} + \vec{b}\big) = A\, E(\vec{X}) + \vec{b}$$

33 The mean as a typical value. The mean is a typical value of the random variable. The probability that X equals E(X) can be zero. The mean can be severely distorted by a subset of extreme values.

34 Density with subset of extreme values [plot of the pdf]. Uniform random variable X with support [−4.5, 4.5] ∪ [99.5, 100.5]

35 Density with subset of extreme values
$$E(X) = \int_{x=-4.5}^{4.5} x\, f_X(x)\, dx + \int_{x=99.5}^{100.5} x\, f_X(x)\, dx = 0 + 10 = 10$$

36 Density with subset of extreme values [plot of the pdf]

37 Median. Midpoint of the distribution: number m such that
$$P(X \le m) \ge \frac{1}{2} \quad \text{and} \quad P(X \ge m) \ge \frac{1}{2}$$
For continuous random variables
$$F_X(m) = \int_{-\infty}^{m} f_X(x)\, dx = \frac{1}{2}$$

38 Density with subset of extreme values
$$F_X(m) = \int_{-4.5}^{m} f_X(x)\, dx$$

39 Density with subset of extreme values
$$F_X(m) = \int_{-4.5}^{m} f_X(x)\, dx = \frac{1}{2} \implies m = 0.5$$

40 Density with subset of extreme values [plot of the pdf with the mean and the median marked]
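To see the robustness of the median, the hypothetical sketch below (assuming NumPy) samples from the uniform distribution on [−4.5, 4.5] ∪ [99.5, 100.5] and compares the sample mean (near 10) with the sample median (near 0.5).

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10**6

# With probability 9/10 draw from [-4.5, 4.5], with probability 1/10 from [99.5, 100.5],
# which matches a density equal to 0.1 on the union of the two intervals.
extreme = rng.random(n) < 0.1
x = np.where(extreme, rng.uniform(99.5, 100.5, n), rng.uniform(-4.5, 4.5, n))

print(np.mean(x))    # about 10
print(np.median(x))  # about 0.5
```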

41 Variance. The mean square or second moment of X is $E(X^2)$. The variance of X is
$$\mathrm{Var}(X) := E\big((X - E(X))^2\big) = E\big(X^2 - 2X\, E(X) + E^2(X)\big) = E(X^2) - E^2(X)$$
The standard deviation of X is $\sigma_X := \sqrt{\mathrm{Var}(X)}$

42 Bernoulli
$$E(X^2) = 0 \cdot p_X(0) + 1 \cdot p_X(1) = p$$
$$\mathrm{Var}(X) = E(X^2) - E^2(X) = p - p^2 = p(1 - p)$$

43 Variance of common random variables

Random variable   Parameters   Variance
Bernoulli         p            p(1 − p)
Geometric         p            (1 − p)/p²
Binomial          n, p         np(1 − p)
Poisson           λ            λ
Uniform           a, b         (b − a)²/12
Exponential       λ            1/λ²
Gaussian          µ, σ         σ²

44 Geometric (p = 0.2) [plot of the pmf]

45 Binomial (n = 20, p = 0.5) [plot of the pmf]

46 Poisson (λ = 25) [plot of the pmf]

47 Uniform [0, 1] [plot of the pdf]

48 Exponential (λ = 1) [plot of the pdf]

49 Gaussian (µ = 0, σ = 1) [plot of the pdf]

50 Variance. The variance operator is not linear, but
$$\mathrm{Var}(aX + b) = E\big((aX + b - E(aX + b))^2\big) = E\big((aX + b - a\,E(X) - b)^2\big) = a^2\, E\big((X - E(X))^2\big) = a^2\, \mathrm{Var}(X)$$

51 Bounding probabilities using expectations Aim: Characterize behavior of X to some extent using E (X ) and Var (X )

52 Markov's inequality. For any nonnegative random variable X and any a > 0
$$P(X \ge a) \le \frac{E(X)}{a}$$

53 Markov's inequality. Consider the indicator variable $1_{X \ge a}$:
$$X - a\, 1_{X \ge a} \ge 0$$

54 Markov's inequality. Consider the indicator variable $1_{X \ge a}$:
$$X - a\, 1_{X \ge a} \ge 0$$
$$E(X) \ge a\, E(1_{X \ge a})$$

55 Markov's inequality. Consider the indicator variable $1_{X \ge a}$:
$$X - a\, 1_{X \ge a} \ge 0$$
$$E(X) \ge a\, E(1_{X \ge a}) = a\, P(X \ge a)$$

56 Age of students at NYU Mean: 20 years How many are younger than 30?

57 Age of students at NYU. Mean: 20 years. How many are younger than 30?
$$P(A \ge 30) \le \frac{E(A)}{30}$$

58 Age of students at NYU. Mean: 20 years. How many are younger than 30? At least 1/3, because
$$P(A \ge 30) \le \frac{E(A)}{30} = \frac{2}{3}$$
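Markov's inequality only uses the mean, so the bound can be loose. A sketch (assuming NumPy, with an exponential distribution as a purely hypothetical stand-in for the age distribution) comparing the bound E(A)/30 with an empirical tail probability:

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-in model (not from the slides): ages exponential with mean 20 years.
ages = rng.exponential(20, 10**6)

empirical = np.mean(ages >= 30)
markov_bound = 20 / 30
print(empirical, markov_bound)  # empirical tail is well below the Markov bound of 2/3
```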

59 Chebyshev's inequality. For any positive constant a > 0,
$$P\big(|X - E(X)| \ge a\big) \le \frac{\mathrm{Var}(X)}{a^2}$$

60 Chebyshev's inequality. For any positive constant a > 0,
$$P\big(|X - E(X)| \ge a\big) \le \frac{\mathrm{Var}(X)}{a^2}$$
Corollary: If Var(X) = 0 then $P(X \ne E(X)) = 0$

61 Chebyshev's inequality. For any positive constant a > 0,
$$P\big(|X - E(X)| \ge a\big) \le \frac{\mathrm{Var}(X)}{a^2}$$
Corollary: If Var(X) = 0 then $P(X \ne E(X)) = 0$, since for any ε > 0
$$P\big(|X - E(X)| \ge \varepsilon\big) \le \frac{\mathrm{Var}(X)}{\varepsilon^2} = 0$$

62 Chebyshev's inequality. Define $Y := (X - E(X))^2$. By Markov's inequality
$$P\big(|X - E(X)| \ge a\big) = P\big(Y \ge a^2\big)$$

63 Chebyshev's inequality. Define $Y := (X - E(X))^2$. By Markov's inequality
$$P\big(|X - E(X)| \ge a\big) = P\big(Y \ge a^2\big) \le \frac{E(Y)}{a^2}$$

64 Chebyshev's inequality. Define $Y := (X - E(X))^2$. By Markov's inequality
$$P\big(|X - E(X)| \ge a\big) = P\big(Y \ge a^2\big) \le \frac{E(Y)}{a^2} = \frac{\mathrm{Var}(X)}{a^2}$$

65 Age of students at NYU Mean: 20 years, standard deviation: 3 years How many are younger than 30?

66 Age of students at NYU. Mean: 20 years, standard deviation: 3 years. How many are younger than 30?
$$P(A \ge 30) \le P\big(|A - 20| \ge 10\big)$$

67 Age of students at NYU. Mean: 20 years, standard deviation: 3 years. How many are younger than 30? At least 91%, because
$$P(A \ge 30) \le P\big(|A - 20| \ge 10\big) \le \frac{\mathrm{Var}(A)}{100} = \frac{9}{100}$$
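The same kind of check for Chebyshev's inequality (a sketch, assuming NumPy and a Gaussian stand-in for the age distribution, which is not specified in the slides): the bound Var(A)/100 = 0.09 is compared with an empirical probability.

```python
import numpy as np

rng = np.random.default_rng(6)

# Stand-in model (not from the slides): ages Gaussian with mean 20 and std 3.
ages = rng.normal(20, 3, 10**6)

empirical = np.mean(np.abs(ages - 20) >= 10)
chebyshev_bound = 3**2 / 10**2
print(empirical, chebyshev_bound)  # empirical probability is far below the 0.09 bound
```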

68 Expectation operator Mean and variance Covariance Conditional expectation

69 Covariance. The covariance of X and Y is
$$\mathrm{Cov}(X, Y) := E\big((X - E(X))(Y - E(Y))\big) = E\big(XY - Y\,E(X) - X\,E(Y) + E(X)\,E(Y)\big) = E(XY) - E(X)\,E(Y)$$
If Cov(X, Y) = 0, X and Y are uncorrelated

70 Covariance [scatter plots illustrating different values of Cov(X, Y)]

71 Variance of the sum
$$\mathrm{Var}(X + Y) = E\big((X + Y - E(X + Y))^2\big) = E\big((X - E(X))^2\big) + E\big((Y - E(Y))^2\big) + 2\,E\big((X - E(X))(Y - E(Y))\big) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y)$$

72 Variance of the sum
$$\mathrm{Var}(X + Y) = E\big((X + Y - E(X + Y))^2\big) = E\big((X - E(X))^2\big) + E\big((Y - E(Y))^2\big) + 2\,E\big((X - E(X))(Y - E(Y))\big) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y)$$
If X and Y are uncorrelated, then $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$

73 Independence implies uncorrelation
$$\mathrm{Cov}(X, Y) = E(XY) - E(X)\,E(Y) = E(X)\,E(Y) - E(X)\,E(Y) = 0$$

74 Uncorrelation does not imply independence. X, Y are independent Bernoulli with parameter 1/2. Let U = X + Y and V = X − Y. Are U and V independent? Are they uncorrelated?

75 Uncorrelation does not imply independence
$$p_U(0) = ? \qquad p_V(0) = ? \qquad p_{U,V}(0,0) = ?$$

76 Uncorrelation does not imply independence
$$p_U(0) = P(X = 0, Y = 0) = \frac{1}{4} \qquad p_V(0) = ? \qquad p_{U,V}(0,0) = ?$$

77 Uncorrelation does not imply independence
$$p_U(0) = P(X = 0, Y = 0) = \frac{1}{4}$$
$$p_V(0) = P(X = 1, Y = 1) + P(X = 0, Y = 0) = \frac{1}{2}$$
$$p_{U,V}(0,0) = ?$$

78 Uncorrelation does not imply independence
$$p_U(0) = P(X = 0, Y = 0) = \frac{1}{4}$$
$$p_V(0) = P(X = 1, Y = 1) + P(X = 0, Y = 0) = \frac{1}{2}$$
$$p_{U,V}(0,0) = P(X = 0, Y = 0) = \frac{1}{4}$$

79 Uncorrelation does not imply independence
$$p_U(0) = P(X = 0, Y = 0) = \frac{1}{4}$$
$$p_V(0) = P(X = 1, Y = 1) + P(X = 0, Y = 0) = \frac{1}{2}$$
$$p_{U,V}(0,0) = P(X = 0, Y = 0) = \frac{1}{4}$$
$$p_U(0)\, p_V(0) = \frac{1}{8} \ne p_{U,V}(0,0)$$
so U and V are not independent.

80 Uncorrelation does not imply independence
$$\mathrm{Cov}(U, V) = E(UV) - E(U)\,E(V) = E\big((X + Y)(X - Y)\big) - E(X + Y)\,E(X - Y) = E(X^2) - E(Y^2) - E^2(X) + E^2(Y)$$

81 Uncorrelation does not imply independence
$$\mathrm{Cov}(U, V) = E(UV) - E(U)\,E(V) = E\big((X + Y)(X - Y)\big) - E(X + Y)\,E(X - Y) = E(X^2) - E(Y^2) - E^2(X) + E^2(Y) = 0$$
so U and V are uncorrelated, even though they are not independent.
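The U = X + Y, V = X − Y example can be checked by enumerating the four equally likely outcomes (a sketch, not in the slides, assuming NumPy): the covariance is zero while the joint pmf does not factor.

```python
import itertools
import numpy as np

# X, Y independent Bernoulli(1/2): four equally likely outcomes.
outcomes = list(itertools.product([0, 1], repeat=2))
u = np.array([x + y for x, y in outcomes])
v = np.array([x - y for x, y in outcomes])

cov_uv = np.mean(u * v) - np.mean(u) * np.mean(v)
print(cov_uv)  # 0: U and V are uncorrelated

p_uv_00 = np.mean((u == 0) & (v == 0))               # 1/4
p_u0_times_p_v0 = np.mean(u == 0) * np.mean(v == 0)  # 1/4 * 1/2 = 1/8
print(p_uv_00, p_u0_times_p_v0)  # 0.25 vs 0.125: not independent
```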

82 Correlation coefficient. Pearson correlation coefficient of X and Y:
$$\rho_{X,Y} := \frac{\mathrm{Cov}(X, Y)}{\sigma_X\, \sigma_Y}$$
Covariance between $X/\sigma_X$ and $Y/\sigma_Y$

83 Correlation coefficient [scatter plots]
σ_Y = 1, Cov(X, Y) = 0.9, ρ_{X,Y} = 0.9 | σ_Y = 3, Cov(X, Y) = 0.9, ρ_{X,Y} = 0.3 | σ_Y = 3, Cov(X, Y) = 2.7, ρ_{X,Y} = 0.9

84 Cauchy-Schwarz inequality. For any X and Y
$$|E(XY)| \le \sqrt{E(X^2)\, E(Y^2)}$$
and
$$E(XY) = \sqrt{E(X^2)\, E(Y^2)} \iff Y = \sqrt{\frac{E(Y^2)}{E(X^2)}}\, X$$
$$E(XY) = -\sqrt{E(X^2)\, E(Y^2)} \iff Y = -\sqrt{\frac{E(Y^2)}{E(X^2)}}\, X$$

85 Cauchy-Schwarz inequality. We have
$$|\mathrm{Cov}(X, Y)| \le \sigma_X\, \sigma_Y \quad \text{and equivalently} \quad |\rho_{X,Y}| \le 1$$
In addition, $|\rho_{X,Y}| = 1 \iff Y = cX + d$ where
$$c := \begin{cases} \frac{\sigma_Y}{\sigma_X} & \text{if } \rho_{X,Y} = 1, \\[4pt] -\frac{\sigma_Y}{\sigma_X} & \text{if } \rho_{X,Y} = -1, \end{cases} \qquad d := E(Y) - c\,E(X)$$

86 Covariance matrix of a random vector. The covariance matrix of $\vec{X}$ is defined as
$$\Sigma_{\vec{X}} = \begin{pmatrix} \mathrm{Var}(X_1) & \mathrm{Cov}(X_1, X_2) & \cdots & \mathrm{Cov}(X_1, X_n) \\ \mathrm{Cov}(X_2, X_1) & \mathrm{Var}(X_2) & \cdots & \mathrm{Cov}(X_2, X_n) \\ \vdots & \vdots & \ddots & \vdots \\ \mathrm{Cov}(X_n, X_1) & \mathrm{Cov}(X_n, X_2) & \cdots & \mathrm{Var}(X_n) \end{pmatrix} = E\big(\vec{X}\vec{X}^T\big) - E(\vec{X})\, E(\vec{X})^T$$

87 Covariance matrix after a linear transformation
$$\Sigma_{A\vec{X} + \vec{b}} = ?$$

88 Covariance matrix after a linear transformation
$$\Sigma_{A\vec{X} + \vec{b}} = E\Big(\big(A\vec{X} + \vec{b}\big)\big(A\vec{X} + \vec{b}\big)^T\Big) - E\big(A\vec{X} + \vec{b}\big)\, E\big(A\vec{X} + \vec{b}\big)^T$$

89 Covariance matrix after a linear transformation
$$\Sigma_{A\vec{X} + \vec{b}} = E\Big(\big(A\vec{X} + \vec{b}\big)\big(A\vec{X} + \vec{b}\big)^T\Big) - E\big(A\vec{X} + \vec{b}\big)\, E\big(A\vec{X} + \vec{b}\big)^T$$
$$= A\,E\big(\vec{X}\vec{X}^T\big)A^T + \vec{b}\,E(\vec{X})^T A^T + A\,E(\vec{X})\,\vec{b}^T + \vec{b}\vec{b}^T - A\,E(\vec{X})\,E(\vec{X})^T A^T - A\,E(\vec{X})\,\vec{b}^T - \vec{b}\,E(\vec{X})^T A^T - \vec{b}\vec{b}^T$$

90 Covariance matrix after a linear transformation
$$\Sigma_{A\vec{X} + \vec{b}} = E\Big(\big(A\vec{X} + \vec{b}\big)\big(A\vec{X} + \vec{b}\big)^T\Big) - E\big(A\vec{X} + \vec{b}\big)\, E\big(A\vec{X} + \vec{b}\big)^T$$
$$= A\,E\big(\vec{X}\vec{X}^T\big)A^T + \vec{b}\,E(\vec{X})^T A^T + A\,E(\vec{X})\,\vec{b}^T + \vec{b}\vec{b}^T - A\,E(\vec{X})\,E(\vec{X})^T A^T - A\,E(\vec{X})\,\vec{b}^T - \vec{b}\,E(\vec{X})^T A^T - \vec{b}\vec{b}^T$$
$$= A\Big(E\big(\vec{X}\vec{X}^T\big) - E(\vec{X})\,E(\vec{X})^T\Big)A^T$$

91 Covariance matrix after a linear transformation
$$\Sigma_{A\vec{X} + \vec{b}} = E\Big(\big(A\vec{X} + \vec{b}\big)\big(A\vec{X} + \vec{b}\big)^T\Big) - E\big(A\vec{X} + \vec{b}\big)\, E\big(A\vec{X} + \vec{b}\big)^T$$
$$= A\,E\big(\vec{X}\vec{X}^T\big)A^T + \vec{b}\,E(\vec{X})^T A^T + A\,E(\vec{X})\,\vec{b}^T + \vec{b}\vec{b}^T - A\,E(\vec{X})\,E(\vec{X})^T A^T - A\,E(\vec{X})\,\vec{b}^T - \vec{b}\,E(\vec{X})^T A^T - \vec{b}\vec{b}^T$$
$$= A\Big(E\big(\vec{X}\vec{X}^T\big) - E(\vec{X})\,E(\vec{X})^T\Big)A^T = A\,\Sigma_{\vec{X}}\,A^T$$

92 Variance in a fixed direction. For any unit vector $\vec{u}$
$$\mathrm{Var}\big(\vec{u}^T \vec{X}\big) = \vec{u}^T \Sigma_{\vec{X}}\, \vec{u}$$

93 Direction of maximum variance. To find the direction of maximum variance we must solve
$$\arg\max_{\|\vec{u}\|_2 = 1} \vec{u}^T \Sigma_{\vec{X}}\, \vec{u}$$

94 Linear algebra. Symmetric matrices have orthogonal eigenvectors:
$$\Sigma_{\vec{X}} = U \Lambda U^T = \begin{bmatrix} \vec{u}_1 & \vec{u}_2 & \cdots & \vec{u}_n \end{bmatrix} \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix} \begin{bmatrix} \vec{u}_1 & \vec{u}_2 & \cdots & \vec{u}_n \end{bmatrix}^T$$

95 Linear algebra
$$\lambda_1 = \max_{\|\vec{u}\|_2 = 1} \vec{u}^T A \vec{u}, \qquad \vec{u}_1 = \arg\max_{\|\vec{u}\|_2 = 1} \vec{u}^T A \vec{u}$$
$$\lambda_k = \max_{\|\vec{u}\|_2 = 1,\ \vec{u} \perp \vec{u}_1, \ldots, \vec{u}_{k-1}} \vec{u}^T A \vec{u}, \qquad \vec{u}_k = \arg\max_{\|\vec{u}\|_2 = 1,\ \vec{u} \perp \vec{u}_1, \ldots, \vec{u}_{k-1}} \vec{u}^T A \vec{u}$$

96 Direction of maximum variance [scatter plots with the principal directions]
λ₁ = 1.22, λ₂ = 0.71 | λ₁ = 1, λ₂ = 1 | λ₁ = 1.38, λ₂ = 0.32
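A sketch (not in the slides, assuming NumPy and an illustrative covariance matrix) of how the direction of maximum variance is obtained in practice: build a sample covariance matrix, take its eigendecomposition, and compare the largest eigenvalue with the sample variance along the corresponding eigenvector.

```python
import numpy as np

rng = np.random.default_rng(7)

# Correlated 2D Gaussian samples (illustrative covariance, not from the slides).
cov = np.array([[1.0, 0.8],
                [0.8, 1.0]])
x = rng.multivariate_normal(mean=[0, 0], cov=cov, size=10**5)

sigma = np.cov(x, rowvar=False)           # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(sigma)  # eigh returns eigenvalues in ascending order
u_max = eigvecs[:, -1]                    # eigenvector of the largest eigenvalue

print(eigvals[-1])        # largest eigenvalue
print(np.var(x @ u_max))  # variance in that direction, approximately equal
```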

97 Whitening. Let $\Sigma_{\vec{X}} = U \Lambda U^T$ be full rank. All the entries of $\sqrt{\Lambda}^{-1} U^T \vec{X}$, where
$$\sqrt{\Lambda}^{-1} := \begin{pmatrix} \frac{1}{\sqrt{\lambda_1}} & 0 & \cdots & 0 \\ 0 & \frac{1}{\sqrt{\lambda_2}} & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \frac{1}{\sqrt{\lambda_n}} \end{pmatrix},$$
are uncorrelated.

98 Whitening
$$\Sigma_{\sqrt{\Lambda}^{-1} U^T \vec{X}} = \sqrt{\Lambda}^{-1} U^T \Sigma_{\vec{X}}\, U \sqrt{\Lambda}^{-1}$$

99 Whitening
$$\Sigma_{\sqrt{\Lambda}^{-1} U^T \vec{X}} = \sqrt{\Lambda}^{-1} U^T \Sigma_{\vec{X}}\, U \sqrt{\Lambda}^{-1} = \sqrt{\Lambda}^{-1} U^T U \Lambda U^T U \sqrt{\Lambda}^{-1}$$

100 Whitening
$$\Sigma_{\sqrt{\Lambda}^{-1} U^T \vec{X}} = \sqrt{\Lambda}^{-1} U^T \Sigma_{\vec{X}}\, U \sqrt{\Lambda}^{-1} = \sqrt{\Lambda}^{-1} U^T U \Lambda U^T U \sqrt{\Lambda}^{-1} = \sqrt{\Lambda}^{-1} \Lambda \sqrt{\Lambda}^{-1} \quad \text{because } U^T U = I$$

101 Whitening
$$\Sigma_{\sqrt{\Lambda}^{-1} U^T \vec{X}} = \sqrt{\Lambda}^{-1} U^T \Sigma_{\vec{X}}\, U \sqrt{\Lambda}^{-1} = \sqrt{\Lambda}^{-1} U^T U \Lambda U^T U \sqrt{\Lambda}^{-1} = \sqrt{\Lambda}^{-1} \Lambda \sqrt{\Lambda}^{-1} = I \quad \text{because } U^T U = I$$

102 Whitening [scatter plots of $\vec{X}$, $U^T \vec{X}$ and $\sqrt{\Lambda}^{-1} U^T \vec{X}$]
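A minimal whitening sketch (assuming NumPy, with an illustrative covariance not taken from the slides): compute the eigendecomposition Σ = UΛUᵀ of the sample covariance and check that the covariance of √Λ⁻¹Uᵀ applied to the centered samples is close to the identity.

```python
import numpy as np

rng = np.random.default_rng(8)

cov = np.array([[2.0, 1.2],
                [1.2, 1.0]])
x = rng.multivariate_normal(mean=[0, 0], cov=cov, size=10**5)

sigma = np.cov(x, rowvar=False)
eigvals, u = np.linalg.eigh(sigma)

# Whitening transform: sqrt(Lambda)^{-1} U^T applied to each (centered) sample.
whitened = (x - x.mean(axis=0)) @ u / np.sqrt(eigvals)

print(np.cov(whitened, rowvar=False))  # approximately the identity matrix
```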

103 For Gaussian random variables uncorrelation implies mutual independence. Uncorrelation implies
$$\Sigma_{\vec{X}} = \begin{pmatrix} \sigma_1^2 & 0 & \cdots & 0 \\ 0 & \sigma_2^2 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma_n^2 \end{pmatrix}$$
which in turn implies
$$f_{\vec{X}}(\vec{x}) = \frac{1}{\sqrt{(2\pi)^n\, |\Sigma|}} \exp\left(-\frac{1}{2}(\vec{x} - \vec{\mu})^T \Sigma^{-1} (\vec{x} - \vec{\mu})\right) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}\, \sigma_i} \exp\left(-\frac{(x_i - \mu_i)^2}{2\sigma_i^2}\right) = \prod_{i=1}^{n} f_{X_i}(x_i)$$

104 Expectation operator Mean and variance Covariance Conditional expectation

105 Conditional expectation. Expectation of g(X, Y) given X = x:
$$E\big(g(X, Y) \mid X = x\big) = \int_{y=-\infty}^{\infty} g(x, y)\, f_{Y|X}(y|x)\, dy$$
Can be interpreted as a function $h(x) := E\big(g(X, Y) \mid X = x\big)$. The conditional expectation of g(X, Y) given X is
$$E\big(g(X, Y) \mid X\big) := h(X)$$
It's a random variable.

106 Iterated expectation. For any X and Y and any function $g : \mathbb{R}^2 \to \mathbb{R}$
$$E\big(g(X, Y)\big) = E\Big(E\big(g(X, Y) \mid X\big)\Big)$$

107 Iterated expectation
$$h(x) := E\big(g(X, Y) \mid X = x\big) = \int_{y=-\infty}^{\infty} g(x, y)\, f_{Y|X}(y|x)\, dy$$

108 Iterated expectation
$$h(x) := E\big(g(X, Y) \mid X = x\big) = \int_{y=-\infty}^{\infty} g(x, y)\, f_{Y|X}(y|x)\, dy$$
$$E\Big(E\big(g(X, Y) \mid X\big)\Big) = E\big(h(X)\big)$$

109 Iterated expectation
$$h(x) := E\big(g(X, Y) \mid X = x\big) = \int_{y=-\infty}^{\infty} g(x, y)\, f_{Y|X}(y|x)\, dy$$
$$E\Big(E\big(g(X, Y) \mid X\big)\Big) = E\big(h(X)\big) = \int_{x=-\infty}^{\infty} h(x)\, f_X(x)\, dx$$

110 Iterated expectation
$$h(x) := E\big(g(X, Y) \mid X = x\big) = \int_{y=-\infty}^{\infty} g(x, y)\, f_{Y|X}(y|x)\, dy$$
$$E\Big(E\big(g(X, Y) \mid X\big)\Big) = E\big(h(X)\big) = \int_{x=-\infty}^{\infty} h(x)\, f_X(x)\, dx = \int_{x=-\infty}^{\infty} \int_{y=-\infty}^{\infty} f_X(x)\, f_{Y|X}(y|x)\, g(x, y)\, dy\, dx$$

111 Iterated expectation
$$h(x) := E\big(g(X, Y) \mid X = x\big) = \int_{y=-\infty}^{\infty} g(x, y)\, f_{Y|X}(y|x)\, dy$$
$$E\Big(E\big(g(X, Y) \mid X\big)\Big) = E\big(h(X)\big) = \int_{x=-\infty}^{\infty} h(x)\, f_X(x)\, dx = \int_{x=-\infty}^{\infty} \int_{y=-\infty}^{\infty} f_X(x)\, f_{Y|X}(y|x)\, g(x, y)\, dy\, dx = E\big(g(X, Y)\big)$$

112 Example: Desert. Car traveling through the desert.
Time until the car breaks down: T. State of the motor: M. State of the road: R.
Model: M uniform between 0 (no problem) and 1 (very bad); R uniform between 0 (no problem) and 1 (very bad); M and R independent; T exponential with parameter M + R.

113 Example: Desert
$$E(T) = E\big(E(T \mid M, R)\big)$$

114 Example: Desert
$$E(T) = E\big(E(T \mid M, R)\big) = E\left(\frac{1}{M + R}\right)$$

115 Example: Desert
$$E(T) = E\big(E(T \mid M, R)\big) = E\left(\frac{1}{M + R}\right) = \int_{0}^{1} \int_{0}^{1} \frac{1}{m + r}\, dm\, dr$$

116 Example: Desert
$$E(T) = E\big(E(T \mid M, R)\big) = E\left(\frac{1}{M + R}\right) = \int_{0}^{1} \int_{0}^{1} \frac{1}{m + r}\, dm\, dr = \int_{0}^{1} \big(\log(r + 1) - \log(r)\big)\, dr$$

117 Example: Desert
$$E(T) = E\big(E(T \mid M, R)\big) = E\left(\frac{1}{M + R}\right) = \int_{0}^{1} \int_{0}^{1} \frac{1}{m + r}\, dm\, dr = \int_{0}^{1} \big(\log(r + 1) - \log(r)\big)\, dr = \log 4 \approx 1.39$$
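The iterated-expectation result E(T) = log 4 ≈ 1.39 can be checked by simulation (a sketch, assuming NumPy): draw M and R, then draw T exponentially with parameter M + R.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 10**6

m = rng.uniform(0, 1, n)            # state of the motor
r = rng.uniform(0, 1, n)            # state of the road
t = rng.exponential(1.0 / (m + r))  # exponential with rate m + r (scale = 1/rate)

print(t.mean(), np.log(4))  # both approximately 1.386
```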

118 Grizzlies in Yellowstone. Model for the weight of grizzly bears in Yellowstone:
Males: Gaussian with µ := 240 kg and σ := 40 kg
Females: Gaussian with µ := 140 kg and σ := 20 kg
There are about the same number of females and males

119 Grizzlies in Yellowstone
$$E(W) = E\big(E(W \mid S)\big)$$

120 Grizzlies in Yellowstone
$$E(W) = E\big(E(W \mid S)\big) = \frac{E(W \mid S = \text{male}) + E(W \mid S = \text{female})}{2}$$

121 Grizzlies in Yellowstone
$$E(W) = E\big(E(W \mid S)\big) = \frac{E(W \mid S = \text{male}) + E(W \mid S = \text{female})}{2} = \frac{240 + 140}{2} = 190 \text{ kg}$$

122 Bayesian coin flip. Bayesian methods often endow parameters of discrete distributions with a continuous marginal distribution. You suspect a coin is biased. You are uncertain about the bias, so you model it as a random variable B with pdf $f_B(b) = 2b$ for $b \in [0, 1]$. What is the expected value of the coin flip X?

123 Bayesian coin flip
$$E(X) = E\big(E(X \mid B)\big)$$

124 Bayesian coin flip
$$E(X) = E\big(E(X \mid B)\big) = E(B)$$

125 Bayesian coin flip
$$E(X) = E\big(E(X \mid B)\big) = E(B) = \int_{0}^{1} 2b^2\, db$$

126 Bayesian coin flip
$$E(X) = E\big(E(X \mid B)\big) = E(B) = \int_{0}^{1} 2b^2\, db = \frac{2}{3}$$
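A last sanity check by simulation (not in the slides, assuming NumPy): sample the bias B from the density f_B(b) = 2b via inverse-transform sampling (B = √U for U uniform), then flip the coin; the fraction of heads is close to 2/3.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 10**6

# f_B(b) = 2b on [0, 1] has CDF b^2, so B = sqrt(U) with U uniform (inverse transform).
b = np.sqrt(rng.random(n))
x = rng.random(n) < b    # coin flip with bias b

print(x.mean(), 2 / 3)   # both approximately 0.667
```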
