Review of Mathematical Concepts. Hongwei Zhang

1 Review of Mathematical Concepts Hongwei Zhang

2 Outline
- Limits of real number sequences
- A fixed-point theorem
- Probability and random processes
  - Probability model
  - Random variable
  - Expectations
  - Useful inequalities
  - Random process/sequence
  - Convergence concepts
  - Laws of large numbers & central limit theorem
  - Stationarity and ergodicity

4 Supremum and infimum?
- $\mathbb{R}$: the set of real numbers
- A subset $S$ of $\mathbb{R}$ is bounded from above if $\exists b \in \mathbb{R}$ s.t. $\forall s \in S$, $s \le b$
  - Then the least upper bound (l.u.b.) of $S$ is called the supremum (or simply sup) of $S$, denoted $\sup\{s : s \in S\}$
- Similarly, the greatest lower bound (g.l.b.) of a set $S \subseteq \mathbb{R}$ bounded from below is called the infimum (or simply inf) of $S$, denoted $\inf\{s : s \in S\}$

5 Supremum and infimum (contd.)
- Ex.: $S = (-1, 1]$: $\sup\{s : s \in S\} = 1$, $\inf\{s : s \in S\} = -1$
- $S = [-1, 1]$: $\sup\{s : s \in S\} = 1$, $\inf\{s : s \in S\} = -1$
- When $\sup\{s : s \in S\} \in S$, we also call it the maximum of $S$; in this case $\sup = \max$
- When $\inf\{s : s \in S\} \in S$, we also call it the minimum; in this case $\inf = \min$

6 Supremum and infimum (contd.)
- Proposition (why?)
  - If $S_1 \subseteq S_2$ are both bounded from below, then $\inf\{s : s \in S_1\} \ge \inf\{s : s \in S_2\}$
  - If $S_1 \subseteq S_2$ are both bounded from above, then $\sup\{s : s \in S_1\} \le \sup\{s : s \in S_2\}$

7 Limit & limit points
- A sequence of real numbers $x_n$, $n \ge 1$, is said to converge to the limit $x \in \mathbb{R}$ if $\forall \varepsilon > 0$, $\exists n_\varepsilon$ s.t. $\forall n > n_\varepsilon$, $|x_n - x| < \varepsilon$
  - Written as $\lim_{n \to \infty} x_n = x$
- A limit, if it exists, is unique (why?)
- If $x_k$, $k \ge 1$, viewed as a set, is bounded from above and is nondecreasing, then the limit of $x_k$, $k \ge 1$, exists and equals the sup of the set of numbers $x_k$, $k \ge 1$; if $x_k$, $k \ge 1$, viewed as a set, is bounded from below and is nonincreasing, then the limit exists and equals the inf of the set of numbers $x_k$, $k \ge 1$.

8 Limit & limit points (contd.)
- If $x_k$, $k \ge 1$, is bounded above and below, define the sequence $a_k = \inf\{x_n : n \ge k\}$ and the sequence $b_k = \sup\{x_n : n \ge k\}$
- Then $a_k$, $k \ge 1$, and $b_k$, $k \ge 1$, are both bounded; $a_k$, $k \ge 1$, is nondecreasing, and $b_k$, $k \ge 1$, is nonincreasing
- Accordingly, $\lim_{k\to\infty} a_k$ exists, and we call it the lim inf of the sequence $x_k$, $k \ge 1$, written $\liminf_{n\to\infty} x_n$; likewise $\lim_{k\to\infty} b_k$ exists, and we call it the lim sup of the sequence, written $\limsup_{n\to\infty} x_n$
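
As a quick illustration (not part of the original slides), the sketch below computes the tail-infimum and tail-supremum sequences $a_k$, $b_k$ over a finite prefix of $x_n = (-1)^n(1 + 1/n)$, whose lim inf is $-1$ and lim sup is $+1$; the sequence and the prefix length are arbitrary illustrative choices.

```python
# Sketch: approximate liminf/limsup of x_n = (-1)^n * (1 + 1/n)
# via tail infima/suprema over a finite prefix (an approximation,
# since the true liminf/limsup are limits over infinite tails).

def x(n: int) -> float:
    return (-1) ** n * (1 + 1 / n)

N = 1000
xs = [x(n) for n in range(1, N + 1)]

# a_k = inf{x_n : n >= k}, b_k = sup{x_n : n >= k} (within the prefix)
a = [min(xs[k:]) for k in range(N)]
b = [max(xs[k:]) for k in range(N)]

print(a[-10], b[-10])  # close to -1 and +1, the liminf and limsup
```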

9 Limit & limit points (contd.)
- $x \in \mathbb{R}$ is a limit point of the sequence $x_k$, $k \ge 1$, if $\forall \varepsilon > 0$ and $\forall n > 0$, $\exists m_{n,\varepsilon} > n$ s.t. $|x_{m_{n,\varepsilon}} - x| < \varepsilon$
  - i.e., for every $\varepsilon > 0$ (no matter how small), the sequence comes within $\varepsilon$ of $x$ infinitely often
- A sequence can have
  - No limit point: e.g., $x_n = n$
  - One limit point: any convergent sequence, e.g., $x_n = 1/n$
  - Several limit points: e.g., $x_n = (-1)^n$

10 Limit & limit points (contd.)
- If $x_n$, $n \ge 1$, is bounded above and below, there is at least one limit point
  - $\liminf_{n\to\infty} x_n$ and $\limsup_{n\to\infty} x_n$ are both limit points
- Other limit points may also exist; let $\mathcal{L}$ be the set of all limit points; then
  $\liminf_{n\to\infty} x_n = \inf\{x : x \in \mathcal{L}\}$ and $\limsup_{n\to\infty} x_n = \sup\{x : x \in \mathcal{L}\}$
- If $\liminf_{n\to\infty} x_n$ and $\limsup_{n\to\infty} x_n$ are equal, then there is only one limit point, which is the limit of the sequence

11 Outline
- Limits of real number sequences
- A fixed-point theorem
- Probability and random processes
  - Probability model
  - Random variable
  - Expectations
  - Useful inequalities
  - Random process/sequence
  - Convergence concepts
  - Laws of large numbers & central limit theorem
  - Stationarity and ergodicity

12 Fixed point
- A function $f$ mapping a set $C$ ($\subseteq \mathbb{R}^n$) into $C$ is said to have a fixed point at $x \in C$ if $f(x) = x$
- A set $C \subseteq \mathbb{R}^n$ is closed if every convergent sequence $x_n$, $n \ge 1$, of points in $C$ has its limit also in $C$
  - E.g.: $C = (0, 1]$ is not closed, because the limit of the sequence $1/n$, $n \ge 1$ (i.e., 0) is not in $C$
- A set $C \subseteq \mathbb{R}^n$ is bounded if there is an $n$-dimensional ball centered at the origin and of finite radius s.t. $C$ lies entirely inside the ball

13 Fixed point (contd.)
- A set $C \subseteq \mathbb{R}^n$ is convex if $\forall x_1, x_2 \in C$ and $\forall \lambda$, $0 < \lambda < 1$, $\lambda x_1 + (1-\lambda) x_2 \in C$
  - i.e., the entire line segment joining $x_1$ and $x_2$ is in $C$; thus $C = [0,1] \cup [2,3]$ is closed but nonconvex
- Brouwer's Fixed-Point Theorem: let $C \subseteq \mathbb{R}^n$ be a closed, bounded, and convex set. Then a continuous function $f : C \to C$ has a fixed point in $C$

14 Proof of Brouwer's Fixed-Point Theorem: for n = 1 and C = [0, 1]
- Suppose $f(0) \ne 0$ and $f(1) \ne 1$ (otherwise, we are done!)
- Define $g(x) = f(x) - x$; then $g(0) > 0$ and $g(1) < 0$
- Given that $g(x)$ is continuous (because $f$ is continuous), by the intermediate value theorem there must exist $x^*$ in $[0, 1]$ s.t. $g(x^*) = 0$, i.e., $f(x^*) = x^*$. DONE!
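
The one-dimensional proof is constructive enough to run. The sketch below (mine, not from the slides) bisects on $g(x) = f(x) - x$ to locate a fixed point; the map $f(x) = \cos(x)$ is an arbitrary illustrative choice that happens to map $[0,1]$ into $[0,1]$.

```python
import math

# Sketch: find a fixed point of a continuous f: [0,1] -> [0,1]
# by bisection on g(x) = f(x) - x, mirroring the n = 1 proof.

def f(x: float) -> float:
    return math.cos(x)         # illustrative choice of f

lo, hi = 0.0, 1.0              # g(lo) >= 0 and g(hi) <= 0 since f maps into [0,1]
for _ in range(60):            # each step halves the bracket
    mid = (lo + hi) / 2
    if f(mid) - mid > 0:
        lo = mid
    else:
        hi = mid

x_star = (lo + hi) / 2
print(x_star, f(x_star))       # both ~0.739085, the fixed point of cos
```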

15 Illustration of Brouwer's fixed-point theorem
[Figure: graph of a continuous $f : [0,1] \to [0,1]$ crossing the diagonal $y = x$ at $x_1$, $x_2$, and $x_3$; $x_1$, $x_2$, and $x_3$ are all fixed points]

16 Outline
- Limits of real number sequences
- A fixed-point theorem
- Probability and random processes
  - Probability model
  - Random variable
  - Expectations
  - Useful inequalities
  - Random process/sequence
  - Convergence concepts
  - Laws of large numbers & central limit theorem
  - Stationarity and ergodicity

17 Probability model/experiment
- Sample space ($S$): set of possible outcomes/sample-points
- Set of events ($F$): an event is a subset of the sample space
- Rule of assigning probabilities to events ($P$)
- Q: what is the probability model of two independent flips of a fair coin?
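
As a sketch of an answer (mine, not the slides'): the sample space is {HH, HT, TH, TT}, the event set is all 16 subsets of it, and each outcome has probability 1/4. The snippet below builds that model explicitly; all names are illustrative.

```python
from itertools import combinations, product

# Sketch: the probability model for two independent fair coin flips.
S = list(product("HT", repeat=2))          # sample space: HH, HT, TH, TT

# Event set F: all subsets of S (the power set, a finite sigma-field).
F = [frozenset(c) for r in range(len(S) + 1) for c in combinations(S, r)]

def P(event):                              # probability rule: equally likely outcomes
    return len(event) / len(S)

print(len(F))                              # 16 events
print(P(frozenset(S)))                     # normalization: P(S) = 1.0
print(P(frozenset({("H", "H")})))          # P(HH) = 0.25
```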

18 Axioms of probability
- Normalization: $P(S) = 1$
- Monotonicity: $B \subseteq A \Rightarrow P(B) \le P(A)$
- Additivity (disjoint events): if $A \cap B = \varnothing$, then $P(A \cup B) = P(A) + P(B)$
- Note: the set of events should form a sigma ($\sigma$)-field so as to ensure the existence of a probability function $P$ satisfying the above axioms

19 Field & sigma-field
- A set of sets is a field if it is closed under union, intersection, and complement, and the empty set and the universal set are elements of the set
- A field is a $\sigma$-field if it is closed under any countable set of unions, intersections, and their combinations

20 Conditional probability
- For any two events $A$ and $B$ with $P(B) > 0$, the conditional probability of $A$, conditional on $B$, is $P(A \mid B) = P(AB)/P(B)$
- Two events $A$ and $B$ are independent if $P(AB) = P(A)P(B)$
  - For $P(B) > 0$, this is equivalent to $P(A \mid B) = P(A)$
- Two events $A$ and $B$ are conditionally independent given $C$ if $P(AB \mid C) = P(A \mid C)\,P(B \mid C)$, or equivalently $P(A \mid BC) = P(A \mid C)$
- Q: examples of conditional probability, independent events, conditionally independent events?
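
A small check (my example, not the slides'): with two fair dice, the events "first die is even" and "sum is odd" are independent; the snippet verifies $P(AB) = P(A)P(B)$ by enumeration.

```python
from itertools import product

# Sketch: verify independence of two events by brute-force enumeration.
omega = list(product(range(1, 7), repeat=2))      # two fair dice

A = {w for w in omega if w[0] % 2 == 0}           # first die even
B = {w for w in omega if (w[0] + w[1]) % 2 == 1}  # sum odd

P = lambda E: len(E) / len(omega)
print(P(A & B), P(A) * P(B))   # 0.25 == 0.25 -> independent
print(P(A & B) / P(B))         # conditional P(A|B) = 0.5 = P(A)
```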

21 Outline
- Limits of real number sequences
- A fixed-point theorem
- Probability and random processes
  - Probability model
  - Random variable
  - Expectations
  - Useful inequalities
  - Random process/sequence
  - Convergence concepts
  - Laws of large numbers & central limit theorem
  - Stationarity and ergodicity

22 Random variables
- A random variable $X$ is a function $X : S \to \mathbb{R}$
- Given $X$ and a real number $x$, there is an event $\{X \le x\}$, and $P(X \le x) = P(\{w \in S : X(w) \le x\})$
- $P(X \le x)$ is the distribution function of $X$, usually denoted $F_X(x)$
  - Monotonically nondecreasing
- Complex and vector random variables: map sample points to complex numbers or to vectors in some finite-dimensional vector space

23 Random variables (contd.)
- If $F_X(x)$ has a derivative $f_X(x)$, we call $f_X(x)$ the probability density of $X$, i.e., $f_X(x) = dF_X(x)/dx$
  - If $f_X(x)$ exists and is finite for all $x$, we say $X$ is a continuous r.v.
- If $X$ has only a countable number of possible outcomes $x_1, x_2, \ldots$, we say $X$ is a discrete r.v.
  - The probability of each outcome $x_i$, $\{P_X(x_i) : i \ge 1\}$, is called the probability mass function (PMF) of $X$

24 Random variables (contd.)
- Joint distribution function of r.v.'s $X_1, X_2, \ldots, X_n$:
  $F_{X_1,\ldots,X_n}(x_1, x_2, \ldots, x_n) = P(X_1 \le x_1, X_2 \le x_2, \ldots, X_n \le x_n)$
- Then $F_{X_i}(x_i) = F_{X_1,\ldots,X_n}(\infty, \ldots, \infty, x_i, \infty, \ldots, \infty)$
- Joint probability density:
  $f_{X_1,\ldots,X_n}(x_1, x_2, \ldots, x_n) = \dfrac{\partial^n F(x_1, x_2, \ldots, x_n)}{\partial x_1\, \partial x_2 \cdots \partial x_n}$

25 Random variables (contd.)
- R.v.'s $X_1, X_2, \ldots, X_n$ are independent if, for all $x_1, \ldots, x_n$,
  $F(x_1, \ldots, x_n) = \prod_{i=1}^{n} P(X_i \le x_i)$
- If the density or mass function exists, the above formula is equivalent to a product form for the density or mass function
- Note: pairwise independence does not imply that the entire set is independent (example?)

26 Example of pairwise independence vs. set independence
- Suppose $X$, $Y$, and $Z$ have the following joint probability distribution: each of the four triples $(x, y, z) \in \{(0,0,0),\ (0,1,1),\ (1,0,1),\ (1,1,0)\}$ has probability 1/4 (i.e., $X$ and $Y$ are independent fair bits and $Z = X \oplus Y$)
- Then $X$ and $Y$ are independent, $X$ and $Z$ are independent, and $Y$ and $Z$ are independent; but $X$, $Y$, and $Z$ are not independent, since any of them is just the mod-2 sum of the other two, and so is completely determined by the other two.
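
The joint distribution is easy to regenerate in code. The snippet below (illustrative, not from the slides) enumerates the four equally likely triples and confirms pairwise but not mutual independence.

```python
from itertools import product

# Sketch: X, Y independent fair bits, Z = X XOR Y.
triples = [(x, y, x ^ y) for x, y in product((0, 1), repeat=2)]
p = 1 / len(triples)                     # each triple has probability 1/4

def P(pred):                             # probability of the event {pred is true}
    return sum(p for t in triples if pred(t))

for i, j in [(0, 1), (0, 2), (1, 2)]:    # every pair is independent
    ok = all(P(lambda t: t[i] == a and t[j] == b) ==
             P(lambda t: t[i] == a) * P(lambda t: t[j] == b)
             for a in (0, 1) for b in (0, 1))
    print(ok)                            # True, True, True

# ...but the triple is not independent: P(X=0, Y=0, Z=0) = 1/4 != 1/8
print(P(lambda t: t == (0, 0, 0)), 0.5 ** 3)
```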

27 Outline
- Limits of real number sequences
- A fixed-point theorem
- Probability and random processes
  - Probability model
  - Random variable
  - Expectations, moment-generating functions, characteristic functions
  - Useful inequalities
  - Convergence concepts
  - Laws of large numbers & central limit theorem
  - Stationarity and ergodicity

28 Expectations
- The expected value (or the mean) of a r.v. $X$:
  $E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$, or $E[X] = \sum_x x P_X(x)$
- For simplicity, write as $E[X] = \int x\,dF_X(x)$
- For non-negative random variables,
  $E[X] = \int_0^{\infty} P(X > x)\,dx$ if $X$ is continuous, or
  $E[X] = \sum_{x=1}^{\infty} P(X \ge x)$ if $X$ is integer-valued and discrete
- Exercise: prove them

29 Expectations (contd.)
- If $Y = g(X)$, then $E[Y] = \int y\,dF_Y(y) = \int g(x)\,dF_X(x)$
- Moments $E[X^n]$, central moments $E[(X - E[X])^n]$
  - $\mathrm{VAR}(X)$ (or $\sigma_X^2$) $= E[(X - E[X])^2] = E[X^2] - (E[X])^2$
  - Standard deviation $\sigma_X$

30 Z = X + Y
- If $X$ and $Y$ are independent,
  $F_Z(z) = \int F_X(z - y)\,dF_Y(y) = \int F_Y(z - x)\,dF_X(x)$
- And if $X$ and $Y$ both have densities,
  $f_Z(z) = \int f_X(z - y)\,f_Y(y)\,dy = \int f_Y(z - x)\,f_X(x)\,dx$ (convolution)
- Let $S_n = X_1 + X_2 + \cdots + X_n$
  - Whether or not $X_1, X_2, \ldots, X_n$ are independent, $E[S_n] = E[X_1] + E[X_2] + \cdots + E[X_n]$ (?)
  - If $X_1, X_2, \ldots, X_n$ are independent, $E\left[\prod_{i=1}^{n} X_i\right] = \prod_{i=1}^{n} E[X_i]$ (?)
  - If $X_1, X_2, \ldots, X_n$ are independent, $\sigma_{S_n}^2 = \sum_{i=1}^{n} \sigma_{X_i}^2$ (?)
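
A numerical sanity check (my sketch, assuming NumPy is available): the PMF of the sum of two independent fair dice is the convolution of their PMFs, and variances add under independence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Convolution: PMF of Z = X + Y for two independent fair dice.
pmf_die = np.ones(6) / 6
pmf_sum = np.convolve(pmf_die, pmf_die)   # supported on 2..12
print(pmf_sum[5])                         # P(Z = 7) = 6/36 ~ 0.1667

# Variance additivity under independence, checked by simulation.
x = rng.integers(1, 7, size=100_000)
y = rng.integers(1, 7, size=100_000)
print(np.var(x + y), np.var(x) + np.var(y))   # both ~ 5.83
```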

31 Example: overhead in bit-stuffing
- Bit-stuffing in data-link framing [figure omitted in transcription]
- Q: expected number of inserted bits in a string of length n?

32 Solution
- Let $X_i = 1$ if an insertion occurred after the $i$-th bit, and $X_i = 0$ otherwise
- Note: $E[X_i] = 0$ for $i \le 5$, and $E[X_i] = 2^{-6}$ otherwise
- Number of bits inserted: $S_n = X_1 + X_2 + \cdots + X_n$
- Thus, $E[S_n] = (n - 5)\,2^{-6}$
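
A Monte Carlo check of this expectation (my sketch, not from the slides). It assumes the stuffing convention consistent with the slide's per-position probability of $2^{-6}$: for an i.i.d. fair bit string, an insertion occurs after bit $i$ iff bits $i-5, \ldots, i$ form the pattern 011111.

```python
import random

random.seed(1)

# Sketch: Monte Carlo estimate of the expected number of stuffed bits
# in a random string of n i.i.d. fair bits, under the assumed rule that
# a bit is stuffed after each occurrence of the pattern 011111.
n, trials = 1000, 10_000
pattern = [0, 1, 1, 1, 1, 1]

total = 0
for _ in range(trials):
    bits = [random.getrandbits(1) for _ in range(n)]
    total += sum(bits[i - 5:i + 1] == pattern for i in range(5, n))

print(total / trials, (n - 5) * 2 ** -6)   # both ~ 15.55
```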

33 Moments
- $r$-th moment of a r.v. $X$: $\xi_r = E[X^r] = \int_{-\infty}^{\infty} x^r f_X(x)\,dx$, where $r = 0, 1, 2, 3, \ldots$
  - If $X$ is a discrete r.v., $\xi_r = \sum_i x_i^r P_X(x_i)$, where $r = 0, 1, 2, 3, \ldots$
- $r$-th central moment of $X$: $m_r = E[(X - \mu)^r]$, where $r = 0, 1, 2, 3, \ldots$
  - If $X$ is a discrete r.v., $m_r = \sum_i (x_i - \mu)^r P_X(x_i)$, where $r = 0, 1, 2, 3, \ldots$

34 Moments (contd.)
- In general, $m_r = \sum_{i=0}^{r} \binom{r}{i}\,\xi_i\,(-\mu)^{r-i}$
- Joint moments: the $ij$-th joint moment of $X$ and $Y$ is
  $\xi_{ij} = E[X^i Y^j] = \int\!\!\int x^i y^j f_{XY}(x, y)\,dx\,dy$
- If $X$ and $Y$ are discrete,
  $\xi_{ij} = \sum_l \sum_m x_l^i\, y_m^j\, P_{XY}(x_l, y_m)$

35 Moments (contd.)
- $ij$-th joint central moment of $X$ and $Y$: $m_{ij} = E[(X - \bar{X})^i (Y - \bar{Y})^j]$
- $\xi_{11}$ and $m_{11}$ are known as the correlation and covariance of $X$ and $Y$, respectively
- Correlation coefficient of $X$ and $Y$: $\rho_{XY} = \dfrac{m_{11}}{\sqrt{m_{20}\, m_{02}}} = \dfrac{\mathrm{COV}[X, Y]}{\sigma_X\, \sigma_Y}$
- $X$ and $Y$ are uncorrelated if $\mathrm{COV}[X, Y] = 0$ (i.e., $\rho_{XY} = 0$)
- If $X$ and $Y$ are uncorrelated, then $\sigma_{X+Y}^2 = \sigma_X^2 + \sigma_Y^2$
- If $X$ and $Y$ are independent, then they are also uncorrelated; but not vice versa, except for the case when $X$ and $Y$ are jointly Gaussian

36 Moment generating function (MGF)
- The moment generating function (MGF), if it exists, of a r.v. $X$ is
  $\theta_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx$, where $t$ is a complex variable
- Except for a sign reversal in the exponent, the MGF is the two-sided Laplace transform, for which there is a known inversion formula; thus, in general, knowing $\theta_X(t)$ is equivalent to knowing $f_X(x)$ and vice versa
- If $X$ is discrete, $\theta_X(t) = E[e^{tX}] = \sum_i e^{t x_i} P_X(x_i)$

37 MGF (contd.)
- Benefits of using $\theta_X(t)$:
  - It enables convenient computation of the moments of $X$
  - It can be used to estimate $f_X(x)$ from experimental measurements of the moments
  - It can be used to solve problems involving sums of r.v.'s
    - E.g., if $Z = X_1 + X_2$ with $X_1$, $X_2$ independent and having MGFs, then $\theta_Z(t) = \theta_{X_1}(t)\,\theta_{X_2}(t)$
  - It is an important analytical instrument that can be used to demonstrate basic results such as the central limit theorem

38 Moments & MGF
- Note that
  $\theta_X(t) = E[e^{tX}] = E\!\left[1 + tX + \frac{(tX)^2}{2!} + \cdots + \frac{(tX)^n}{n!} + \cdots\right] = 1 + t\mu + \xi_2 \frac{t^2}{2!} + \cdots + \xi_n \frac{t^n}{n!} + \cdots$
- Thus, if $\theta_X(t)$ exists,
  $\xi_k = \theta_X^{(k)}(0) = \left.\frac{d^k}{dt^k}\theta_X(t)\right|_{t=0}$, $k = 0, 1, \ldots$
- Note: if all the moments exist and are known, then $\theta_X(t)$ can be computed, based on which $f_X(x)$ can, in principle, be derived through the (inverse) Laplace transform
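
As a quick check (my sketch, assuming SymPy is available), differentiating the exponential distribution's MGF $\theta(t) = \mu/(\mu - t)$ at $t = 0$ reproduces $\xi_1 = 1/\mu$ and $\xi_2 = 2/\mu^2$.

```python
import sympy as sp

t, mu = sp.symbols("t mu", positive=True)

# MGF of an exponential r.v. with parameter mu (valid for t < mu).
theta = mu / (mu - t)

# k-th moment = k-th derivative of the MGF evaluated at t = 0.
for k in (1, 2):
    print(sp.diff(theta, t, k).subs(t, 0))   # 1/mu, then 2/mu**2
```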

39 Moments & MGF (contd.)
- In practice, if $X$ is the r.v. whose pdf is desired and $X_i$ represents the $i$-th observation of $X$, then we can estimate the $r$-th moment of $X$ as $\frac{1}{n}\sum_{i=1}^{n} X_i^r$

40 MGF for multiple variables
- MGF $\theta_{XY}(t_1, t_2)$ of two r.v.'s $X$ and $Y$:
  $\theta_{XY}(t_1, t_2) = E[e^{t_1 X + t_2 Y}] = \int\!\!\int \exp(t_1 x + t_2 y)\, f_{XY}(x, y)\,dx\,dy$
- Through power series expansion,
  $\theta_{XY}(t_1, t_2) = \sum_i \sum_j \xi_{ij} \frac{t_1^i\, t_2^j}{i!\,j!}$
- Thus
  $\xi_{ln} = \left.\frac{\partial^{l+n}}{\partial t_1^l\, \partial t_2^n}\, \theta_{XY}(t_1, t_2)\right|_{(t_1, t_2) = (0,0)}$

41 MGF for multiple variables (contd.)
- MGF for $N$ r.v.'s $X_1, \ldots, X_N$:
  $\theta_{X_1 \cdots X_N}(t_1, t_2, \ldots, t_N) = E\!\left[\exp\!\left(\sum_{i=1}^{N} t_i X_i\right)\right] = \sum_{k_1=0}^{\infty} \cdots \sum_{k_N=0}^{\infty} E[X_1^{k_1} \cdots X_N^{k_N}]\, \frac{t_1^{k_1} \cdots t_N^{k_N}}{k_1! \cdots k_N!}$

42 Chernoff bound
- Given a r.v. $X$ and a constant $a$, $P[X \ge a] \le e^{-at}\,\theta_X(t)$ for any $t \ge 0$
- The tightest bound, which occurs when the right-hand side is minimized w.r.t. $t$, is called the Chernoff bound
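
For a standard normal, $\theta(t) = e^{t^2/2}$, and minimizing $e^{-at + t^2/2}$ over $t$ gives $t^* = a$, i.e., the bound $e^{-a^2/2}$. The sketch below (mine, assuming SciPy is available) minimizes numerically and compares with the exact tail.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

a = 2.0

# Chernoff bound for a standard normal: minimize exp(-a*t) * MGF(t)
# over t > 0, where MGF(t) = exp(t**2 / 2).
res = minimize_scalar(lambda t: np.exp(-a * t + t ** 2 / 2),
                      bounds=(1e-6, 10), method="bounded")

print(res.x)        # optimal t ~ a = 2
print(res.fun)      # bound exp(-a**2/2) ~ 0.1353
print(norm.sf(a))   # exact tail P(X >= 2) ~ 0.0228
```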

43 Characteristic functions (CF)
- The CF of a r.v. $X$ is $\Phi_X(w) = E[e^{jwX}] = \int_{-\infty}^{\infty} e^{jwx} f_X(x)\,dx$, where $j = \sqrt{-1}$
- Except for a sign reversal in the exponent, the CF is the Fourier transform of $f_X(x)$
- Always exists
- For our purpose, the CF is the same as the MGF
- If $X$ is discrete, $\Phi_X(w) = E[e^{jwX}] = \sum_i e^{jwx_i} P_X(x_i)$
- The CF is widely used in statistical communication theory, and is widely used to compute the sums of independent r.v.'s; e.g., if $Z = X_1 + X_2$ with $X_1$, $X_2$ independent, then $\Phi_Z(w) = \Phi_{X_1}(w)\,\Phi_{X_2}(w)$

44 Moments & CF
- Similar to the MGF, $\Phi_X(w) = \sum_{n=0}^{\infty} \xi_n \frac{(jw)^n}{n!}$
- Thus $\xi_n = \frac{1}{j^n}\,\Phi_X^{(n)}(0)$, where $\Phi_X^{(n)}(0) = \left.\frac{d^n}{dw^n}\Phi_X(w)\right|_{w=0}$

45 Joint characteristic functions
- CF for $N$ r.v.'s $X_1, \ldots, X_N$:
  $\Phi_{X_1 \cdots X_N}(w_1, w_2, \ldots, w_N) = E\!\left[\exp\!\left(j \sum_{i=1}^{N} w_i X_i\right)\right]$
- For two r.v.'s $X$ and $Y$:
  $\xi_{rk} = E[X^r Y^k] = \frac{1}{j^{r+k}}\,\Phi_{XY}^{(r,k)}(0, 0)$, where
  $\Phi_{XY}^{(r,k)}(0, 0) = \left.\frac{\partial^{r+k}}{\partial w_1^r\, \partial w_2^k}\,\Phi_{XY}(w_1, w_2)\right|_{(w_1, w_2) = (0,0)}$

46 Table of commonly-used r.v.'s
[Table omitted in transcription]

47 Outline
- Limits of real number sequences
- A fixed-point theorem
- Probability and random processes
  - Probability model
  - Random variable
  - Expectations
  - Useful inequalities
  - Random process/sequence
  - Convergence concepts
  - Laws of large numbers & central limit theorem
  - Stationarity and ergodicity

48 Markov Inequality
- If a non-negative r.v. $Y$ has a mean $E[Y]$, then $P(Y \ge y) \le \dfrac{E[Y]}{y}$, $\forall y > 0$
- Proof?
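
A quick empirical check (my sketch, assuming NumPy is available): for an exponential r.v. with mean 1, the tail $P(Y \ge y)$ indeed sits below $E[Y]/y$.

```python
import numpy as np

rng = np.random.default_rng(0)
y_samples = rng.exponential(scale=1.0, size=1_000_000)   # E[Y] = 1

for y in (1.0, 2.0, 5.0):
    tail = np.mean(y_samples >= y)   # empirical P(Y >= y)
    print(y, tail, 1.0 / y)          # tail <= E[Y]/y (Markov)
```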

49 Chebyshev Inequality
- Let $Z$ be an arbitrary r.v. with finite mean $E[Z]$ and finite variance $\sigma_Z^2$; then
  $P(|Z - E[Z]| \ge \varepsilon) \le \dfrac{\sigma_Z^2}{\varepsilon^2}$, $\forall \varepsilon > 0$
- Proof? Let $Y = (Z - E[Z])^2$; then, according to the Markov Inequality,
  $P((Z - E[Z])^2 \ge y) \le \dfrac{\sigma_Z^2}{y}$
  Replacing $y$ with $\varepsilon^2$, we get the Chebyshev Inequality

50 Jensen's Inequality
- Let $\mathcal{C} \subseteq \mathbb{R}^n$ be a convex set; a function $f : \mathcal{C} \to \mathbb{R}$ is convex if $\forall x_1, x_2 \in \mathcal{C}$ and $\forall \lambda \in [0, 1]$,
  $f(\lambda x_1 + (1-\lambda) x_2) \le \lambda f(x_1) + (1-\lambda) f(x_2)$
- Jensen's Inequality: for a convex function $f(x)$ and a r.v. $X$ with finite expectation, $E[f(X)] \ge f(E[X])$
- Equality holds if 1) $f(\cdot)$ is a linear function, or 2) $X$ is constant with probability 1

51 Hölder's Inequality
- For $p > 1$ and $1/p + 1/q = 1$,
  $E(|XY|) \le (E(|X|^p))^{1/p}\,(E(|Y|^q))^{1/q}$
- E.g., taking $p = q = 2$, Hölder's Inequality becomes Schwarz's Inequality:
  $(E(XY))^2 \le E(X^2)\,E(Y^2)$

52 Outline
- Limits of real number sequences
- A fixed-point theorem
- Probability and random processes
  - Probability model
  - Random variable
  - Expectations
  - Useful inequalities
  - Random process/sequence
  - Convergence concepts
  - Laws of large numbers & central limit theorem
  - Stationarity and ergodicity

53 Random process and random sequence
- Let $(S, F, P)$ be a probability space. $X$ is called a random process if $X(t, \zeta)$ is a r.v. on the probability space for each $t \in \mathbb{R}$, where $\zeta$ is the sample point (specifying a certain outcome/event)
  - For each fixed $\zeta$, $X(t, \zeta)$ is an ordinary, deterministic time function (a sample function)
  - For a fixed $t$, $X(t, \zeta)$ is a r.v.
  - Usually written as $X(t)$ or $X_t$
- Discrete-time version of a random process: $X[n, \zeta]$ ($n \in \mathbb{Z}$) is called a random sequence; usually written as $X[n]$ or $X_n$
- A random process/sequence is statistically specified by the joint distributions of all the combinations of r.v.'s of the process/sequence

54 Random process/sequence
- Mean function: $\mu_X(t) = E[X(t)]$
- Correlation function: $R_X(t_1, t_2) = E[X(t_1)\,X^*(t_2)]$
- Covariance function: $K_X(t_1, t_2) = E[(X(t_1) - \mu_X(t_1))\,(X(t_2) - \mu_X(t_2))^*]$
- Note: $K_X(t_1, t_2) = R_X(t_1, t_2) - \mu_X(t_1)\,\mu_X^*(t_2)$

55 Some elementary random sequences/processes
- Bernoulli process
- Gaussian process
- Poisson process

56 Bernoulli process
- A sequence of i.i.d. Bernoulli trials
- Formally: a discrete-time random process (i.e., random sequence) consisting of a finite or infinite sequence of independent r.v.'s $X[n]$, $n = 1, 2, \ldots$, such that
  - For each $i$, the value of $X[i]$ is either 0 or 1
  - For all values of $i$, the probability that $X[i] = 1$ is the same number $p$
- E.g., a sequence of independent coin tosses

57 Gaussian process (sequence)
- A random process (sequence) all of whose $n$-th order joint pdf's (or, more generally, PDF's) are Gaussian
- $\mu_X(t)$ and $K_X(u, v)$, $-\infty < t, u, v < \infty$, are enough information to completely specify the r.p.

58 Poisson process
- Exponential distribution
- Memoryless property
- Poisson distribution
- Poisson process

59 Exponential Distribution
- A continuous r.v. $X$ follows the exponential distribution with parameter $\mu$ if its pdf is:
  $f_X(x) = \mu e^{-\mu x}$ if $x \ge 0$, and $0$ if $x < 0$
- => Probability distribution function:
  $F_X(x) = P\{X \le x\} = 1 - e^{-\mu x}$ if $x \ge 0$, and $0$ if $x < 0$
- Usually used for modeling service time
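
The closed-form CDF makes inverse-transform sampling immediate: if $U$ is uniform on $(0, 1)$, then $X = -\ln(1 - U)/\mu$ is exponential with parameter $\mu$. A minimal sketch (mine, not from the slides):

```python
import math
import random

random.seed(0)
mu = 2.0

# Inverse-transform sampling: solve F(x) = u for x, i.e. x = -ln(1-u)/mu.
samples = [-math.log(1 - random.random()) / mu for _ in range(100_000)]

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)   # ~ 1/mu = 0.5 and ~ 1/mu**2 = 0.25
```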

60 Exponential Distribution (contd.)
- Mean and variance: $E[X] = \dfrac{1}{\mu}$, $\mathrm{Var}(X) = \dfrac{1}{\mu^2}$
- Proof:
  $E[X] = \int_0^{\infty} x f_X(x)\,dx = \int_0^{\infty} x \mu e^{-\mu x}\,dx = \left[-x e^{-\mu x}\right]_0^{\infty} + \int_0^{\infty} e^{-\mu x}\,dx = \frac{1}{\mu}$
  $E[X^2] = \int_0^{\infty} x^2 \mu e^{-\mu x}\,dx = \left[-x^2 e^{-\mu x}\right]_0^{\infty} + \int_0^{\infty} 2x e^{-\mu x}\,dx = \frac{2}{\mu} E[X] = \frac{2}{\mu^2}$
  $\mathrm{Var}(X) = E[X^2] - (E[X])^2 = \frac{2}{\mu^2} - \frac{1}{\mu^2} = \frac{1}{\mu^2}$

61 Memoryless Property
- Past history has no influence on the future: $P\{X > x + t \mid X > t\} = P\{X > x\}$
- Proof:
  $P\{X > x + t \mid X > t\} = \frac{P\{X > x + t,\ X > t\}}{P\{X > t\}} = \frac{P\{X > x + t\}}{P\{X > t\}} = \frac{e^{-\mu(x+t)}}{e^{-\mu t}} = e^{-\mu x} = P\{X > x\}$
- Exponential: the only continuous distribution with the memoryless property

62 Poisson Distribution
- A discrete r.v. $X$ follows the Poisson distribution with parameter $\lambda$ if its probability mass function is:
  $P\{X = k\} = e^{-\lambda} \frac{\lambda^k}{k!}$, $k = 0, 1, 2, \ldots$
- Wide applicability in modeling the number of random events that occur during a given time interval (=> Poisson process):
  - Customers that arrive at a post office during a day
  - Wrong phone calls received during a week
  - Students that go to the instructor's office during office hours
  - Packets that arrive at a network switch
  - etc.

63 Poisson Distribution (contd.)
- Mean and variance: $E[X] = \lambda$, $\mathrm{Var}(X) = \lambda$
- Proof:
  $E[X] = \sum_{k=0}^{\infty} k\,P\{X = k\} = e^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^k}{(k-1)!} = e^{-\lambda} \lambda \sum_{j=0}^{\infty} \frac{\lambda^j}{j!} = e^{-\lambda} \lambda e^{\lambda} = \lambda$
  $E[X^2] = \sum_{k=0}^{\infty} k^2\,P\{X = k\} = e^{-\lambda} \sum_{k=1}^{\infty} k \frac{\lambda^k}{(k-1)!} = \lambda e^{-\lambda} \sum_{j=0}^{\infty} (j+1) \frac{\lambda^j}{j!} = \lambda e^{-\lambda} \left(\sum_{j=0}^{\infty} j \frac{\lambda^j}{j!} + \sum_{j=0}^{\infty} \frac{\lambda^j}{j!}\right) = \lambda(\lambda + 1) = \lambda^2 + \lambda$
  $\mathrm{Var}(X) = E[X^2] - (E[X])^2 = \lambda^2 + \lambda - \lambda^2 = \lambda$

64 Sum of Poisson Random Variables
- $X_i$, $i = 1, 2, \ldots, n$, are independent r.v.'s; $X_i$ follows the Poisson distribution with parameter $\lambda_i$
- The sum $S_n = X_1 + X_2 + \cdots + X_n$ follows the Poisson distribution with parameter $\lambda = \lambda_1 + \lambda_2 + \cdots + \lambda_n$

65 Sum of Poisson Random Variables (contd.)
- Proof: for $n = 2$ (generalization by induction). The PMF of $S = X_1 + X_2$ is
  $P\{S = m\} = \sum_{k=0}^{m} P\{X_1 = k,\ X_2 = m - k\} = \sum_{k=0}^{m} P\{X_1 = k\}\,P\{X_2 = m - k\}$
  $= \sum_{k=0}^{m} e^{-\lambda_1} \frac{\lambda_1^k}{k!}\; e^{-\lambda_2} \frac{\lambda_2^{m-k}}{(m-k)!} = \frac{e^{-(\lambda_1 + \lambda_2)}}{m!} \sum_{k=0}^{m} \frac{m!}{k!\,(m-k)!}\, \lambda_1^k\, \lambda_2^{m-k} = e^{-(\lambda_1 + \lambda_2)}\, \frac{(\lambda_1 + \lambda_2)^m}{m!}$
  i.e., Poisson with parameter $\lambda_1 + \lambda_2$

66 Sampling a Poisson Variable
- $X$ follows the Poisson distribution with parameter $\lambda$
- Each of the $X$ arrivals is of type $i$ with probability $p_i$, $i = 1, 2, \ldots, n$, independent of other arrivals; $p_1 + p_2 + \cdots + p_n = 1$
- $X_i$ denotes the number of type-$i$ arrivals; then
  - $X_1, X_2, \ldots, X_n$ are independent
  - $X_i$ follows the Poisson distribution with parameter $\lambda_i = \lambda p_i$

67 Sampling a Poisson Variable (contd.)
- Proof: for $n = 2$ (generalize by induction). Joint PMF:
  $P\{X_1 = k_1,\ X_2 = k_2\} = P\{X_1 = k_1,\ X_2 = k_2 \mid X = k_1 + k_2\}\,P\{X = k_1 + k_2\}$
  $= \binom{k_1 + k_2}{k_1}\, p_1^{k_1}\, p_2^{k_2}\; e^{-\lambda} \frac{\lambda^{k_1 + k_2}}{(k_1 + k_2)!} = \frac{1}{k_1!\,k_2!}\,(\lambda p_1)^{k_1}\,(\lambda p_2)^{k_2}\, e^{-\lambda(p_1 + p_2)}$
  $= e^{-\lambda p_1} \frac{(\lambda p_1)^{k_1}}{k_1!}\; e^{-\lambda p_2} \frac{(\lambda p_2)^{k_2}}{k_2!}$
- Hence $X_1$ and $X_2$ are independent, with
  $P\{X_1 = k_1\} = e^{-\lambda p_1} \frac{(\lambda p_1)^{k_1}}{k_1!}$ and $P\{X_2 = k_2\} = e^{-\lambda p_2} \frac{(\lambda p_2)^{k_2}}{k_2!}$
  i.e., $X_i$ follows the Poisson distribution with parameter $\lambda p_i$

68 Poisson Approximation to Binomial
- Binomial distribution with parameters $(n, p)$: $P\{X = k\} = \binom{n}{k} p^k (1-p)^{n-k}$
- As $n \to \infty$ and $p \to 0$, with $np = \lambda$ moderate, the binomial distribution converges to Poisson with parameter $\lambda$
- Proof:
  $P\{X = k\} = \binom{n}{k} p^k (1-p)^{n-k} = \frac{(n-k+1)\cdots(n-1)\,n}{k!}\, \frac{\lambda^k}{n^k} \left(1 - \frac{\lambda}{n}\right)^{n-k}$
  Since $\frac{(n-k+1)\cdots(n-1)\,n}{n^k} \to 1$, $\left(1 - \frac{\lambda}{n}\right)^{n} \to e^{-\lambda}$, and $\left(1 - \frac{\lambda}{n}\right)^{-k} \to 1$, we get
  $P\{X = k\} \to e^{-\lambda} \frac{\lambda^k}{k!}$
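
Numerically (my sketch, assuming SciPy is available), the Binomial$(n, \lambda/n)$ PMF is already close to the Poisson$(\lambda)$ PMF for moderate $n$:

```python
from scipy.stats import binom, poisson

lam, n = 3.0, 200                     # np = lambda, p = lambda/n small

for k in range(6):
    print(k,
          binom.pmf(k, n, lam / n),   # Binomial(n, lambda/n)
          poisson.pmf(k, lam))        # Poisson(lambda); nearly equal
```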

69 Poisson Process with Rate λ
- $\{A(t) : t \ge 0\}$ counting process
  - $A(t)$ is the number of events (arrivals) that have occurred from time 0 to time $t$, with $A(0) = 0$
  - $A(t) - A(s)$: number of arrivals in the interval $(s, t]$
- Numbers of arrivals in disjoint intervals are independent
- The number of arrivals in any interval $(t, t+\tau]$ of length $\tau$
  - depends only on its length $\tau$
  - follows the Poisson distribution with parameter $\lambda\tau$:
    $P\{A(t+\tau) - A(t) = n\} = e^{-\lambda\tau} \frac{(\lambda\tau)^n}{n!}$, $n = 0, 1, \ldots$
- => Average number of arrivals is $\lambda\tau$; $\lambda$ is the arrival rate

70 Interarrival-Time Statistics
- Interarrival times for a Poisson process are independent and follow the exponential distribution with parameter $\lambda$:
  $P\{\tau_n \le s\} = 1 - e^{-\lambda s}$, $s \ge 0$
  where $t_n$ is the time of the $n$-th arrival and $\tau_n = t_{n+1} - t_n$ is the $n$-th interarrival time
- Proof: probability distribution function
  $P\{\tau_n \le s\} = 1 - P\{\tau_n > s\} = 1 - P\{A(t_n + s) - A(t_n) = 0\} = 1 - e^{-\lambda s}$
  Independence follows from the independence of the numbers of arrivals in disjoint intervals
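
Putting the last two slides together (my sketch, assuming NumPy is available): generate a Poisson process by summing exponential interarrival times, then check that the count in $[0, T]$ has mean (and variance) about $\lambda T$.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, T, runs = 2.0, 10.0, 10_000

counts = []
for _ in range(runs):
    # Arrival times = cumulative sums of Exp(lam) interarrival times.
    gaps = rng.exponential(scale=1 / lam, size=int(5 * lam * T))
    arrivals = np.cumsum(gaps)
    counts.append(np.searchsorted(arrivals, T))   # A(T): arrivals in [0, T]

print(np.mean(counts), lam * T)   # mean count ~ 20
print(np.var(counts))             # variance also ~ 20 (Poisson)
```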

71 Small Interval Probabilities
- For an interval $(t, t+\delta]$ of length $\delta$:
  $P\{A(t+\delta) - A(t) = 0\} = 1 - \lambda\delta + o(\delta)$
  $P\{A(t+\delta) - A(t) = 1\} = \lambda\delta + o(\delta)$
  $P\{A(t+\delta) - A(t) \ge 2\} = o(\delta)$
- Proof: expand the Poisson probabilities $e^{-\lambda\delta} (\lambda\delta)^n / n!$ to first order in $\delta$

72 Merging & Splitting Poisson Processes
[Diagram: streams with rates $\lambda_1$, $\lambda_2$ merging into one with rate $\lambda_1 + \lambda_2$; a stream with rate $\lambda$ splitting with probabilities $p$, $1-p$ into streams with rates $\lambda p$, $\lambda(1-p)$]
- Merging: $A_1, \ldots, A_k$ independent Poisson processes with rates $\lambda_1, \ldots, \lambda_k$, merged into a single process $A = A_1 + \cdots + A_k$; then $A$ is a Poisson process with rate $\lambda = \lambda_1 + \cdots + \lambda_k$
- Splitting: $A$ a Poisson process with rate $\lambda$, split into processes $A_1$ and $A_2$ independently, with probabilities $p$ and $1-p$ respectively; then $A_1$ is Poisson with rate $\lambda_1 = \lambda p$, and $A_2$ is Poisson with rate $\lambda_2 = \lambda(1-p)$

73 Modeling Arrival Statistics
- The Poisson process is widely used to model packet arrivals in numerous networking problems
- Justification: provides a good model for the aggregate traffic of a large number of independent users
  - $n$ traffic streams, with independent identically distributed (i.i.d.) interarrival times with PDF $F(s)$, not necessarily exponential
  - Arrival rate of each stream: $\lambda/n$
  - As $n \to \infty$, the combined stream can be approximated by Poisson under mild conditions on $F(s)$, e.g., $F(0) = 0$, $F'(0) > 0$
- Most important reason for the Poisson assumption: analytic tractability of queueing models

74 Outline
- Limits of real number sequences
- A fixed-point theorem
- Probability and random processes
  - Probability model
  - Random variable
  - Expectations
  - Useful inequalities
  - Random process/sequence
  - Convergence concepts
  - Laws of large numbers & central limit theorem
  - Stationarity and ergodicity

75 Background
- Consider a random sequence $X_n$, $n \ge 1$
- For each sample path $\zeta$, $X_n(\zeta)$ is a real number sequence, and it may or may not converge
- Objective: to describe the convergence behavior of $X_n$ without requiring convergence for every sample path

76 Convergence in probability
- $X_n$, $n \ge 1$, converges in probability to a r.v. $X$ if, for each $\varepsilon > 0$,
  $\lim_{n\to\infty} \Pr(|X_n - X| > \varepsilon) = 0$
- Written as $X_n \xrightarrow{p} X$

77 Convergence with probability 1
- $X_n$, $n \ge 1$, converges almost surely (or with probability 1) to a r.v. $X$ if
  $\Pr(\lim_{n\to\infty} X_n = X) = 1$
- Or, more explicitly, $\Pr\left(\left\{w : \lim_{n\to\infty} X_n(w) = X(w)\right\}\right) = 1$
- Written as $X_n \xrightarrow{a.s.} X$ or $X_n \xrightarrow{w.p.1} X$

78 Convergence in distribution
- $X_n$, $n \ge 1$, with distributions $F_n(\cdot)$, $n \ge 1$, converges in distribution to a r.v. $X$ with distribution $F(\cdot)$ if
  $\lim_{n\to\infty} F_n(x) = F(x)$
  whenever $x$ is not a point of discontinuity (i.e., a jump point) of $F(\cdot)$
- Written as $X_n \xrightarrow{dist} X$

79 Outline
- Limits of real number sequences
- A fixed-point theorem
- Probability and random processes
  - Probability model
  - Random variable
  - Expectations
  - Useful inequalities
  - Random process/sequence
  - Convergence concepts
  - Laws of large numbers & central limit theorem
  - Stationarity and ergodicity

80 Weak Law of Large Numbers: assuming a finite variance
- Let $X_1, X_2, \ldots, X_n$ be identically distributed, uncorrelated r.v.'s with finite mean $E[X]$ and finite variance $\sigma^2$; let $S_n = X_1 + \cdots + X_n$, and consider the sample average $S_n/n$
- Note $\sigma_{S_n}^2 = n\sigma^2$, thus $\mathrm{VAR}(S_n/n) = \sigma_{S_n}^2/n^2 = \sigma^2/n$; therefore,
  $\lim_{n\to\infty} \mathrm{VAR}(S_n/n) = \lim_{n\to\infty} E[(S_n/n - E[X])^2] = 0$
- i.e., the variance of the sample average goes to 0 as $n$ goes to $\infty$, and the sample average converges to $E[X]$ in the mean square sense

81 Weak Law of Large Numbers: assuming a finite variance (contd.)
- Applying the Chebyshev inequality to the sample average $S_n/n$, we have
  $P(|S_n/n - E[X]| \ge \varepsilon) \le \dfrac{\sigma^2}{n\varepsilon^2}$
- Thus, for any $\varepsilon > 0$, we have
  $\lim_{n\to\infty} P(|S_n/n - E[X]| \ge \varepsilon) = 0$
- i.e., the Weak Law of Large Numbers (with finite variance); $S_n/n$ is said to converge to $E[X]$ in probability
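
A simulation of the $\sigma^2/(n\varepsilon^2)$ bound in action (my sketch, assuming NumPy is available): the fraction of runs in which the sample average of fair coin flips deviates from 0.5 by more than $\varepsilon$ shrinks as $n$ grows, and always sits below the Chebyshev bound.

```python
import numpy as np

rng = np.random.default_rng(0)
eps, runs = 0.05, 2_000

for n in (100, 1_000, 10_000):
    flips = rng.random(size=(runs, n)) < 0.5         # fair coin, E[X] = 0.5
    sample_avg = flips.mean(axis=1)
    freq = np.mean(np.abs(sample_avg - 0.5) >= eps)  # empirical deviation prob.
    bound = 0.25 / (n * eps ** 2)                    # Chebyshev: sigma^2 = 1/4
    print(n, freq, bound)                            # freq <= bound, both -> 0
```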

82 Weak Law of Large Numbers: without assuming finite variance
- Let $X_1, X_2, \ldots, X_n$ be identically distributed, uncorrelated r.v.'s with finite mean $E[X]$; let $S_n = X_1 + \cdots + X_n$. Then, for any $\varepsilon > 0$,
  $\lim_{n\to\infty} P(|S_n/n - E[X]| \ge \varepsilon) = 0$
  (proof skipped)

83 Strong Law of Large Numbers (ver. 1)
- Let $X_1, X_2, \ldots, X_n$ be i.i.d. r.v.'s with finite mean $E[X]$; let $S_n = X_1 + \cdots + X_n$. Then, for any $\varepsilon > 0$,
  $\lim_{n\to\infty} P\left(\sup_{m \ge n} \left|\frac{S_m}{m} - E[X]\right| \ge \varepsilon\right) = 0$
- That is, the sequence $S_1, S_2/2, S_3/3, \ldots$ has the property that it is increasingly unlikely (with increasing $n$) for any term in the sequence beyond $n$ to deviate from the mean by more than $\varepsilon$
- Note: the weak law says only that it is increasingly unlikely for $S_n/n$ itself to deviate from the mean by $\varepsilon$

84 Strong Law of Large Numbers (ver. 2): commonly used
- Let $X_1, X_2, \ldots, X_n$ be i.i.d. r.v.'s with finite mean $E[X]$; let $S_n = X_1 + \cdots + X_n$. Then, with probability 1,
  $\lim_{n\to\infty} \frac{S_n}{n} = E[X]$, i.e., $P\left(\lim_{n\to\infty} \frac{S_n}{n} = E[X]\right) = 1$
- Convergence with probability 1, or almost-sure convergence

85 Central limit theorem
- $X_n$, $n \ge 1$, is a sequence of i.i.d. r.v.'s with finite mean $\mu$ and finite variance $\sigma^2$. Then
  $\frac{1}{\sigma\sqrt{n}} \sum_{k=1}^{n} (X_k - \mu) \xrightarrow{dist} N(0, 1)$
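
A quick look at the CLT in action (my sketch, assuming NumPy is available): standardized sums of i.i.d. uniform r.v.'s already match standard-normal probabilities well for moderate $n$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, runs = 50, 100_000

# Uniform(0,1) has mu = 1/2 and sigma^2 = 1/12.
x = rng.random(size=(runs, n))
z = (x.sum(axis=1) - n * 0.5) / np.sqrt(n / 12)  # standardized sums

# Empirical CDF values vs. the standard normal Phi(0), Phi(1), Phi(2).
print(np.mean(z <= 0.0))   # ~ 0.5000
print(np.mean(z <= 1.0))   # ~ 0.8413
print(np.mean(z <= 2.0))   # ~ 0.9772
```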

86 Outline
- Limits of real number sequences
- A fixed-point theorem
- Probability and random processes
  - Probability model
  - Random variable
  - Expectations
  - Useful inequalities
  - Random process/sequence
  - Convergence concepts
  - Laws of large numbers & central limit theorem
  - Stationarity and ergodicity

87 Strict stationarity
- $X_n$, $n \ge 1$, is strictly stationary (or stationary) if, for all $k \ge 1$, all indices $n_1, n_2, \ldots, n_k$, and all $m$,
  $(X_{n_1}, X_{n_2}, \ldots, X_{n_k}) \overset{dist}{=} (X_{n_1+m}, X_{n_2+m}, \ldots, X_{n_k+m})$
- i.e., all joint CDFs or pdfs of the r.p. are not a function of the time shift (note: they can still change with the relative spacing in between)

88 Wide-sense stationarity (w.s.s.)
- $X_n$, $n \ge 1$, is wide-sense stationary if
  - $\mu_X(t) = \mu$, not a function of $t$; and
  - $R_X(t_1, t_2) = R_X(t_1 + t, t_2 + t)$ for all $t_1$, $t_2$, $t$; or equivalently $K_X(t_1, t_2) = K_X(t_1 + t, t_2 + t)$
  - i.e., $R_X(t_1, t_2)$ and $K_X(t_1, t_2)$ are functions of $(t_2 - t_1)$ only
- Facts:
  - (Strict) stationarity => w.s.s.
  - For a Gaussian r.p., w.s.s. => stationarity

89 Ergodicity
- Consider a r.p. $X_n$, $n \ge 1$
- Invariant event:
  - Consider a question about the process whose answer does not depend on whether we ask the question about $X_n$, $n \ge 1$, or about $X_{n+k}$, $n \ge 1$, for any $k \ge 1$
  - Each such question yields an event on which the answer is yes, and its complement, on which the answer is no
  - Such events are called invariant events
- A r.p. is ergodic if each invariant event has a probability of either 0 or 1

90 Birkhoff's strong ergodic theorem: a generalization of the strong law of large numbers
- Given a stationary and ergodic process $X_n$, $n \ge 1$, and a function $f(\cdot)$ that maps realizations of the process (i.e., $X_n(w)$, $n \ge 1$) to real numbers such that $E[|f(X_n, n \ge 1)|] < \infty$, then, with probability 1,
  $\lim_{m\to\infty} \frac{1}{m} \sum_{k=0}^{m-1} f(X_{n+k}, n \ge 1) = E[f(X_n, n \ge 1)]$
- An application: time averages converge to the ensemble average for a stationary and ergodic process
  - Define $f(X_n, n \ge 1) = X_1$; thus $f(X_{n+k}, n \ge 1) = X_{k+1}$; then
    $\lim_{m\to\infty} \frac{1}{m} \sum_{k=0}^{m-1} X_{k+1} = E[X_1]$,
    which is the expectation of any of the r.v.'s of the process

91 Summary
- Limits of real number sequences
- A fixed-point theorem
- Probability and random processes
  - Probability model, random variable
  - Expectations
  - Useful inequalities
  - Random process/sequence, convergence concepts
  - Laws of large numbers & central limit theorem
  - Stationarity and ergodicity

92 Homework #0 (total points: 150)
1. [60 points] Let $X_1, X_2, \ldots, X_n, \ldots$ be a sequence of independent identically distributed (IID) continuous random variables with the common probability density function $f_X(x)$; note that $P(X = \alpha) = 0$ for any $\alpha$ and that $P(X_1 = X_2) = 0$.
   a) Find $P(X_1 \le X_2)$. (Hint: give a numerical answer, not an expression; no computation is required and a one- or two-line explanation should be adequate.)
   b) Find $P(X_1 \le X_2;\ X_1 \le X_3)$ (in other words, find the probability that $X_1$ is the smallest of $X_1$, $X_2$, and $X_3$; again think --- do not compute).
   c) Let the random variable $N$ be the index of the first r.v. in the sequence to be less than $X_1$; that is, $P(N = n) = P(X_1 \le X_2,\ X_1 \le X_3,\ \ldots,\ X_1 \le X_{n-1},\ X_1 > X_n)$. Find $P(N > n)$ as a function of $n$. (Hint: generalize part b).)
   d) Show that $E[N] = \infty$.

93 Homework #0 (contd.)
2. [30 points] A computer system has n users, each with a unique name and password. Due to a software error, the n passwords are randomly permuted internally (i.e., each of the n! possible permutations is equally likely). Only those users lucky enough to have had their passwords unchanged in the permutation are able to continue using the system.
   a) What is the probability that a particular user, say user 1, is able to continue using the system?
   b) What is the expected number of users able to continue using the system? (Hint: let $X_i$ be a random variable with the value 1 if user i can use the system and 0 otherwise.)

94 Homework #0 (contd.)
3. [60 points] A town starts a mosquito control program, and we let the r.v. $Z_n$ be the number of mosquitos at the end of the $n$-th year ($n = 0, 1, 2, \ldots$). Let $X_n$ be the growth rate of mosquitos in year $n$, i.e., $Z_n = X_n Z_{n-1}$, $n \ge 1$. Assume that $\{X_n;\ n \ge 1\}$ is a sequence of IID random variables with the PMF $P(X = 2) = 1/2$, $P(X = 1/2) = 1/4$, and $P(X = 1/4) = 1/4$. Suppose that $Z_0$, the initial number of mosquitos, is some known constant, and assume for simplicity and consistency that $Z_n$ can take on non-integer values.
   a) Find $E[Z_n]$ as a function of $n$, and find $\lim_{n\to\infty} E[Z_n]$.
   b) Let $W_n = \log_2 X_n$. Find $E[W_n]$ and $E[\log_2(Z_n/Z_0)]$ as a function of $n$.
   c) Using the Strong Law of Large Numbers, show that $\lim_{n\to\infty} (1/n) \log_2(Z_n/Z_0) = \alpha$ with probability 1, and find the value of $\alpha$.
   d) Using c), show that $\lim_{n\to\infty} Z_n = \beta$ with probability 1 for some constant $\beta$, and evaluate $\beta$.


Week 2. Review of Probability, Random Variables and Univariate Distributions

Week 2. Review of Probability, Random Variables and Univariate Distributions Week 2 Review of Probability, Random Variables and Univariate Distributions Probability Probability Probability Motivation What use is Probability Theory? Probability models Basis for statistical inference

More information

MAT 271E Probability and Statistics

MAT 271E Probability and Statistics MAT 71E Probability and Statistics Spring 013 Instructor : Class Meets : Office Hours : Textbook : Supp. Text : İlker Bayram EEB 1103 ibayram@itu.edu.tr 13.30 1.30, Wednesday EEB 5303 10.00 1.00, Wednesday

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

Probability Theory and Statistics. Peter Jochumzen

Probability Theory and Statistics. Peter Jochumzen Probability Theory and Statistics Peter Jochumzen April 18, 2016 Contents 1 Probability Theory And Statistics 3 1.1 Experiment, Outcome and Event................................ 3 1.2 Probability............................................

More information

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems

More information

Review: mostly probability and some statistics

Review: mostly probability and some statistics Review: mostly probability and some statistics C2 1 Content robability (should know already) Axioms and properties Conditional probability and independence Law of Total probability and Bayes theorem Random

More information

Random Variables. Definition: A random variable (r.v.) X on the probability space (Ω, F, P) is a mapping

Random Variables. Definition: A random variable (r.v.) X on the probability space (Ω, F, P) is a mapping Random Variables Example: We roll a fair die 6 times. Suppose we are interested in the number of 5 s in the 6 rolls. Let X = number of 5 s. Then X could be 0, 1, 2, 3, 4, 5, 6. X = 0 corresponds to the

More information

Random Process Lecture 1. Fundamentals of Probability

Random Process Lecture 1. Fundamentals of Probability Random Process Lecture 1. Fundamentals of Probability Husheng Li Min Kao Department of Electrical Engineering and Computer Science University of Tennessee, Knoxville Spring, 2016 1/43 Outline 2/43 1 Syllabus

More information

P (A G) dp G P (A G)

P (A G) dp G P (A G) First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume

More information

Expectation. DS GA 1002 Statistical and Mathematical Models. Carlos Fernandez-Granda

Expectation. DS GA 1002 Statistical and Mathematical Models.   Carlos Fernandez-Granda Expectation DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall16 Carlos Fernandez-Granda Aim Describe random variables with a few numbers: mean, variance,

More information

Chapter 4 : Expectation and Moments

Chapter 4 : Expectation and Moments ECE5: Analysis of Random Signals Fall 06 Chapter 4 : Expectation and Moments Dr. Salim El Rouayheb Scribe: Serge Kas Hanna, Lu Liu Expected Value of a Random Variable Definition. The expected or average

More information