Chapter 3: Random Variables
Yunghsiang S. Han
Graduate Institute of Communication Engineering, National Taipei University, Taiwan
E-mail: yshan@mail.ntpu.edu.tw
(Modified from the lecture notes by Prof. Mao-Ching Chiu)
The Notion of a Random Variable

A random variable X is a function that assigns a real number, X(ζ), to each outcome ζ in the sample space.
Example: Toss a coin three times. S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}. Let X be the number of heads.

ζ:    HHH  HHT  HTH  THH  HTT  THT  TTH  TTT
X(ζ):  3    2    2    2    1    1    1    0

X is a random variable with S_X = {0, 1, 2, 3}.
S: sample space of a random experiment.
X: random variable, X : S → S_X. S_X is a new sample space.
Let B ⊂ S_X and A = {ζ : X(ζ) ∈ B}. Then
P[B] = P[A] = P[{ζ : X(ζ) ∈ B}].
A and B are equivalent events.
Cumulative Distribution Function

The cumulative distribution function (cdf) is
F_X(x) = P[X ≤ x] for −∞ < x < ∞.
In the underlying sample space,
F_X(x) = P[{ζ : X(ζ) ≤ x}].
F_X(x) is a function of the variable x.
Properties of the cdf:
1. 0 ≤ F_X(x) ≤ 1.
2. lim_{x→∞} F_X(x) = 1.
3. lim_{x→−∞} F_X(x) = 0.
4. If a < b, then F_X(a) ≤ F_X(b).
5. F_X(x) is continuous from the right, i.e., for h > 0,
   F_X(b) = lim_{h→0} F_X(b + h) = F_X(b⁺).
6. P[a < X ≤ b] = F_X(b) − F_X(a), since {X ≤ a} ∪ {a < X ≤ b} = {X ≤ b}.
7. P[X = b] = F_X(b) − F_X(b⁻).
8. P[X > x] = 1 − F_X(x).
Example: X is the number of heads in three tosses of a fair coin. Let δ be a small positive number. Then
F_X(1 − δ) = P[X ≤ 1 − δ] = P[0 heads] = 1/8,
F_X(1) = P[X ≤ 1] = P[0 or 1 heads] = 1/8 + 3/8 = 1/2,
F_X(1 + δ) = P[X ≤ 1 + δ] = P[0 or 1 heads] = 1/2.
Written in terms of the unit step function
u(x) = { 0, x < 0; 1, x ≥ 0 },
F_X(x) = (1/8)u(x) + (3/8)u(x − 1) + (3/8)u(x − 2) + (1/8)u(x − 3).
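The step-function form of this cdf is easy to check numerically. The following is a minimal sketch (not from the slides; the function names are illustrative):

```python
def u(x):
    """Unit step function: 0 for x < 0, 1 for x >= 0."""
    return 1.0 if x >= 0 else 0.0

def cdf_heads(x):
    """cdf of the number of heads in three fair coin tosses,
    written as a weighted sum of unit steps at 0, 1, 2, 3."""
    pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
    return sum(p * u(x - k) for k, p in pmf.items())

delta = 1e-9
print(cdf_heads(1 - delta))  # 0.125 = P[0 heads]
print(cdf_heads(1))          # 0.5   = P[0 or 1 heads]
print(cdf_heads(1 + delta))  # 0.5
```

The jump of 3/8 at x = 1 illustrates property 7: P[X = 1] = F_X(1) − F_X(1⁻).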
Example: The transmission time X of messages in a communication system obeys the exponential probability law with parameter λ, i.e.,
P[X > x] = e^{−λx}, x > 0.
Find the cdf of X and P[T < X ≤ 2T], where T = 1/λ.
F_X(x) = P[X ≤ x] = 1 − P[X > x] = { 0, x < 0; 1 − e^{−λx}, x ≥ 0 }.
P[T < X ≤ 2T] = F_X(2T) − F_X(T) = (1 − e^{−2}) − (1 − e^{−1}) = e^{−1} − e^{−2}.
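This probability is independent of λ, which a quick numerical sketch confirms (names are illustrative):

```python
import math

def exp_cdf(x, lam):
    """cdf of an exponential random variable with rate lam."""
    return 1 - math.exp(-lam * x) if x >= 0 else 0.0

lam = 2.0          # any positive rate; the answer below does not depend on it
T = 1 / lam
p = exp_cdf(2 * T, lam) - exp_cdf(T, lam)   # P[T < X <= 2T]
print(p)                                    # e^-1 - e^-2 ≈ 0.2325
```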
Types of random variables:
- Discrete random variable, S_X = {x_0, x_1, ...}:
  F_X(x) = Σ_k p_X(x_k) u(x − x_k),
  where x_k ∈ S_X and p_X(x_k) = P[X = x_k] is the probability mass function (pmf) of X.
- Continuous random variable:
  F_X(x) = ∫_{−∞}^{x} f(t) dt.
- Random variable of mixed type:
  F_X(x) = p F_1(x) + (1 − p) F_2(x).
Probability Density Function

The probability density function (pdf) of X, if it exists, is
f_X(x) = dF_X(x)/dx.
Properties of the pdf:
1. f_X(x) ≥ 0, due to the nondecreasing property of the cdf.
2. P[a ≤ X ≤ b] = ∫_a^b f_X(x) dx.
3. F_X(x) = ∫_{−∞}^x f_X(t) dt.
4. ∫_{−∞}^{∞} f_X(t) dt = 1.
The pdf of the uniform random variable:
f_X(x) = { 1/(b − a), a ≤ x ≤ b; 0, otherwise }
F_X(x) = { 0, x < a; (x − a)/(b − a), a ≤ x ≤ b; 1, x > b }
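Property 3 above (integrating the pdf recovers the cdf) can be checked for the uniform case with a simple midpoint-rule sum. A minimal sketch, with illustrative names:

```python
def uniform_pdf(x, a, b):
    """pdf of a uniform random variable on [a, b]."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

def uniform_cdf(x, a, b):
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

# Midpoint-rule approximation of the integral of the pdf from -inf (here a) to x
a, b, x = 2.0, 5.0, 4.0
n = 100000
h = (x - a) / n
approx = sum(uniform_pdf(a + (i + 0.5) * h, a, b) for i in range(n)) * h
print(approx, uniform_cdf(x, a, b))  # both 2/3
```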
Pdf for a discontinuous cdf:
Unit step function: u(x) = { 0, x < 0; 1, x ≥ 0 }.
Delta function δ(t): u(x) = ∫_{−∞}^x δ(t) dt.
Cdf for a discrete random variable:
F_X(x) = Σ_k p_X(x_k) u(x − x_k) = ∫_{−∞}^x f_X(t) dt,
where
f_X(x) = Σ_k p_X(x_k) δ(x − x_k).
Conditional cdf's and pdf's

Conditional cdf of X given A:
F_X(x | A) = P[{X ≤ x} ∩ A] / P[A], if P[A] > 0.
Conditional pdf of X given A:
f_X(x | A) = (d/dx) F_X(x | A).
Some Important Random Variables

Bernoulli random variable: Let A be an event. The indicator function for A is
I_A(ζ) = { 0, ζ ∉ A; 1, ζ ∈ A }.
I_A is the Bernoulli random variable. Ex: toss a coin.

Binomial random variable: Let X be the number of times an event A occurs in n independent trials. Let I_j be the indicator function for event A in the jth trial. Then
X = I_1 + I_2 + ⋯ + I_n
and
P[X = k] = (n choose k) p^k (1 − p)^{n−k},
where p = P[I_j = 1].
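The binomial pmf is straightforward to evaluate with `math.comb`; a small sketch (names are illustrative) that also checks the pmf sums to one:

```python
from math import comb

def binom_pmf(k, n, p):
    """P[X = k] for the number of successes in n independent Bernoulli(p) trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3
pmf = [binom_pmf(k, n, p) for k in range(n + 1)]
print(sum(pmf))              # 1.0 up to rounding: the pmf sums to one
print(binom_pmf(3, 3, 0.5))  # 0.125: three heads in three fair coin tosses
```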
Geometric random variable: Count the number M of independent Bernoulli trials until the first success of event A.
P[M = k] = (1 − p)^{k−1} p, k = 1, 2, ...,
where p = P[A]. Another version of the geometric random variable is
P[M = k] = (1 − p)^k p, k = 0, 1, 2, ....
Exponential random variable:
f_X(x) = { 0, x < 0; λe^{−λx}, x ≥ 0 }
F_X(x) = { 0, x < 0; 1 − e^{−λx}, x ≥ 0 }
λ: rate at which events occur.
The Poisson random variable: The pmf is
P[N = k] = (α^k / k!) e^{−α}, k = 0, 1, 2, ...,
where α is the average number of event occurrences in a specified time interval or region in space. The pmf of the Poisson random variable sums to one, since
Σ_{k=0}^{∞} (α^k / k!) e^{−α} = e^{−α} Σ_{k=0}^{∞} α^k / k! = e^{−α} e^{α} = 1.
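The normalization above, and the fact that α is the mean, can be checked by truncating the series at a large index (a numerical sketch; the truncation point 100 is arbitrary):

```python
import math

def poisson_pmf(k, alpha):
    """P[N = k] for a Poisson random variable with parameter alpha."""
    return alpha**k / math.factorial(k) * math.exp(-alpha)

alpha = 4.0
# Partial sums of the pmf approach 1, as the derivation on the slide shows
total = sum(poisson_pmf(k, alpha) for k in range(100))
mean = sum(k * poisson_pmf(k, alpha) for k in range(100))
print(total)  # ≈ 1.0
print(mean)   # ≈ 4.0 = alpha
```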
Gaussian (Normal) random variable: The pdf is
f_X(x) = (1/(√(2π) σ)) e^{−(x−m)²/2σ²}, −∞ < x < ∞,
where m and σ are real numbers. The cdf is
P[X ≤ x] = (1/(√(2π) σ)) ∫_{−∞}^{x} e^{−(t−m)²/2σ²} dt.
Change variables to t = (x − m)/σ and we have
F_X(x) = (1/√(2π)) ∫_{−∞}^{(x−m)/σ} e^{−t²/2} dt = Φ((x − m)/σ),
where
Φ(x) = (1/√(2π)) ∫_{−∞}^{x} e^{−t²/2} dt.
The Q-function is defined by
Q(x) = 1 − Φ(x) = (1/√(2π)) ∫_{x}^{∞} e^{−t²/2} dt.
The Q-function is the probability of the tail of the pdf. Q(0) = 1/2 and Q(−x) = 1 − Q(x). Q(x) can be obtained from look-up tables.
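In place of a look-up table, Q(x) can be computed from the complementary error function, using the identity Q(x) = (1/2) erfc(x/√2). A minimal sketch:

```python
import math

def Q(x):
    """Tail probability of the standard Gaussian: Q(x) = 1 - Phi(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

print(Q(0))                         # 0.5
print(abs(Q(-1.5) - (1 - Q(1.5))))  # ≈ 0, confirming Q(-x) = 1 - Q(x)
```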
Example: A communication system accepts a positive voltage V as input and outputs a voltage Y = αV + N, where α = 10⁻² and N is a Gaussian random variable with parameters m = 0 and σ = 2. Find the value of V that gives P[Y < 0] = 10⁻⁶.
Sol:
P[Y < 0] = P[αV + N < 0] = P[N < −αV] = Φ(−αV/σ) = Q(αV/σ) = 10⁻⁶.
From the Q-function table, we have αV/σ = 4.753. Thus, V = (4.753)σ/α = 950.6.
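The table lookup can be replaced by inverting Q numerically, e.g. by bisection (Q is strictly decreasing). A sketch, assuming the target probability in this example is 10⁻⁶; the helper names are illustrative:

```python
import math

def Q(x):
    """Standard Gaussian tail probability via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def Q_inverse(p, lo=0.0, hi=10.0, tol=1e-10):
    """Solve Q(x) = p for x by bisection; Q is decreasing on [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if Q(mid) > p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

alpha, sigma, target = 1e-2, 2.0, 1e-6
x = Q_inverse(target)   # close to the table value 4.753
V = x * sigma / alpha
print(x, V)
```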
Functions of a Random Variable

Let X be a random variable. Define another random variable Y = g(X).
Example: Let the function h(x) = (x)⁺ be defined as
(x)⁺ = { 0, x < 0; x, x ≥ 0 }.
Let B = {x : g(x) ∈ C}. The probability of event C is
P[Y ∈ C] = P[g(X) ∈ C] = P[X ∈ B].
Three types of equivalent events are useful in determining the cdf and pdf:
1. Discontinuity case: {g(X) = y_k};
2. cdf: {g(X) ≤ y};
3. pdf: {y < g(X) ≤ y + h}.
Example: Let X be a sample voltage of a speech waveform, and suppose that X has a uniform distribution in the interval [−4d, 4d]. Let Y = q(X), where the quantizer input-output characteristic is shown below. Find the pmf for Y.
Sol: The event {Y = q} for q in S_Y is equivalent to the event {X ∈ I_q}, where I_q is the interval of points mapped into the representation point q. The pmf of Y is
P[Y = q] = ∫_{I_q} f_X(t) dt = 1/8 for all q.
Example: Let the random variable Y be defined by Y = aX + b, where a is a nonzero constant. Suppose that X has cdf F_X(x); find F_Y(y).
Sol: {Y ≤ y} and A = {aX + b ≤ y} are equivalent events. If a > 0, then A = {X ≤ (y − b)/a}, and thus
F_Y(y) = P[X ≤ (y − b)/a] = F_X((y − b)/a), a > 0.
If a < 0, then A = {X ≥ (y − b)/a} and
F_Y(y) = P[X ≥ (y − b)/a] = 1 − F_X((y − b)/a).
Therefore, we have
f_Y(y) = { (1/a) f_X((y − b)/a), a > 0; −(1/a) f_X((y − b)/a), a < 0 }
       = (1/|a|) f_X((y − b)/a).
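The (1/|a|) f_X((y − b)/a) formula can be verified against a finite difference of the cdf of Y. A sketch using an exponential X with illustrative parameter values:

```python
import math

lam, a, b = 1.0, 2.0, 3.0   # X ~ exponential(lam), Y = a X + b

def f_X(x):
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def F_X(x):
    return 1 - math.exp(-lam * x) if x >= 0 else 0.0

def f_Y(y):
    """pdf of Y = aX + b from the formula (1/|a|) f_X((y - b)/a)."""
    return f_X((y - b) / a) / abs(a)

def F_Y(y):
    return F_X((y - b) / a)   # valid since a > 0 here

# Centered finite difference of F_Y should match the pdf formula
y, h = 5.0, 1e-6
numeric = (F_Y(y + h) - F_Y(y - h)) / (2 * h)
print(f_Y(y), numeric)        # both ≈ 0.5 e^{-1}
```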
Example: Let X be a Gaussian random variable with mean m and standard deviation σ:
f_X(x) = (1/(√(2π) σ)) e^{−(x−m)²/2σ²}, −∞ < x < ∞.
Let Y = aX + b. Find the pdf of Y.
Sol: From the previous example, we have
f_Y(y) = (1/(√(2π) |a| σ)) e^{−(y−b−am)²/2(aσ)²}.
Y also has a Gaussian distribution, with mean am + b and standard deviation |a|σ.
Example: Let the random variable Y be defined by Y = X², where X is a continuous random variable. Find the cdf and pdf of Y.
Sol: The event {Y ≤ y} occurs when {X² ≤ y}, or equivalently {−√y ≤ X ≤ √y}, for y nonnegative. The event is null when y is negative. Then
F_Y(y) = { 0, y < 0; F_X(√y) − F_X(−√y), y ≥ 0 }
and
f_Y(y) = f_X(√y)/(2√y) + f_X(−√y)/(2√y), y > 0.
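For a standard Gaussian X this gives the chi-square pdf with one degree of freedom, which can be checked against a finite difference of F_Y. A minimal sketch (the choice X standard Gaussian is an illustrative assumption):

```python
import math

def Phi(x):
    """Standard Gaussian cdf via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def phi(x):
    """Standard Gaussian pdf."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def F_Y(y):
    """cdf of Y = X^2: F_X(sqrt(y)) - F_X(-sqrt(y)) for y >= 0."""
    return Phi(math.sqrt(y)) - Phi(-math.sqrt(y)) if y >= 0 else 0.0

def f_Y(y):
    """pdf from the slide: f_X(sqrt(y))/(2 sqrt(y)) + f_X(-sqrt(y))/(2 sqrt(y))."""
    s = math.sqrt(y)
    return (phi(s) + phi(-s)) / (2 * s)

y, h = 1.5, 1e-6
numeric = (F_Y(y + h) - F_Y(y - h)) / (2 * h)
print(f_Y(y), numeric)   # the two values agree
```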
Consider Y = g(X) as shown below. Consider the event C_y = {y < Y < y + dy}, and let B_x be the equivalent event on the x-axis. As shown in the figure, g(x) = y has three solutions and
B_x = {x_1 < X < x_1 + dx_1} ∪ {x_2 < X < x_2 + dx_2} ∪ {x_3 < X < x_3 + dx_3}.
Thus,
P[C_y] = f_Y(y)|dy| = P[B_x] = f_X(x_1)|dx_1| + f_X(x_2)|dx_2| + f_X(x_3)|dx_3|.
In general, we have
f_Y(y) = Σ_k f_X(x)/|dy/dx| |_{x=x_k} = Σ_k f_X(x)|dx/dy| |_{x=x_k}.
Example: Let Y = X². For y ≥ 0, the equation y = x² has two solutions, x_0 = √y and x_1 = −√y. Since dy/dx = 2x, we have
f_Y(y) = f_X(√y)/(2√y) + f_X(−√y)/(2√y).

Example: Let Y = cos(X), where X is uniformly distributed in the interval (0, 2π]. Find the pdf of Y.
Sol: There are two solutions in the interval, x_0 = cos⁻¹(y) and x_1 = 2π − x_0.
|dy/dx| at x_0 = |−sin(x_0)| = |sin(cos⁻¹(y))| = √(1 − y²).
Since f_X(x) = 1/(2π),
f_Y(y) = 1/(2π√(1 − y²)) + 1/(2π√(1 − y²)) = 1/(π√(1 − y²)) for −1 < y < 1.
The cdf of Y is
F_Y(y) = { 0, y < −1; 1/2 + sin⁻¹(y)/π, −1 ≤ y ≤ 1; 1, y > 1 }.
(Figure: plot of f_Y(y) = 1/(π√(1 − y²)) versus y.)
Expected Value of Random Variables
The Expected Value of X

The expected value or mean of a random variable X is defined by
E[X] = ∫_{−∞}^{∞} t f_X(t) dt.
If X is a discrete random variable, then
E[X] = Σ_k x_k p_X(x_k).
Note that the defining integral or sum may not converge, in which case E[X] does not exist.
The mean for a uniform random variable between a and b is given by
E[X] = ∫_a^b t/(b − a) dt = (a + b)/2.
E[X] is the midpoint of the interval [a, b].
If the pdf of X is symmetric about a point m, then E[X] = m. That is, when f_X(m − x) = f_X(m + x), we have
0 = ∫_{−∞}^{+∞} (m − t) f_X(t) dt = m − ∫_{−∞}^{+∞} t f_X(t) dt.
The pdf of a Gaussian random variable is symmetric about x = m. Therefore, E[X] = m.
Exercise: Show that if X is a nonnegative random variable, then
E[X] = ∫_0^∞ (1 − F_X(t)) dt if X is continuous and nonnegative, and
E[X] = Σ_{k=0}^∞ P[X > k] if X is nonnegative and integer-valued.
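The first identity is easy to check numerically for an exponential random variable, where the tail P[X > t] = e^{−λt} and the mean is 1/λ (a sketch, not a proof of the exercise):

```python
import math

lam = 2.0   # X ~ exponential(lam), so E[X] = 1/lam

def tail(t):
    """1 - F_X(t) = P[X > t] for the exponential random variable."""
    return math.exp(-lam * t)

# Midpoint-rule approximation of the integral of the tail over [0, upper]
upper, n = 50.0, 200000
h = upper / n
integral = sum(tail((i + 0.5) * h) for i in range(n)) * h
print(integral)   # ≈ 0.5 = 1/lam
```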
Expected value of Y = g(X): Let Y = g(X), where X is a random variable with pdf f_X(x). Y is also a random variable. The mean of Y is
E[Y] = ∫_{−∞}^{+∞} g(x) f_X(x) dx.
Variance of X

The variance of the random variable X is defined by
VAR[X] = E[(X − E[X])²].
The standard deviation of X,
STD[X] = VAR[X]^{1/2},
is a measure of the spread of a distribution. Simplification:
VAR[X] = E[X² − 2E[X]X + E[X]²] = E[X²] − 2E[X]E[X] + E[X]² = E[X²] − E[X]².
Example: Find the variance of the random variable X that is uniformly distributed in the interval [a, b].
E[X] = (a + b)/2, and
VAR[X] = (1/(b − a)) ∫_a^b (x − (a + b)/2)² dx.
Let y = x − (a + b)/2. Then
VAR[X] = (1/(b − a)) ∫_{−(b−a)/2}^{(b−a)/2} y² dy = (b − a)²/12.
Example: Find the variance of a Gaussian random variable.
Multiply the integral of the pdf of X by √(2π)σ to obtain
∫_{−∞}^{+∞} e^{−(x−m)²/2σ²} dx = √(2π)σ.
Differentiate both sides with respect to σ to get
∫_{−∞}^{+∞} ((x − m)²/σ³) e^{−(x−m)²/2σ²} dx = √(2π).
Then
VAR[X] = (1/(√(2π)σ)) ∫_{−∞}^{+∞} (x − m)² e^{−(x−m)²/2σ²} dx = σ².
Properties: Let c be a constant. Then
VAR[c] = 0, VAR[X + c] = VAR[X], VAR[cX] = c² VAR[X].
The nth moment of the random variable X is given by
E[Xⁿ] = ∫_{−∞}^{+∞} xⁿ f_X(x) dx.
Markov and Chebyshev Inequalities

Markov inequality: Suppose X is a nonnegative random variable with mean E[X]. Then
P[X ≥ a] ≤ E[X]/a for X nonnegative,
since
E[X] = ∫_0^a t f_X(t) dt + ∫_a^∞ t f_X(t) dt ≥ ∫_a^∞ t f_X(t) dt ≥ ∫_a^∞ a f_X(t) dt = a P[X ≥ a].
Chebyshev inequality: Consider a random variable X with E[X] = m and VAR[X] = σ². Then
P[|X − m| ≥ a] ≤ σ²/a².
Proof: Let D² = (X − m)². The Markov inequality for D² gives
P[D² ≥ a²] ≤ E[(X − m)²]/a² = σ²/a².
{D² ≥ a²} and {|X − m| ≥ a} are equivalent events.
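The Chebyshev bound is often loose; comparing it with the exact tail probability for a uniform random variable on [0, 1] makes this concrete (a sketch with illustrative values of a):

```python
# X uniform on [0, 1]: m = 1/2, sigma^2 = 1/12
m, var = 0.5, 1.0 / 12

def exact_tail(a):
    """Exact P[|X - m| >= a] for X uniform on [0, 1]: 1 - 2a for a <= 1/2."""
    return max(0.0, 1.0 - 2 * a) if a <= 0.5 else 0.0

for a in (0.2, 0.3, 0.4):
    bound = var / a**2
    # The exact probability never exceeds the Chebyshev bound
    print(a, exact_tail(a), bound)
```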
Transform Methods

The Characteristic Function

The characteristic function of a random variable X is defined by
Φ_X(ω) = E[e^{jωX}] = ∫_{−∞}^{∞} f_X(x) e^{jωx} dx,
where j = √(−1) is the imaginary unit. Φ_X(ω) can be viewed as the expected value of a function of X, e^{jωX}.
Φ_X(ω) is the Fourier transform of the pdf f_X(x) with a reversal in the sign of the exponent. From the Fourier transform inversion formula we have
f_X(x) = (1/2π) ∫_{−∞}^{∞} Φ_X(ω) e^{−jωx} dω.
Example: The characteristic function for an exponentially distributed random variable with parameter λ is given by
Φ_X(ω) = ∫_0^∞ λe^{−λx} e^{jωx} dx = ∫_0^∞ λe^{−(λ−jω)x} dx = λ/(λ − jω).
If X is a discrete random variable, we have
Φ_X(ω) = Σ_k p_X(x_k) e^{jωx_k}.
If X is an integer-valued random variable, we have
Φ_X(ω) = Σ_{k=−∞}^{∞} p_X(k) e^{jωk}.
The above is the Fourier transform of the sequence p_X(k). It is a periodic function of ω with period 2π. By the inversion formula we have
p_X(k) = (1/2π) ∫_0^{2π} Φ_X(ω) e^{−jωk} dω, k = 0, ±1, ±2, ....
Example: The characteristic function for a geometric random variable is given by
Φ_X(ω) = Σ_{k=0}^∞ p q^k e^{jωk} = p Σ_{k=0}^∞ (q e^{jω})^k = p/(1 − q e^{jω}).
The moment theorem states that the moments of X are given by
E[Xⁿ] = (1/jⁿ) dⁿ/dωⁿ Φ_X(ω) |_{ω=0}.
Proof: First we expand e^{jωX} in a power series in the definition of Φ_X(ω):
Φ_X(ω) = ∫ f_X(x) {1 + jωx + (jωx)²/2! + ⋯} dx.
Assuming that all the moments of X are finite and that the series can be integrated term by term, we have
Φ_X(ω) = 1 + jωE[X] + (jω)²E[X²]/2! + ⋯ + (jω)ⁿE[Xⁿ]/n! + ⋯.
If we differentiate n times and evaluate at ω = 0, we have
dⁿ/dωⁿ Φ_X(ω) |_{ω=0} = jⁿ E[Xⁿ].
Example: To find the mean of an exponentially distributed random variable, we differentiate Φ_X(ω) = λ(λ − jω)⁻¹ once and obtain
Φ′_X(ω) = λj/(λ − jω)².
Then E[X] = Φ′_X(0)/j = 1/λ.
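The moment theorem can be checked numerically with complex arithmetic: differentiate the characteristic function at ω = 0 by a centered finite difference and divide by j. A sketch for the exponential case, with an illustrative rate:

```python
lam = 2.0

def char_fn(w):
    """Characteristic function of the exponential: lambda / (lambda - j w)."""
    return lam / (lam - 1j * w)

# Moment theorem: E[X] = (1/j) dPhi/dw at w = 0, here via finite difference
h = 1e-6
deriv = (char_fn(h) - char_fn(-h)) / (2 * h)
mean = (deriv / 1j).real
print(mean)   # ≈ 0.5 = 1/lambda
```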
The Probability Generating Function

The probability generating function G_N(z) of a nonnegative integer-valued random variable N is defined by
G_N(z) = E[z^N] = Σ_{k=0}^∞ p_N(k) z^k.
G_N(z) can be viewed as the expected value of a function of N, z^N.
G_N(z) is the z-transform of the pmf p_N(k) with a sign change in the exponent. Similar to the derivation of the moment theorem, we have
p_N(k) = (1/k!) d^k/dz^k G_N(z) |_{z=0}.
d/dz G_N(z) |_{z=1} = Σ_{k=0}^∞ p_N(k) k z^{k−1} |_{z=1} = Σ_{k=0}^∞ k p_N(k) = E[N].
d²/dz² G_N(z) |_{z=1} = Σ_{k=0}^∞ p_N(k) k(k−1) z^{k−2} |_{z=1} = Σ_{k=0}^∞ k(k−1) p_N(k) = E[N(N−1)] = E[N²] − E[N].
Thus, the mean and variance of N are given by
E[N] = G′_N(1)
and
VAR[N] = G″_N(1) + G′_N(1) − (G′_N(1))².
Example: The probability generating function for the Poisson random variable with parameter α is given by
G_N(z) = Σ_{k=0}^∞ (α^k/k!) e^{−α} z^k = e^{−α} Σ_{k=0}^∞ (αz)^k/k! = e^{−α} e^{αz} = e^{α(z−1)}.
The first two derivatives of G_N(z) are given by
G′_N(z) = α e^{α(z−1)} and G″_N(z) = α² e^{α(z−1)}.
Therefore, E[N] = α and VAR[N] = α² + α − α² = α.
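The mean and variance formulas via G′_N(1) and G″_N(1) can be checked by differentiating the Poisson pgf numerically (a sketch; the step size and α are illustrative):

```python
import math

alpha = 4.0

def G(z):
    """pgf of the Poisson random variable: e^{alpha (z - 1)}."""
    return math.exp(alpha * (z - 1))

h = 1e-5
G1 = (G(1 + h) - G(1 - h)) / (2 * h)            # G'(1)  = E[N]
G2 = (G(1 + h) - 2 * G(1) + G(1 - h)) / h**2    # G''(1) = E[N(N-1)]
variance = G2 + G1 - G1**2
print(G1, variance)   # both ≈ alpha = 4
```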
The Laplace Transform of the pdf

The Laplace transform of the pdf is given by
X*(s) = ∫_0^∞ f_X(x) e^{−sx} dx = E[e^{−sX}].
X*(s) can be viewed as an expected value of a function of X, e^{−sX}.
The moment theorem also holds for X*(s):
E[Xⁿ] = (−1)ⁿ dⁿ/dsⁿ X*(s) |_{s=0}.
A6523 Signal Modeling, Statistical Inference and Data Mining in Astrophysics Spring 2011 Reading Chapter 5 (continued) Lecture 8 Key points in probability CLT CLT examples Prior vs Likelihood Box & Tiao
More information1: PROBABILITY REVIEW
1: PROBABILITY REVIEW Marek Rutkowski School of Mathematics and Statistics University of Sydney Semester 2, 2016 M. Rutkowski (USydney) Slides 1: Probability Review 1 / 56 Outline We will review the following
More information27 Binary Arithmetic: An Application to Programming
27 Binary Arithmetic: An Application to Programming In the previous section we looked at the binomial distribution. The binomial distribution is essentially the mathematics of repeatedly flipping a coin
More informationIAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES
IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES VARIABLE Studying the behavior of random variables, and more importantly functions of random variables is essential for both the
More informationProbability Distributions for Discrete RV
An example: Assume we toss a coin 3 times and record the outcomes. Let X i be a random variable defined by { 1, if the i th outcome is Head; X i = 0, if the i th outcome is Tail; Let X be the random variable
More informationThings to remember when learning probability distributions:
SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions
More informationSDS 321: Introduction to Probability and Statistics
SDS 321: Introduction to Probability and Statistics Lecture 14: Continuous random variables Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin www.cs.cmu.edu/
More information3 Operations on One R.V. - Expectation
0402344 Engineering Dept.-JUST. EE360 Signal Analysis-Electrical - Electrical Engineering Department. EE360-Random Signal Analysis-Electrical Engineering Dept.-JUST. EE360-Random Signal Analysis-Electrical
More informationRandom Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline.
Random Variables Amappingthattransformstheeventstotherealline. Example 1. Toss a fair coin. Define a random variable X where X is 1 if head appears and X is if tail appears. P (X =)=1/2 P (X =1)=1/2 Example
More information2 Continuous Random Variables and their Distributions
Name: Discussion-5 1 Introduction - Continuous random variables have a range in the form of Interval on the real number line. Union of non-overlapping intervals on real line. - We also know that for any
More informationMathematics. ( : Focus on free Education) (Chapter 16) (Probability) (Class XI) Exercise 16.2
( : Focus on free Education) Exercise 16.2 Question 1: A die is rolled. Let E be the event die shows 4 and F be the event die shows even number. Are E and F mutually exclusive? Answer 1: When a die is
More informationTHE QUEEN S UNIVERSITY OF BELFAST
THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M
More informationChapter 1. Sets and probability. 1.3 Probability space
Random processes - Chapter 1. Sets and probability 1 Random processes Chapter 1. Sets and probability 1.3 Probability space 1.3 Probability space Random processes - Chapter 1. Sets and probability 2 Probability
More informationCS145: Probability & Computing
CS45: Probability & Computing Lecture 5: Concentration Inequalities, Law of Large Numbers, Central Limit Theorem Instructor: Eli Upfal Brown University Computer Science Figure credits: Bertsekas & Tsitsiklis,
More informationChapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued
Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional
More informationMath Spring Practice for the final Exam.
Math 4 - Spring 8 - Practice for the final Exam.. Let X, Y, Z be three independnet random variables uniformly distributed on [, ]. Let W := X + Y. Compute P(W t) for t. Honors: Compute the CDF function
More informationDiscrete Probability Refresher
ECE 1502 Information Theory Discrete Probability Refresher F. R. Kschischang Dept. of Electrical and Computer Engineering University of Toronto January 13, 1999 revised January 11, 2006 Probability theory
More informationS n = x + X 1 + X X n.
0 Lecture 0 0. Gambler Ruin Problem Let X be a payoff if a coin toss game such that P(X = ) = P(X = ) = /2. Suppose you start with x dollars and play the game n times. Let X,X 2,...,X n be payoffs in each
More information3. Review of Probability and Statistics
3. Review of Probability and Statistics ECE 830, Spring 2014 Probabilistic models will be used throughout the course to represent noise, errors, and uncertainty in signal processing problems. This lecture
More informationProbability reminders
CS246 Winter 204 Mining Massive Data Sets Probability reminders Sammy El Ghazzal selghazz@stanfordedu Disclaimer These notes may contain typos, mistakes or confusing points Please contact the author so
More informationChapter 1 Probability Theory
Review for the previous lecture Eample: how to calculate probabilities of events (especially for sampling with replacement) and the conditional probability Definition: conditional probability, statistically
More informationCME 106: Review Probability theory
: Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:
More informationThe random variable 1
The random variable 1 Contents 1. Definition 2. Distribution and density function 3. Specific random variables 4. Functions of one random variable 5. Mean and variance 2 The random variable A random variable
More informationECE 302: Probabilistic Methods in Electrical Engineering
ECE 302: Probabilistic Methods in Electrical Engineering Test I : Chapters 1 3 3/22/04, 7:30 PM Print Name: Read every question carefully and solve each problem in a legible and ordered manner. Make sure
More informationSTAT/MATH 395 A - PROBABILITY II UW Winter Quarter Moment functions. x r p X (x) (1) E[X r ] = x r f X (x) dx (2) (x E[X]) r p X (x) (3)
STAT/MATH 395 A - PROBABILITY II UW Winter Quarter 07 Néhémy Lim Moment functions Moments of a random variable Definition.. Let X be a rrv on probability space (Ω, A, P). For a given r N, E[X r ], if it
More informationWhy study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables
ECE 6010 Lecture 1 Introduction; Review of Random Variables Readings from G&S: Chapter 1. Section 2.1, Section 2.3, Section 2.4, Section 3.1, Section 3.2, Section 3.5, Section 4.1, Section 4.2, Section
More informationEEL 5544 Noise in Linear Systems Lecture 30. X (s) = E [ e sx] f X (x)e sx dx. Moments can be found from the Laplace transform as
L30-1 EEL 5544 Noise in Linear Systems Lecture 30 OTHER TRANSFORMS For a continuous, nonnegative RV X, the Laplace transform of X is X (s) = E [ e sx] = 0 f X (x)e sx dx. For a nonnegative RV, the Laplace
More informationStochastic processes Lecture 1: Multiple Random Variables Ch. 5
Stochastic processes Lecture : Multiple Random Variables Ch. 5 Dr. Ir. Richard C. Hendriks 26/04/8 Delft University of Technology Challenge the future Organization Plenary Lectures Book: R.D. Yates and
More informationRelationship between probability set function and random variable - 2 -
2.0 Random Variables A rat is selected at random from a cage and its sex is determined. The set of possible outcomes is female and male. Thus outcome space is S = {female, male} = {F, M}. If we let X be
More informationProbability theory. References:
Reasoning Under Uncertainty References: Probability theory Mathematical methods in artificial intelligence, Bender, Chapter 7. Expert systems: Principles and programming, g, Giarratano and Riley, pag.
More informationProbability, CLT, CLT counterexamples, Bayes. The PDF file of this lecture contains a full reference document on probability and random variables.
Lecture 5 A6523 Signal Modeling, Statistical Inference and Data Mining in Astrophysics Spring 2015 http://www.astro.cornell.edu/~cordes/a6523 Probability, CLT, CLT counterexamples, Bayes The PDF file of
More informationBasics of Stochastic Modeling: Part II
Basics of Stochastic Modeling: Part II Continuous Random Variables 1 Sandip Chakraborty Department of Computer Science and Engineering, INDIAN INSTITUTE OF TECHNOLOGY KHARAGPUR August 10, 2016 1 Reference
More informationEE514A Information Theory I Fall 2013
EE514A Information Theory I Fall 2013 K. Mohan, Prof. J. Bilmes University of Washington, Seattle Department of Electrical Engineering Fall Quarter, 2013 http://j.ee.washington.edu/~bilmes/classes/ee514a_fall_2013/
More information