General Random Variables
Probability — Chia-Ping Chen, Professor, Department of Computer Science and Engineering, National Sun Yat-sen University
A general random variable is discrete, continuous, or mixed. A discrete random variable has a countable image (of the sample space). A continuous random variable has an uncountable image, which is a continuous set of numbers. A mixed random variable has both discrete and continuous parts.
Continuous Random Variables
A continuous random variable arises from an uncountable sample space, e.g. when the underlying random experiment involves a measurement such as time, position, or the velocity of a vehicle on a highway.
Point vs. Neighborhood
The way to see a continuous random variable is to consider a small neighborhood rather than an isolated point. The event of a continuous random variable X taking a value in a neighborhood of x of length δ is denoted by X ∈ (x, δ).
Probability of an Infinitesimal Event
Random variable X is continuous. Consider P(X ∈ (x, δ)), the probability of the event X ∈ (x, δ). P(X ∈ (x, δ)) depends on x and is proportional to δ. Such a probability is specified by a probability density function (PDF) f_X evaluated at x:
P(X ∈ (x, δ)) = f_X(x) δ
Probability of an Event
Random variable X is continuous and defined on a probability model (Ω, F, P), with image I_X and PDF f_X. The probability of the event X ∈ B, where B is a countable union of intervals in I_X, is an integration:
P(X ∈ B) = ∫_B f_X(x) dx
This follows by summing infinitesimal probabilities: if B = ∪_i (x_i, δ_i), then P(X ∈ B) = Σ_i f_X(x_i) δ_i, which passes to the integral ∫_B f_X(x) dx.
Probability Based on a Continuous RV
Refer to Figure 3.1 and Figure 3.2. A probability model based on a continuous random variable X is completely specified by the image I_X and the PDF f_X. We represent such a model by (X, I_X, f_X).
Discrete RV vs. Continuous RV
For a discrete random variable X of (X, I_X, g_X), the image I_X is countable, and g_X = p_X is a probability mass function, used to assign probabilities to the atomic events X = x_i. For a continuous random variable X of (X, I_X, g_X), the image I_X is uncountable, and g_X = f_X is a probability density function, used to assign infinitesimal probabilities to infinitesimal events, which are extremely small intervals of non-zero length. Infinitesimal events constitute a countable partition of Ω. An event is a countable union of infinitesimal events, and the probability of an event is an integration of infinitesimal probabilities.
Properties of PDF
Random variable X has PDF f_X. Then we require
non-negativity: f_X ≥ 0
totality: ∫ f_X(x) dx = 1
Continuous Uniform Random Variable
Random variable X of (X, I_X, f_X) is uniform if f_X is constant over I_X. A continuous uniform random variable X with image I_X = [a, b] is denoted by X ~ uniform(a, b). The PDF of X is
f_X(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise.
Example 3.1
A wheel of fortune is fair and continuously calibrated between 0 and 1. What is the PDF of the result of a spin of the wheel? Assuming that the wheel is fair, we can model this experiment with a continuous uniform random variable X with PDF
f_X(x) = c for 0 ≤ x ≤ 1, and 0 otherwise.
The constant c is determined by ∫ f_X(x) dx = c ∫_0^1 dx = 1, so c = 1.
Example 3.2
Alvin's driving time to work is between 15 and 20 minutes on a sunny day, and between 20 and 25 minutes on a rainy day, with all times equally likely in each case. Assume that a day is sunny with probability 2/3 and rainy with probability 1/3. What is the PDF of the driving time, viewed as a random variable X? The PDF f_X is piecewise constant: f_X(x) = c_1 for 15 ≤ x ≤ 20 and f_X(x) = c_2 for 20 < x ≤ 25. From
P(sunny) = 2/3 = ∫_15^20 c_1 dx, P(rainy) = 1/3 = ∫_20^25 c_2 dx
we get c_1 = 2/15 and c_2 = 1/15.
Mean, Variance, Moments, etc.
E[X] = ∫ x f_X(x) dx
var(X) = E[(X − E[X])²] = E[X²] − (E[X])² = ∫ x² f_X(x) dx − (∫ x f_X(x) dx)²
E[X^n] = ∫ x^n f_X(x) dx
E[g(X)] = ∫ g(x) f_X(x) dx
E[aX + b] = a E[X] + b, var(aX + b) = a² var(X)
Example 3.4
Random variable X is continuous with X ~ uniform(a, b). What are the mean and variance of X?
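The answers are the standard uniform moments E[X] = (a + b)/2 and var(X) = (b − a)²/12. As a quick numerical check (a Python sketch, not part of the slides; the interval a = 2, b = 5 is an arbitrary choice), midpoint-rule integration of x·f_X(x) and x²·f_X(x) reproduces both closed forms:

```python
# Numerical check of the uniform(a, b) moments against the closed forms
# E[X] = (a + b) / 2 and var(X) = (b - a)^2 / 12.
a, b = 2.0, 5.0               # arbitrary interval for the check
n = 200_000                   # midpoint-rule subintervals
dx = (b - a) / n
f = 1.0 / (b - a)             # constant PDF on [a, b]
mean = 0.0
second = 0.0
for i in range(n):
    x = a + (i + 0.5) * dx    # midpoint of the i-th subinterval
    mean += x * f * dx
    second += x * x * f * dx
var = second - mean ** 2
print(mean, var)              # ≈ 3.5 and 0.75
```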
Exponential Random Variable
An exponential random variable T with parameter α > 0 has image I_T = {t : t ≥ 0} and PDF
f_T(t) = α e^(−αt) for t ≥ 0, and 0 for t < 0.
This is denoted by T ~ exponential(α). See Figure 3.5 for examples of exponential PDFs.
Mean and Variance
Random variable T is continuous with T ~ exponential(α). Integrating by parts,
E[T] = ∫_0^∞ t f_T(t) dt = ∫_0^∞ t α e^(−αt) dt = [−t e^(−αt)]_0^∞ + ∫_0^∞ e^(−αt) dt = 1/α
E[T²] = ∫_0^∞ t² f_T(t) dt = ∫_0^∞ t² α e^(−αt) dt = [−t² e^(−αt)]_0^∞ + ∫_0^∞ 2t e^(−αt) dt = 2/α²
var(T) = E[T²] − E²[T] = 2/α² − 1/α² = 1/α²
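As a sanity check on the integration by parts (a Python sketch, not from the slides; α = 0.5 is arbitrary and the integral is truncated at t = 60, where the tail is negligible), numerical integration reproduces E[T] = 1/α and var(T) = 1/α²:

```python
import math

# Numerical check that T ~ exponential(alpha) has E[T] = 1/alpha
# and var(T) = 1/alpha^2, via midpoint-rule integration.
alpha = 0.5
t_max, n = 60.0, 300_000      # e^(-0.5*60) ~ 1e-13: truncation is negligible
dt = t_max / n
m1 = m2 = 0.0
for i in range(n):
    t = (i + 0.5) * dt
    f = alpha * math.exp(-alpha * t)   # exponential PDF
    m1 += t * f * dt
    m2 += t * t * f * dt
var = m2 - m1 ** 2
print(m1, var)                # ≈ 2.0 and 4.0 for alpha = 0.5
```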
Example 3.5 Meteorite
The time until a small meteorite first lands anywhere in the Sahara desert is modeled as an exponential random variable with a mean of 10 days. The time is currently midnight. What is the probability that a meteorite first lands sometime between 6 a.m. and 6 p.m. of the first day?
T ~ exponential(α), E[T] = 10 = 1/α, so α = 1/10. Measuring time in days, the window is [1/4, 3/4], and
P(1/4 ≤ T ≤ 3/4) = ∫_{1/4}^{3/4} f_T(t) dt = ∫_{1/4}^{3/4} α e^(−αt) dt = [−e^(−αt)]_{1/4}^{3/4} = e^(−1/40) − e^(−3/40)
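Evaluating the answer numerically (a Python sketch; the CDF form 1 − e^(−αt) used here is derived on a later slide):

```python
import math

# Example 3.5 evaluated numerically: alpha = 1/10 per day, and the window
# 6 a.m. to 6 p.m. of the first day is [1/4, 3/4] in days.
alpha = 1 / 10

def exp_cdf(t: float) -> float:
    """CDF of exponential(alpha) for t >= 0."""
    return 1 - math.exp(-alpha * t)

p = exp_cdf(3 / 4) - exp_cdf(1 / 4)   # = e^(-1/40) - e^(-3/40)
print(round(p, 4))                    # ≈ 0.0476
```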
Cumulative Distribution Functions
Definition
By definition, the cumulative distribution function (CDF) of a random variable X is F_X(x) = P(X ≤ x).
If X is discrete, F_X(x) = Σ_{x_i ∈ I_X, x_i ≤ x} p_X(x_i).
If X is continuous, F_X(x) = ∫_{−∞}^x f_X(t) dt.
See Figure 3.6 and Figure 3.7.
Properties of CDF
non-decreasing: x_1 ≤ x_2 implies F_X(x_1) ≤ F_X(x_2)
limits: lim_{x→−∞} F_X(x) = 0, lim_{x→∞} F_X(x) = 1
CDF and PDF: F_X(x) = ∫_{−∞}^x f_X(t) dt, f_X = dF_X/dx
CDF and PMF: F_X(x) = Σ_{x_i ≤ x} p_X(x_i), p_X(x_k) = F_X(x_k) − F_X(x_{k−1})
Example 3.6
Take 3 tests; the final score is the maximum of the test scores. Assume that the score in each test is from 1 to 10, each with probability 1/10, independent of the scores in the other tests. What is the PMF of the final score X?
Let random variable X_i be the score for test i. Then X = max(X_1, X_2, X_3) and
(X ≤ x) = (X_1 ≤ x) ∩ (X_2 ≤ x) ∩ (X_3 ≤ x)
F_X(x) = F_{X_1}(x) F_{X_2}(x) F_{X_3}(x) = (x/10)³
p_X(k) = F_X(k) − F_X(k − 1) = (k/10)³ − ((k − 1)/10)³
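The CDF shortcut can be cross-checked by brute-force enumeration of all 10³ equally likely score triples (a Python sketch, not part of the slides):

```python
from itertools import product

# PMF of X = max(X1, X2, X3) via the CDF trick F_X(k) = (k/10)^3 ...
pmf = {k: (k / 10) ** 3 - ((k - 1) / 10) ** 3 for k in range(1, 11)}

# ... cross-checked by enumerating all 1000 equally likely triples.
counts = {k: 0 for k in range(1, 11)}
for scores in product(range(1, 11), repeat=3):
    counts[max(scores)] += 1
brute = {k: counts[k] / 1000 for k in range(1, 11)}

print(pmf[10], brute[10])   # both ≈ 0.271: the top score is quite likely
```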
CDF of a Geometric Random Variable
Random variable X is discrete with X ~ geometric(p). For x ≥ 0, the CDF of X is
F_X(x) = Σ_{k=1}^{⌊x⌋} p_X(k) = 1 − (1 − p)^⌊x⌋
At the positive integers, F_X(n) = 1 − (1 − p)^n.
CDF of an Exponential Random Variable
Random variable Y is continuous with Y ~ exponential(α). For y ≥ 0, the CDF of Y is
F_Y(y) = ∫_{−∞}^y f_Y(t) dt = ∫_0^y α e^(−αt) dt = 1 − e^(−αy)
At the positive integers, F_Y(n) = 1 − (e^(−α))^n.
Exponential and Geometric
An exponential random variable, which is continuous, can be approximated by a geometric random variable, which is discrete. Let Y ~ exponential(α) and X ~ geometric(p). If we set p so that
1 − p = e^(−α), or equivalently p = 1 − e^(−α),
then F_Y(n) = F_X(n) at the positive integers. Agreement between F_Y and F_X only at the positive integers is somewhat coarse.
Finer Approximation
Refer to Figure 3.8. For a finer approximation, we introduce a granularity parameter δ and ask the CDFs to agree at every multiple of δ. Specifically, we define the discrete random variable W = δX and require
F_Y(nδ) = F_W(nδ), n = 1, 2, ...
Since (W ≤ nδ) = (X ≤ n), this is equivalent to F_Y(nδ) = F_W(nδ) = F_X(n), i.e.
1 − e^(−αnδ) = 1 − (1 − p)^n
Thus we set 1 − p = e^(−αδ), i.e. p = 1 − e^(−αδ).
In summary, Y ~ exponential(α) is approximated by W = δX, where X ~ geometric(p) is the number of the first success in a sequence of independent Bernoulli trials conducted every δ, each with success probability p = 1 − e^(−αδ).
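The grid agreement can be checked directly (a Python sketch; the values of α and δ are arbitrary):

```python
import math

# With p = 1 - e^(-alpha*delta), the CDF of W = delta * X (X geometric(p))
# matches the exponential(alpha) CDF at every grid point n*delta.
alpha, delta = 1.3, 0.01
p = 1 - math.exp(-alpha * delta)
for n in (1, 10, 100, 500):
    F_Y = 1 - math.exp(-alpha * n * delta)   # exponential CDF at n*delta
    F_W = 1 - (1 - p) ** n                   # CDF of W at n*delta (= F_X(n))
    assert abs(F_Y - F_W) < 1e-12            # equal up to rounding
print("CDFs agree on the delta-grid")
```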
Normal Random Variables (Gaussians)
Parameters in PDF
A normal random variable X with parameters µ and σ², denoted by X ~ N(µ, σ²), has image I_X = R and PDF
f_X(x) = (1/(√(2π) σ)) e^(−(x−µ)²/(2σ²))
The PDF has a single peak at µ; a large σ² gives a short, fat PDF. See Figure 3.9.
Properties of Normal Random Variables
Random variable X is normal, with X ~ N(µ, σ²). Then
∫ f_X(x) dx = 1, E[X] = µ, var(X) = σ²
Proving these results is a good exercise in calculus.
Standard Normal
Random variable Y is a standard normal random variable if Y is normal with zero mean and unit variance, that is, Y ~ N(0, 1). The CDF of Y is specifically denoted by Φ(y):
Φ(y) = P(Y ≤ y) = (1/√(2π)) ∫_{−∞}^y e^(−t²/2) dt
The values of Φ(y) for y > 0 are listed in a table. For y < 0, by symmetry,
Φ(y) = P(Y ≤ y) = P(Y ≥ −y) = 1 − Φ(−y)
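In code, the table lookup is usually replaced by the error function, via the standard identity Φ(y) = (1 + erf(y/√2))/2 (a Python sketch, not part of the slides):

```python
import math

# Standard normal CDF via the error function:
# Phi(y) = (1 + erf(y / sqrt(2))) / 2.
def phi(y: float) -> float:
    return 0.5 * (1.0 + math.erf(y / math.sqrt(2.0)))

print(round(phi(1.0), 4))                        # 0.8413, the usual table value
assert phi(0.0) == 0.5                           # symmetric about 0
assert abs(phi(-1.0) - (1 - phi(1.0))) < 1e-15   # Phi(-y) = 1 - Phi(y)
```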
Normal and Standard Normal
There is an easy conversion between a normal RV and a standard normal RV. If random variable Y is standard normal, then X = σY + µ is a normal random variable with mean µ and variance σ². Conversely, if random variable X is normal with mean µ and variance σ², then Y = (X − µ)/σ is a standard normal random variable.
CDF of a Normal Random Variable
By a simple linear transformation, the CDF of a normal RV is related to the CDF of a standard normal RV. Random variable X is normal with X ~ N(µ, σ²). Then
P(X ≤ x) = P((X − µ)/σ ≤ (x − µ)/σ) = P(Y ≤ (x − µ)/σ) = Φ((x − µ)/σ)
where random variable Y is standard normal.
Example 3.7
The yearly snowfall at Mountain Rainier is modeled as a normal random variable with a mean of µ = 60 inches and a standard deviation of σ = 20. What is the probability that this year's snowfall will be at least 80 inches? Let random variable X be the snowfall this year. Then
P(X ≥ 80) = 1 − P(X < 80) = 1 − P((X − 60)/20 < 1) = 1 − P(Y < 1) = 1 − Φ(1) = 1 − 0.8413 = 0.1587
Example 3.8 Signal Detection
A binary message is transmitted as a signal S, which is either −1 or +1. The channel corrupts the transmission with an additive normal noise N with mean 0 and variance σ². The receiver receives Y = S + N and decides that S = −1 if Y < 0, and that S = +1 if Y ≥ 0. What is the probability of the error event E?
P(E) = P(E ∩ (S = 1)) + P(E ∩ (S = −1))
= P(E | S = 1) P(S = 1) + P(E | S = −1) P(S = −1)
= P(Y < 0 | S = 1) P(S = 1) + P(Y ≥ 0 | S = −1) P(S = −1)
= P(N < −1) P(S = 1) + P(N ≥ 1) P(S = −1)
= P(N ≥ 1), by the symmetry of N about 0
= 1 − P(N < 1) = 1 − P((N − 0)/σ < (1 − 0)/σ) = 1 − Φ(1/σ)
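The closed form can be compared with a simulation of the channel (a Python sketch, not part of the slides; σ = 2 and equal priors are arbitrary choices, and by symmetry the answer does not depend on the priors):

```python
import math, random

# Example 3.8 by simulation: compare the empirical error rate of the
# sign detector with the closed form 1 - Phi(1/sigma).
def phi(y: float) -> float:
    return 0.5 * (1.0 + math.erf(y / math.sqrt(2.0)))

sigma = 2.0                        # arbitrary noise level for the check
exact = 1 - phi(1 / sigma)

random.seed(0)
trials, errors = 200_000, 0
for _ in range(trials):
    s = 1 if random.random() < 0.5 else -1   # equally likely signals
    y = s + random.gauss(0.0, sigma)         # received value
    s_hat = 1 if y >= 0 else -1              # sign detector
    errors += (s_hat != s)
print(exact, errors / trials)      # both ≈ 0.309
```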
Multiple Continuous Random Variables
Joint Probability Density Function
Random variables X and Y are continuous and defined on a model (Ω, F, P). We assign probability to the infinitesimal event (X ∈ (x, δ_x)) ∩ (Y ∈ (y, δ_y)). The assignment is through a joint probability density function (joint PDF) f_XY:
P((X ∈ (x, δ_x)) ∩ (Y ∈ (y, δ_y))) = f_XY(x, y) δ_x δ_y
The probability model of X and Y is completely specified by the image I_XY and the joint PDF f_XY, so it can be represented by ((X, Y), I_XY, f_XY).
The Probability of an Event
Random variables X and Y are continuous. The probability of the event (X, Y) ∈ B is a double integration of the joint PDF of X and Y:
P((X, Y) ∈ B) = ∬_B f_XY(x, y) dx dy
Properties of Joint PDF
Random variables X and Y are continuous. The joint PDF of X and Y must satisfy
non-negativity: f_XY ≥ 0
totality: ∬ f_XY(x, y) dx dy = 1
Marginalization
Random variables X and Y are continuous. The joint PDF f_XY contains complete information about the probabilities of X and Y. The marginal PDFs f_X and f_Y can be derived from the joint PDF f_XY. The probability of the event X ∈ A is
P(X ∈ A) = P((X ∈ A) ∩ (Y ∈ (−∞, ∞))) = ∫_A ∫ f_XY(x, y) dy dx = ∫_A f_X(x) dx
so f_X(x) = ∫ f_XY(x, y) dy. Similarly, f_Y(y) = ∫ f_XY(x, y) dx.
Example 3.9 Uniform PDF
Random variables X and Y are the arrival times of Romeo and Juliet. Assume them to be continuous and uniform. What is the joint PDF? Since there is no preference for any time, the joint PDF is constant:
f_XY(x, y) = c if 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, and 0 otherwise.
c is determined by ∬ f_XY(x, y) dx dy = 1, so c = 1.
Example 3.10
Random variables X and Y are continuous. The joint PDF of X and Y is a constant c on S, as shown in Figure 3.12, and is zero outside S. Determine the value of c and the marginal f_X.
f_XY(x, y) = c if (x, y) ∈ S, and 0 otherwise.
From ∬ f_XY dx dy = 1 we get c = 1/4, and
f_X(x) = ∫ f_XY(x, y) dy = 3/4 for 1 ≤ x ≤ 2, 1/4 for 2 ≤ x ≤ 3, and 0 otherwise.
Example 3.11 Buffon's Needle
A surface is ruled with parallel lines, which are at distance d from each other. A needle of length l is thrown on the surface at random. Assume that l < d. What is the probability that the needle will intersect one of the lines?
Refer to the figure. Random variable X is the distance from the center of the needle to the nearest line, and Θ is the acute angle between the needle and the lines. X and Θ are independent and uniform, so the joint PDF f_XΘ is uniform:
f_XΘ(x, θ) = f_X(x) f_Θ(θ) = (2/d)(2/π) = 4/(πd), for 0 ≤ x ≤ d/2, 0 ≤ θ ≤ π/2
The needle crosses a line exactly when X ≤ (l/2) sin Θ, so
P(needle crossing a line) = P(X ≤ (l/2) sin Θ) = ∬_{x ≤ (l/2) sin θ} f_XΘ(x, θ) dx dθ
= ∫_0^{π/2} ∫_0^{(l/2) sin θ} (4/(πd)) dx dθ = (2l/(πd)) ∫_0^{π/2} sin θ dθ = 2l/(πd)
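Buffon's needle is also a classic Monte Carlo exercise; simulating (X, Θ) directly reproduces 2l/(πd) (a Python sketch, not part of the slides; d, l, and the seed are arbitrary):

```python
import math, random

# Monte Carlo estimate of the crossing probability, compared with 2l/(pi*d).
random.seed(1)
d, l = 1.0, 0.6                  # line spacing and needle length, l < d
trials, hits = 200_000, 0
for _ in range(trials):
    x = random.uniform(0.0, d / 2)             # center-to-nearest-line distance
    theta = random.uniform(0.0, math.pi / 2)   # acute angle with the lines
    hits += (x <= (l / 2) * math.sin(theta))   # crossing condition
estimate = hits / trials
exact = 2 * l / (math.pi * d)
print(exact, estimate)           # both ≈ 0.382
```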
Conditional Probability
Conditioning on an Event
Random variable X is continuous and event A is non-null. The conditional probability of the event X ∈ (x, δ) given A is proportional to δ and depends on x, so it can be written as
P(X ∈ (x, δ) | A) = f_{X|A}(x) δ
We call f_{X|A} the conditional probability density function (conditional PDF) of X given A. The conditional probability of X ∈ B given A is an integration of the conditional PDF:
P(X ∈ B | A) = ∫_B f_{X|A}(x) dx
Conditioning on X ∈ C
Conditioning on the event X ∈ C, we have
P(X ∈ (x, δ) | X ∈ C) = P((X ∈ (x, δ)) ∩ (X ∈ C)) / P(X ∈ C)
= f_X(x) δ / P(X ∈ C) if (x, δ) ⊂ C, and 0 otherwise
= f_{X|{X∈C}}(x) δ
so that
f_{X|{X∈C}}(x) = f_X(x) / P(X ∈ C) if x ∈ C, and 0 otherwise.
That is, f_{X|{X∈C}} either magnifies f_X by 1/P(X ∈ C) or vanishes, depending on whether x ∈ C. See the figure.
Example 3.13 A Memoryless Random Variable
The time T until a new light bulb burns out is an exponential random variable with parameter α. Alice turns the light on, leaves the room, and when she returns, t time units later, finds that the light bulb is still on, which corresponds to the event A = (T > t). Let X be the additional time until the light bulb eventually burns out. What is the conditional CDF of X given A?
P(X ≤ x | A) = P(T ≤ t + x | T > t) = P((T ≤ t + x) ∩ (T > t)) / P(T > t)
= P(t < T ≤ t + x) / P(T > t) = (e^(−αt) − e^(−α(t+x))) / e^(−αt) = 1 − e^(−αx)
Note that P(X ≤ x | T > t) does not depend on t: the exponential random variable is memoryless.
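Memorylessness can be checked numerically from the CDF alone (a Python sketch; α, x, and the t values are arbitrary):

```python
import math

# P(T <= t + x | T > t) computed from the exponential CDF equals
# 1 - e^(-alpha*x), the same for every t: the elapsed time is "forgotten".
alpha, x = 0.7, 2.0
F = lambda t: 1 - math.exp(-alpha * t)        # CDF of exponential(alpha)
target = 1 - math.exp(-alpha * x)
for t in (0.5, 3.0, 10.0):
    cond = (F(t + x) - F(t)) / (1 - F(t))     # P(t < T <= t+x) / P(T > t)
    assert abs(cond - target) < 1e-9          # same value for every t
print("conditional CDF at x is", round(target, 4), "for every t")
```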
Total Probability Theorem
Random variable X is continuous and defined on (Ω, F, P), and {A_1, ..., A_n} is a partition of Ω. Then
f_X(x) = Σ_{i=1}^n P(A_i) f_{X|A_i}(x)
Proof:
F_X(x) = P(X ≤ x) = Σ_{i=1}^n P(A_i) P(X ≤ x | A_i) = Σ_{i=1}^n P(A_i) ∫_{−∞}^x f_{X|A_i}(x') dx'
= ∫_{−∞}^x Σ_{i=1}^n P(A_i) f_{X|A_i}(x') dx' = ∫_{−∞}^x f_X(x') dx'
Differentiating gives f_X(x) = Σ_{i=1}^n P(A_i) f_{X|A_i}(x).
Example 3.14 Waiting Time
A train arrives at a station every quarter hour starting at 6:00 am. You walk into the station between 7:10 am and 7:30 am, and your arrival time is uniform over this interval. What is the PDF of the time you have to wait for the first train to arrive?
Let A = {catch the 7:15 train} and let random variable X be the waiting time.
P(A) = 1/4, f_{X|A}(x) = 1/5 for 0 ≤ x ≤ 5
P(A^c) = 3/4, f_{X|A^c}(x) = 1/15 for 0 ≤ x ≤ 15
By the total probability theorem,
f_X(x) = P(A) f_{X|A}(x) + P(A^c) f_{X|A^c}(x) = 1/20 + 1/20 = 1/10 for 0 ≤ x ≤ 5, and 1/20 for 5 < x ≤ 15.
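Assembling f_X in code and checking that it integrates to 1 (a Python sketch, not part of the slides):

```python
# Example 3.14: the waiting-time PDF assembled by total probability.
def f_X(x: float) -> float:
    pA, pAc = 1 / 4, 3 / 4                    # catch the 7:15 vs. 7:30 train
    f_A = 1 / 5 if 0 <= x <= 5 else 0.0       # uniform wait on [0, 5]
    f_Ac = 1 / 15 if 0 <= x <= 15 else 0.0    # uniform wait on [0, 15]
    return pA * f_A + pAc * f_Ac

print(f_X(2.0))    # 1/10 on [0, 5]
print(f_X(8.0))    # 1/20 on (5, 15]

# sanity check: the PDF integrates to 1 (midpoint rule on [0, 15])
n, hi = 150_000, 15.0
dx = hi / n
total = sum(f_X((i + 0.5) * dx) * dx for i in range(n))
print(round(total, 6))   # 1.0
```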
Conditioning on a Continuous Random Variable
Random variables X and Y are continuous and defined on (Ω, F, P). The conditional probability of the event X ∈ (x, δ_x) given the event Y ∈ (y, δ_y) is
P(X ∈ (x, δ_x) | Y ∈ (y, δ_y)) = P((X ∈ (x, δ_x)) ∩ (Y ∈ (y, δ_y))) / P(Y ∈ (y, δ_y))
= f_XY(x, y) δ_x δ_y / (f_Y(y) δ_y) = (f_XY(x, y) / f_Y(y)) δ_x
It does not depend on δ_y and is proportional to δ_x, so we write
P(X ∈ (x, δ_x) | Y ∈ (y, δ_y)) = P(X ∈ (x, δ_x) | Y = y) = f_{X|Y}(x|y) δ_x
where f_{X|Y}(x|y), called the conditional probability density function (conditional PDF), specifies the probability density of X at x when Y = y.
Chain Rule
Equating the expressions for P(X ∈ (x, δ_x) | Y ∈ (y, δ_y)), we get
f_{X|Y} = f_XY / f_Y
We have shown that the marginal PDF f_Y can be derived from the joint PDF f_XY; thus the conditional PDF f_{X|Y} can also be derived from the joint PDF f_XY. Conversely, since
f_XY = f_{X|Y} f_Y
we can derive f_XY given f_Y and f_{X|Y} (or given f_X and f_{Y|X}).
Example 3.15 Darting
Ben throws a dart at a circular target of radius r. The random point of impact is (X, Y). We assume that he always hits the target, and that all points of impact are equally likely, so f_XY(x, y) = 1/(πr²) on the disk. What is the conditional PDF f_{X|Y}?
f_Y(y) = ∫ f_XY(x, y) dx = ∫_{−√(r²−y²)}^{√(r²−y²)} (1/(πr²)) dx = 2√(r² − y²) / (πr²)
f_{X|Y}(x|y) = f_XY(x, y) / f_Y(y) = 1 / (2√(r² − y²)), for |x| ≤ √(r² − y²)
Given Y = y, X is uniform over the chord at height y.
Example 3.16 Police Radar
The speed of a vehicle that drives past a police radar is modeled as an exponential random variable X with mean 50 miles per hour. The police radar's measurement Y of the vehicle's speed has an error which is modeled as a normal random variable with zero mean and standard deviation equal to one tenth of the vehicle's speed. What is the joint PDF of X and Y?
f_XY(x, y) = f_X(x) f_{Y|X}(y|x) = α e^(−αx) · (1/(√(2π) σ_{Y|X})) e^(−(y − µ_{Y|X})²/(2σ²_{Y|X}))
= (1/50) e^(−x/50) · (1/(√(2π) (x/10))) e^(−(y−x)²/(2(x/10)²))
Conditional Expectation
Random variable X is continuous and defined on (Ω, F, P).
1. Conditioning on an event: A is a non-null event. The conditional expectation of X given A is
E[X | A] = ∫ x f_{X|A}(x) dx
2. Conditioning on a random variable: Y is continuous and defined on (Ω, F, P). The conditional expectation of X given Y = y is
E[X | Y = y] = ∫ x f_{X|Y}(x|y) dx
Total Expectation
Random variables X and Y are continuous. Then
E[X] = ∫ E[X | Y = y] f_Y(y) dy
Proof:
E[X] = ∫ x f_X(x) dx = ∫ x ∫ f_XY(x, y) dy dx = ∫ x ∫ f_{X|Y}(x|y) f_Y(y) dy dx
= ∫ (∫ x f_{X|Y}(x|y) dx) f_Y(y) dy = ∫ E[X | Y = y] f_Y(y) dy
Total Expectation Theorem
Random variable X is continuous and defined on (Ω, F, P), and {A_1, ..., A_n} is a partition of Ω with P(A_i) > 0. Then
E[X] = Σ_{i=1}^n P(A_i) E[X | A_i]
Proof:
E[X] = ∫ x f_X(x) dx = ∫ x Σ_{i=1}^n P(A_i) f_{X|A_i}(x) dx = Σ_{i=1}^n P(A_i) ∫ x f_{X|A_i}(x) dx = Σ_{i=1}^n P(A_i) E[X | A_i]
Example 3.17
Random variable X is continuous with image I_X = [0, 2] and PDF
f_X(x) = 1/3 for 0 ≤ x ≤ 1, 2/3 for 1 < x ≤ 2, and 0 otherwise.
Find E[X] and var(X) via total expectation with the partition {A_1 = (X ∈ [0, 1]), A_2 = (X ∈ (1, 2])}.
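Working the example out (an arithmetic sketch in Python, not on the slide; conditioned on each piece the PDF is constant, so X is uniform there and the conditional moments are those of uniform(0, 1) and uniform(1, 2)):

```python
# Example 3.17 via total expectation: A1 = (X in [0,1]), A2 = (X in (1,2]).
pA1, pA2 = 1 / 3, 2 / 3      # P(A1) = ∫_0^1 (1/3) dx, P(A2) = ∫_1^2 (2/3) dx
m1, m2 = 1 / 2, 3 / 2        # E[X|Ai]: means of uniform(0,1) and uniform(1,2)
q1, q2 = 1 / 3, 7 / 3        # E[X^2|Ai] = ∫ x^2 dx over each unit interval
EX = pA1 * m1 + pA2 * m2     # total expectation theorem
EX2 = pA1 * q1 + pA2 * q2
var = EX2 - EX ** 2
print(EX, var)               # 7/6 ≈ 1.1667 and 11/36 ≈ 0.3056
```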
Independent Random Variables
Random variables X and Y are continuous. They are independent if
f_XY(x, y) = f_X(x) f_Y(y)
This is denoted by X ⊥ Y. Suppose X ⊥ Y. Then for any y ∈ I_Y, f_{X|Y}(x|y) = f_X(x), and for any x ∈ I_X, f_{Y|X}(y|x) = f_Y(y).
Example 3.18 Independent Gaussians
Random variables X and Y are normal with means µ_x, µ_y and variances σ²_x, σ²_y, respectively, with X ⊥ Y. What is the joint PDF of X and Y?
f_XY(x, y) = f_X(x) f_Y(y) = (1/(√(2π) σ_x)) e^(−(x−µ_x)²/(2σ²_x)) · (1/(√(2π) σ_y)) e^(−(y−µ_y)²/(2σ²_y))
Bayes Rule
Inference Problem
Random variable X has PDF f_X, and Y has conditional PDF f_{Y|X}. Determining f_{X|Y} is called the inference problem. Estimating X given Y is a decoding problem. A common decoding criterion is
x̂ = arg max_x f_{X|Y}(x|y)
The decoding-error event is E = (X̂ ≠ X).
Two Continuous Random Variables
Random variables X and Y are continuous. Given f_X and f_{Y|X}, the conditional PDF of X given Y is
f_{X|Y}(x|y) = f_X(x) f_{Y|X}(y|x) / ∫ f_X(x') f_{Y|X}(y|x') dx'
This follows from
joint PDF: f_XY(x, y) = f_X(x) f_{Y|X}(y|x)
marginal PDF: f_Y(y) = ∫ f_XY(x', y) dx' = ∫ f_X(x') f_{Y|X}(y|x') dx'
conditional PDF: f_{X|Y}(x|y) = f_XY(x, y) / f_Y(y)
Example 3.19 Light Bulb
A light bulb is known to have an exponentially distributed lifetime Y. However, the manufacturing company is experiencing quality control problems, so the parameter Λ of the PDF of Y is itself random, uniformly distributed in the interval [1, 3/2]. We test a light bulb and record its lifetime y. What can we say about the parameter λ?
Λ and Y are continuous. The joint PDF is f_ΛY(λ, y) = f_Λ(λ) f_{Y|Λ}(y|λ), with f_Λ(λ) = 2 for 1 ≤ λ ≤ 3/2 and f_{Y|Λ}(y|λ) = λ e^(−λy). By Bayes rule,
f_{Λ|Y}(λ|y) = f_Λ(λ) f_{Y|Λ}(y|λ) / ∫ f_Λ(λ') f_{Y|Λ}(y|λ') dλ'
= 2λ e^(−λy) / ∫_1^{3/2} 2λ' e^(−λ'y) dλ' = λ e^(−λy) / ∫_1^{3/2} λ' e^(−λ'y) dλ', for 1 ≤ λ ≤ 3/2.
A Discrete RV and a Continuous RV
Random variable N is discrete and Y is continuous, both defined on (Ω, F, P). The probability of a joint event is
P((N = n) ∩ (Y ∈ (y, δ))) = P(N = n) P(Y ∈ (y, δ) | N = n) = p_N(n) f_{Y|N}(y|n) δ
The conditional PMF of N given Y ∈ (y, δ) is
p_{N|Y}(n|y) = P(N = n | Y ∈ (y, δ)) = P((N = n) ∩ (Y ∈ (y, δ))) / P(Y ∈ (y, δ))
= p_N(n) f_{Y|N}(y|n) δ / Σ_{n'} p_N(n') f_{Y|N}(y|n') δ = p_N(n) f_{Y|N}(y|n) / Σ_{n'} p_N(n') f_{Y|N}(y|n')
Example 3.20 Binary Signal Transmission
The simplest case is a binary RV. A signal S is transmitted, with P(S = 1) = p and P(S = −1) = 1 − p. The received signal is Y = S + N, where N is a normal noise with zero mean and unit variance. What is the probability that S = 1, as a function of the observed value y of Y?
p_{S|Y}(1|y) = p_S(1) f_{Y|S}(y|1) / (p_S(1) f_{Y|S}(y|1) + p_S(−1) f_{Y|S}(y|−1))
= p (1/√(2π)) e^(−(y−1)²/2) / (p (1/√(2π)) e^(−(y−1)²/2) + (1 − p)(1/√(2π)) e^(−(y+1)²/2))
= p e^y / (p e^y + (1 − p) e^(−y))
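The posterior in code, checked against its simplified form (a Python sketch; p = 1/2 and the test values of y are arbitrary):

```python
import math

# Example 3.20: posterior P(S = 1 | Y = y) from Bayes rule with unit-variance
# normal noise, and the simplified form p*e^y / (p*e^y + (1-p)*e^(-y)).
def posterior(y: float, p: float) -> float:
    num = p * math.exp(-((y - 1) ** 2) / 2)           # p_S(1) f_{Y|S}(y|1)
    den = num + (1 - p) * math.exp(-((y + 1) ** 2) / 2)
    return num / den

def simplified(y: float, p: float) -> float:
    return p * math.exp(y) / (p * math.exp(y) + (1 - p) * math.exp(-y))

p = 0.5
print(round(posterior(0.0, p), 4))   # 0.5: y = 0 carries no information
for y in (-2.0, -0.5, 0.7, 3.0):
    assert abs(posterior(y, p) - simplified(y, p)) < 1e-12
```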
Inference Based on Event Observation
Random variable Y is continuous and event A is non-null. Given f_Y(y) and P(A | Y = y), the conditional PDF f_{Y|A}(y) is
f_{Y|A}(y) = f_Y(y) P(A | Y = y) / P(A) = f_Y(y) P(A | Y = y) / ∫ f_Y(y') P(A | Y = y') dy'
More informationPCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities
PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets
More informationsheng@mail.ncyu.edu.tw Content Joint distribution functions Independent random variables Sums of independent random variables Conditional distributions: discrete case Conditional distributions: continuous
More informationMASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES
MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES Contents 1. Continuous random variables 2. Examples 3. Expected values 4. Joint distributions
More informationQuick Tour of Basic Probability Theory and Linear Algebra
Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra CS224w: Social and Information Network Analysis Fall 2011 Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra Outline Definitions
More informationProbability review. September 11, Stoch. Systems Analysis Introduction 1
Probability review Alejandro Ribeiro Dept. of Electrical and Systems Engineering University of Pennsylvania aribeiro@seas.upenn.edu http://www.seas.upenn.edu/users/~aribeiro/ September 11, 2015 Stoch.
More informationLecture Notes 2 Random Variables. Discrete Random Variables: Probability mass function (pmf)
Lecture Notes 2 Random Variables Definition Discrete Random Variables: Probability mass function (pmf) Continuous Random Variables: Probability density function (pdf) Mean and Variance Cumulative Distribution
More informationRandom variables. DS GA 1002 Probability and Statistics for Data Science.
Random variables DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Motivation Random variables model numerical quantities
More informationProbability, Random Processes and Inference
INSTITUTO POLITÉCNICO NACIONAL CENTRO DE INVESTIGACION EN COMPUTACION Laboratorio de Ciberseguridad Probability, Random Processes and Inference Dr. Ponciano Jorge Escamilla Ambrosio pescamilla@cic.ipn.mx
More informationContinuous Random Variables
Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability. Often, there is interest in random variables
More informationBasic concepts of probability theory
Basic concepts of probability theory Random variable discrete/continuous random variable Transform Z transform, Laplace transform Distribution Geometric, mixed-geometric, Binomial, Poisson, exponential,
More informationChapter 4: Continuous Probability Distributions
Chapter 4: Continuous Probability Distributions Seungchul Baek Department of Statistics, University of South Carolina STAT 509: Statistics for Engineers 1 / 57 Continuous Random Variable A continuous random
More informationLecture 11. Probability Theory: an Overveiw
Math 408 - Mathematical Statistics Lecture 11. Probability Theory: an Overveiw February 11, 2013 Konstantin Zuev (USC) Math 408, Lecture 11 February 11, 2013 1 / 24 The starting point in developing the
More informationSTAT 430/510: Lecture 16
STAT 430/510: Lecture 16 James Piette June 24, 2010 Updates HW4 is up on my website. It is due next Mon. (June 28th). Starting today back at section 6.7 and will begin Ch. 7. Joint Distribution of Functions
More informationECEn 370 Introduction to Probability
RED- You can write on this exam. ECEn 370 Introduction to Probability Section 00 Final Winter, 2009 Instructor Professor Brian Mazzeo Closed Book Non-graphing Calculator Allowed No Time Limit IMPORTANT!
More informationBasic concepts of probability theory
Basic concepts of probability theory Random variable discrete/continuous random variable Transform Z transform, Laplace transform Distribution Geometric, mixed-geometric, Binomial, Poisson, exponential,
More informationContinuous r.v practice problems
Continuous r.v practice problems SDS 321 Intro to Probability and Statistics 1. (2+2+1+1 6 pts) The annual rainfall (in inches) in a certain region is normally distributed with mean 4 and standard deviation
More informationSTAT 3610: Review of Probability Distributions
STAT 3610: Review of Probability Distributions Mark Carpenter Professor of Statistics Department of Mathematics and Statistics August 25, 2015 Support of a Random Variable Definition The support of a random
More informationWhy study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables
ECE 6010 Lecture 1 Introduction; Review of Random Variables Readings from G&S: Chapter 1. Section 2.1, Section 2.3, Section 2.4, Section 3.1, Section 3.2, Section 3.5, Section 4.1, Section 4.2, Section
More informationLecture Notes 2 Random Variables. Random Variable
Lecture Notes 2 Random Variables Definition Discrete Random Variables: Probability mass function (pmf) Continuous Random Variables: Probability density function (pdf) Mean and Variance Cumulative Distribution
More informationECE 302 Division 2 Exam 2 Solutions, 11/4/2009.
NAME: ECE 32 Division 2 Exam 2 Solutions, /4/29. You will be required to show your student ID during the exam. This is a closed-book exam. A formula sheet is provided. No calculators are allowed. Total
More informationFundamental Tools - Probability Theory II
Fundamental Tools - Probability Theory II MSc Financial Mathematics The University of Warwick September 29, 2015 MSc Financial Mathematics Fundamental Tools - Probability Theory II 1 / 22 Measurable random
More informationChapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables
Chapter 2 Some Basic Probability Concepts 2.1 Experiments, Outcomes and Random Variables A random variable is a variable whose value is unknown until it is observed. The value of a random variable results
More informationReview of Probability. CS1538: Introduction to Simulations
Review of Probability CS1538: Introduction to Simulations Probability and Statistics in Simulation Why do we need probability and statistics in simulation? Needed to validate the simulation model Needed
More informationMATH 151, FINAL EXAM Winter Quarter, 21 March, 2014
Time: 3 hours, 8:3-11:3 Instructions: MATH 151, FINAL EXAM Winter Quarter, 21 March, 214 (1) Write your name in blue-book provided and sign that you agree to abide by the honor code. (2) The exam consists
More information5.2 Continuous random variables
5.2 Continuous random variables It is often convenient to think of a random variable as having a whole (continuous) interval for its set of possible values. The devices used to describe continuous probability
More informationBrief Review of Probability
Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions
More information1 Presessional Probability
1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional
More informationContinuous random variables
Continuous random variables Continuous r.v. s take an uncountably infinite number of possible values. Examples: Heights of people Weights of apples Diameters of bolts Life lengths of light-bulbs We cannot
More informationBasic concepts of probability theory
Basic concepts of probability theory Random variable discrete/continuous random variable Transform Z transform, Laplace transform Distribution Geometric, mixed-geometric, Binomial, Poisson, exponential,
More informationProbability- the good parts version. I. Random variables and their distributions; continuous random variables.
Probability- the good arts version I. Random variables and their distributions; continuous random variables. A random variable (r.v) X is continuous if its distribution is given by a robability density
More informationSolution to Assignment 3
The Chinese University of Hong Kong ENGG3D: Probability and Statistics for Engineers 5-6 Term Solution to Assignment 3 Hongyang Li, Francis Due: 3:pm, March Release Date: March 8, 6 Dear students, The
More informationSTAT509: Continuous Random Variable
University of South Carolina September 23, 2014 Continuous Random Variable A continuous random variable is a random variable with an interval (either finite or infinite) of real numbers for its range.
More informationn! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2
Order statistics Ex. 4. (*. Let independent variables X,..., X n have U(0, distribution. Show that for every x (0,, we have P ( X ( < x and P ( X (n > x as n. Ex. 4.2 (**. By using induction or otherwise,
More informationconditional cdf, conditional pdf, total probability theorem?
6 Multiple Random Variables 6.0 INTRODUCTION scalar vs. random variable cdf, pdf transformation of a random variable conditional cdf, conditional pdf, total probability theorem expectation of a random
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems
More informationSystem Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models
System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 20, 2012 Introduction Introduction The world of the model-builder
More informationF X (x) = P [X x] = x f X (t)dt. 42 Lebesgue-a.e, to be exact 43 More specifically, if g = f Lebesgue-a.e., then g is also a pdf for X.
10.2 Properties of PDF and CDF for Continuous Random Variables 10.18. The pdf f X is determined only almost everywhere 42. That is, given a pdf f for a random variable X, if we construct a function g by
More informationProbability Models. 4. What is the definition of the expectation of a discrete random variable?
1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions
More informationMidterm Exam 1 Solution
EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2015 Kannan Ramchandran September 22, 2015 Midterm Exam 1 Solution Last name First name SID Name of student on your left:
More informationHW7 Solutions. f(x) = 0 otherwise. 0 otherwise. The density function looks like this: = 20 if x [10, 90) if x [90, 100]
HW7 Solutions. 5 pts.) James Bond James Bond, my favorite hero, has again jumped off a plane. The plane is traveling from from base A to base B, distance km apart. Now suppose the plane takes off from
More informationSTAT515, Review Worksheet for Midterm 2 Spring 2019
STAT55, Review Worksheet for Midterm 2 Spring 29. During a week, the proportion of time X that a machine is down for maintenance or repair has the following probability density function: 2( x, x, f(x The
More informationP 1.5 X 4.5 / X 2 and (iii) The smallest value of n for
DHANALAKSHMI COLLEGE OF ENEINEERING, CHENNAI DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING MA645 PROBABILITY AND RANDOM PROCESS UNIT I : RANDOM VARIABLES PART B (6 MARKS). A random variable X
More information1 Solution to Problem 2.1
Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto
More informationDiscrete Mathematics and Probability Theory Fall 2015 Note 20. A Brief Introduction to Continuous Probability
CS 7 Discrete Mathematics and Probability Theory Fall 215 Note 2 A Brief Introduction to Continuous Probability Up to now we have focused exclusively on discrete probability spaces Ω, where the number
More information2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).
Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent
More informationMAS113 Introduction to Probability and Statistics. Proofs of theorems
MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a
More information2 Functions of random variables
2 Functions of random variables A basic statistical model for sample data is a collection of random variables X 1,..., X n. The data are summarised in terms of certain sample statistics, calculated as
More informationFinal. Fall 2016 (Dec 16, 2016) Please copy and write the following statement:
ECE 30: Probabilistic Methods in Electrical and Computer Engineering Fall 06 Instructor: Prof. Stanley H. Chan Final Fall 06 (Dec 6, 06) Name: PUID: Please copy and write the following statement: I certify
More informationRandom Variables and Probability Distributions
CHAPTER Random Variables and Probability Distributions Random Variables Suppose that to each point of a sample space we assign a number. We then have a function defined on the sample space. This function
More information