Probability, Random Processes and Inference


1 INSTITUTO POLITÉCNICO NACIONAL CENTRO DE INVESTIGACION EN COMPUTACION Laboratorio de Ciberseguridad Probability, Random Processes and Inference Dr. Ponciano Jorge Escamilla Ambrosio

2 Course Content 1.4. General Random Variables Continuous Random Variables and PDFs Cumulative Distribution Function Normal Random Variables Joint PDFs of Multiple Random Variables Conditioning The Continuous Bayes Rule The Strong Law of Large Numbers 2

3 General Random Variables Continuous random variables can take on any real value in an interval, possibly of infinite length, such as (0, ∞) or the entire real line; an example is the velocity of a vehicle traveling along a highway. In this section, the concepts and methods introduced for discrete r.v.s, such as expectation, PMF, and conditioning, are developed for their continuous counterparts. 3

4 Probability Density Function Continuous random variable. A random variable X is called continuous if there exists a nonnegative function f_X, called the probability density function of X, or PDF, such that: P(X ∈ B) = ∫_B f_X(x) dx for every subset B of the real line. 4

5 Probability Density Function The probability that the value of X falls within an interval is: P(a ≤ X ≤ b) = ∫_a^b f_X(x) dx, which can be interpreted as the area under the graph of the PDF between a and b. 5

6 Probability Density Function 6

7 Probability Density Function For any single value a, we have: P(X = a) = ∫_a^a f_X(x) dx = 0. For this reason, including or excluding the endpoints of an interval has no effect on its probability: P(a ≤ X ≤ b) = P(a < X < b) = P(a ≤ X < b) = P(a < X ≤ b). 7

8 Probability Density Function To qualify as a PDF, a function f_X must: o be nonnegative, i.e., f_X(x) ≥ 0 for every x, o have the normalisation property: ∫_{-∞}^{∞} f_X(x) dx = 1. Graphically, this means that the entire area under the graph of the PDF must be equal to 1. 8
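The two requirements can be checked numerically. The sketch below is illustrative only: the `integrate` helper is a plain midpoint-rule quadrature written for this example, applied to an exponential density.

```python
import math

def exp_pdf(x, lam=2.0):
    # f(x) = lam * e^(-lam * x) for x >= 0, and 0 otherwise
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def integrate(f, a, b, n=100_000):
    # midpoint-rule quadrature, accurate enough for smooth densities
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# nonnegativity at a few sample points
assert all(exp_pdf(x) >= 0 for x in (-1.0, 0.0, 0.5, 3.0))

# normalisation: the area under the PDF is 1 (the tail beyond 50 is negligible)
area = integrate(exp_pdf, 0.0, 50.0)
print(round(area, 4))  # -> 1.0
```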

9 Discrete vs. continuous r.v.s. Recall that for a discrete r.v., the CDF jumps at every point in the support, and is flat everywhere else. In contrast, for a continuous r.v. the CDF increases smoothly. 9

10 Discrete vs. continuous r.v.s. For a continuous r.v. X with CDF F_X(x), the probability density function (PDF) of X is the derivative of the CDF: f_X(x) = dF_X(x)/dx. The support of X, and of its distribution, is the set of all x where f_X(x) > 0. The PDF represents the density of probability at the point x. 10

11 Probability Density Function To get from the PDF back to the CDF we apply: F_X(x) = ∫_{-∞}^x f_X(t) dt. Thus, analogous to how we obtained the value of a discrete CDF at x by summing the PMF over all values less than or equal to x, here we integrate the PDF over all values up to x, so the CDF is the accumulated area under the PDF. 11

12 Probability Density Function Since we can freely convert between the PDF and the CDF using the inverse operations of integration and differentiation, both the PDF and CDF carry complete information about the distribution of a continuous r.v. Thus the PDF completely specifies the behavior of continuous random variables. 12

13 Probability Density Function For an interval [x, x + δ] with very small length δ, we have: P(x ≤ X ≤ x + δ) ≈ f_X(x) · δ. So we can view f_X(x) as the probability mass per unit length near x. Even though a PDF is used to calculate event probabilities, f_X(x) is not the probability of any particular event. In particular, it is not restricted to be less than or equal to one. 13

14 Probability Density Function An important way in which continuous r.v.s differ from discrete r.v.s is that for a continuous r.v. X, P(X = x) = 0 for all x. This is because P(X = x) is the height of a jump in the CDF at x, but the CDF of X has no jumps! Since the PMF of a continuous r.v. would just be 0 everywhere, we work with a PDF instead. 14

15 Probability Density Function The PDF is analogous to the PMF in many ways, but there is a key difference: for a PDF f X, the quantity f X (x) is not a probability, and in fact it is possible to have f X (x) > 1 for some values of x. To obtain a probability, we need to integrate the PDF. In summary: To get a desired probability, integrate the PDF over the appropriate range. 15

16 Examples of PDFs The Logistic distribution has CDF: F(x) = 1 / (1 + e^{-x}), for all real x. To get the PDF, we differentiate the CDF, which gives: f(x) = e^{-x} / (1 + e^{-x})². 16

17 Examples of PDFs 17

18 Examples of PDFs The Rayleigh distribution has CDF: F(x) = 1 − e^{-x²/(2σ²)}, for x ≥ 0. To get the PDF, we differentiate the CDF, which gives: f(x) = (x/σ²) e^{-x²/(2σ²)}, for x ≥ 0. 18

19 Examples of PDFs 19

20 Examples of PDFs A continuous r.v. X is said to have the Uniform distribution on the interval (a, b) if its PDF is: f(x) = 1/(b − a) for a < x < b, and f(x) = 0 otherwise. The CDF is the accumulated area under the PDF: F(x) = 0 for x ≤ a, F(x) = (x − a)/(b − a) for a < x < b, and F(x) = 1 for x ≥ b. 20

21 Examples of PDFs We denote this by X ∼ Unif(a, b). The Uniform distribution that we will most frequently use is the Unif(0, 1) distribution, also called the standard Uniform. The Unif(0, 1) PDF and CDF are particularly simple: f(x) = 1 and F(x) = x for 0 < x < 1. For a general Unif(a, b) distribution, the PDF is constant on (a, b), and the CDF is ramp-shaped, increasing linearly from 0 to 1 as x ranges from a to b. 21
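The Unif(a, b) PDF and ramp-shaped CDF are simple enough to sketch directly in Python (helper names are illustrative):

```python
def unif_pdf(x, a=0.0, b=1.0):
    # constant density 1/(b - a) on (a, b), zero outside
    return 1.0 / (b - a) if a < x < b else 0.0

def unif_cdf(x, a=0.0, b=1.0):
    # ramp shape: 0 below a, linear on [a, b], 1 above b
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

# for Unif(0, 1), probability is proportional to interval length:
p = unif_cdf(0.5) - unif_cdf(0.2)   # P(0.2 < X < 0.5)
print(round(p, 4))  # -> 0.3
```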

22 Examples of PDFs For Uniform distributions, probability is proportional to length. 22

23 PDF Properties 23

24 Expected Value and Variance of a Continuous r.v. The expected value, or expectation, or mean of a continuous r.v. X is defined by: E[X] = ∫_{-∞}^{∞} x f_X(x) dx. This is similar to the discrete case, except that the PMF is replaced by the PDF and summation is replaced by integration. Its mathematical properties are similar to the discrete case. 24

25 Expected Value and Variance of a Continuous r.v. If X is a continuous random variable with a given PDF, then any real-valued function Y = g(X) of X is also a random variable. Note that Y can be a continuous r.v., but Y can also be discrete, e.g., g(x) = 1 for x > 0 and g(x) = 0 otherwise. In either case, the mean of g(X) satisfies the expected value rule: E[g(X)] = ∫_{-∞}^{∞} g(x) f_X(x) dx. 25

26 Expected Value and Variance of a Continuous r.v. The nth moment of a continuous r.v. X is defined as E[X^n], the expected value of the random variable X^n. The variance of X, denoted var(X), is defined as the expected value of the random variable (X − E[X])²: var(X) = E[(X − E[X])²]. 26

27 Expected Value and Variance of a Continuous r.v. Example. Consider a uniform PDF over an interval [a, b]; its expectation is given by: E[X] = ∫_a^b x · 1/(b − a) dx = (a + b)/2. 27

28 Expected Value and Variance of a Continuous r.v. Its variance is given by: var(X) = ∫_a^b (x − E[X])² · 1/(b − a) dx = (b − a)²/12. 28
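Both moments can be checked numerically against the closed-form expressions. The sketch below uses an illustrative midpoint-rule `integrate` helper on Unif(2, 6), where (a + b)/2 = 4 and (b − a)²/12 = 4/3:

```python
def integrate(f, a, b, n=100_000):
    # midpoint-rule quadrature (illustrative, not production-grade)
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

a, b = 2.0, 6.0
pdf = lambda x: 1.0 / (b - a)   # Unif(a, b) density

mean = integrate(lambda x: x * pdf(x), a, b)               # (a + b) / 2
var = integrate(lambda x: (x - mean) ** 2 * pdf(x), a, b)  # (b - a)^2 / 12
print(round(mean, 3), round(var, 3))  # -> 4.0 1.333
```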

29 Expected Value and Variance of a Continuous r.v. The exponential continuous random variable has PDF: f_X(x) = λe^{-λx} for x ≥ 0, and f_X(x) = 0 otherwise, where λ is a positive parameter characterising the PDF, with ∫_0^∞ λe^{-λx} dx = 1. 29

30 Expected Value and Variance of a Continuous r.v. The probability that X exceeds a certain value decreases exponentially. That is, for any a ≥ 0, we have: P(X ≥ a) = ∫_a^∞ λe^{-λx} dx = e^{-λa}. An exponential random variable can be a good model for the amount of time until an incident of interest takes place: a message arriving at a computer, some equipment breaking down, a light bulb burning out, etc. 30
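The exponential tail formula is a one-liner; a short sketch (the rate value is chosen only for illustration) also shows that the tail multiplies across intervals, which is the memoryless property:

```python
import math

lam = 0.5  # rate parameter (value chosen only for illustration)

def tail(a):
    # P(X > a) = e^(-lam * a) for a >= 0
    return math.exp(-lam * a)

# the tail decreases exponentially, and multiplies across intervals:
# P(X > a + b) = P(X > a) * P(X > b)  (memorylessness)
print(abs(tail(3.0) - tail(1.0) * tail(2.0)) < 1e-12)  # -> True
```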

31 Expected Value and Variance of a Continuous r.v. 31

32 Expected Value and Variance of a Continuous r.v. The mean of the exponential r.v. X is calculated by: E[X] = ∫_0^∞ x λe^{-λx} dx = 1/λ. 32

33 Expected Value and Variance of a Continuous r.v. The variance of the exponential r.v. X is calculated by: var(X) = E[X²] − (E[X])² = 2/λ² − 1/λ² = 1/λ². 33

34 Cumulative Distribution Functions The cumulative distribution function, or CDF, of a random variable X is denoted F_X and provides the probability P(X ≤ x). In particular, for every x we have: F_X(x) = P(X ≤ x). The CDF F_X(x) accumulates probability up to the value x. 34

35 Cumulative Distribution Functions Any random variable associated with a given probability model has a CDF, regardless of whether it is discrete or continuous, because {X ≤ x} is always an event and therefore has a well-defined probability. 35

36 Cumulative Distribution Functions 36

37 Cumulative Distribution Functions 37

38 Cumulative Distribution Functions 38

39 Cumulative Distribution Functions 39

40 Normal Random Variables A continuous random variable X is normal, or Gaussian, or normally distributed if it has a PDF of the form: f_X(x) = (1/(σ√(2π))) e^{-(x − μ)²/(2σ²)}, where μ and σ are two scalar parameters characterising the PDF (abbreviated N(μ, σ²), and referred to as the normal density function), with σ assumed positive. 40

41 Normal Random Variables It can be verified that the normalisation property holds: N(1,1) 41

42 Normal Random Variables If X is N(μ, σ²), then: E[X] = μ. Proof: the PDF is symmetric about x = μ. If X is N(μ, σ²), then: var(X) = σ². 42

43 Normal Random Variables Its maximum value occurs at the mean value of its argument. It is symmetrical about the mean value. The points of maximum absolute slope occur at one standard deviation above and below the mean. Its maximum value is inversely proportional to its standard deviation. The limit as the standard deviation approaches zero is a unit impulse. 43

44 Normal Random Variables 44

45 Linear Function of a Normal Random Variable If X is a normal r.v. with mean μ and variance σ², and if a ≠ 0 and b are scalars, then the random variable Y = aX + b is also normal, with mean and variance: E[Y] = aμ + b, var(Y) = a²σ². 45

46 Standard Normal Random Variables A normal random variable Y with zero mean and unit variance, N(0, 1), is said to be a standard normal. Its PDF and CDF are denoted by φ and Φ, respectively: φ(y) = (1/√(2π)) e^{-y²/2}, Φ(y) = P(Y ≤ y). 46

47 Standard Normal Random Variables The PDF of a normal r.v. cannot be integrated in terms of the common elementary functions, and therefore the probabilities of X falling in various intervals are obtained from tables or by computer; for example, the Standard Normal Table. The table only provides the values of Φ(y) for y ≥ 0, because the omitted values can be calculated using the symmetry of the PDF. 47

48 Standard Normal Random Variables 48

49 Standard Normal Random Variables 49

50 Standard Normal Random Variables It would be overwhelming to construct tables for all μ and σ values required in applications. Instead, we standardise the r.v. Let X be a normal (Gaussian) random variable with mean μ and variance σ². We standardise X by defining a new random variable Y given by: Y = (X − μ)/σ. 50

51 Standard Normal Random Variables Since Y is a linear function of X, it is normal. This means: E[Y] = (E[X] − μ)/σ = 0 and var(Y) = var(X)/σ² = 1. Thus, Y is a standard normal random variable. This allows us to calculate the probability of any event defined in terms of X by redefining the event in terms of Y, and then using the standard normal table. 51

52 Standard Normal Random Variables Example 1: 52

53 Standard Normal Random Variables Example 2: The annual snowfall at a particular geographic location is modelled as a normal random variable with mean μ = 60 inches and standard deviation σ = 20 inches. What is the probability that this year's snowfall will be at least 80 inches? 53

54 Standard Normal Random Variables Solution: P(X ≥ 80) = P((X − 60)/20 ≥ (80 − 60)/20) = P(Y ≥ 1) = 1 − Φ(1) = 1 − 0.8413 = 0.1587. 54
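The snowfall probability can also be computed in a few lines of Python, using `math.erf` for the standard normal CDF (the helper name `phi` is my own):

```python
import math

def phi(z):
    # standard normal CDF via the error function: Phi(z) = (1 + erf(z/sqrt(2))) / 2
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma = 60.0, 20.0
# standardise: P(X >= 80) = P((X - mu)/sigma >= 1) = 1 - Phi(1)
p = 1.0 - phi((80.0 - mu) / sigma)
print(round(p, 4))  # -> 0.1587
```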

55 Standard Normal Random Variables Example 3 (Height Distribution of Men). Assume that the height X, in inches, of a randomly selected man in a certain population is normally distributed with μ = 69 and σ = 2.6. Find: 1. P(X < 72), 2. P(X > 72), 3. P(X < 66), 4. P(|X − μ| < 3). 55

56 Standard Normal Random Variables The table gives Φ(z) only for z ≥ 0; for z < 0 we make use of the symmetry of the normal distribution, which implies that, for any z, P(Z ≤ −z) = P(Z ≥ z) = 1 − Φ(z). Thus, solution: 56
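The four height probabilities can be evaluated by standardising each event; the sketch below (again with an illustrative erf-based `phi`) also checks the symmetry relations, since 66 and 72 are both 3 inches from the mean:

```python
import math

def phi(z):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma = 69.0, 2.6
p_lt_72 = phi((72 - mu) / sigma)             # 1. P(X < 72)
p_gt_72 = 1.0 - p_lt_72                      # 2. P(X > 72)
p_lt_66 = phi((66 - mu) / sigma)             # 3. P(X < 66), uses Phi(-z) = 1 - Phi(z)
p_within = phi(3 / sigma) - phi(-3 / sigma)  # 4. P(|X - mu| < 3)

# symmetry: P(X < 66) = P(X > 72), and P(|X - mu| < 3) = 1 - 2*P(X > 72)
assert abs(p_lt_66 - p_gt_72) < 1e-12
assert abs(p_within - (1 - 2 * p_gt_72)) < 1e-12
```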

57 Standard Normal Random Variables 57

58 Standard Normal Random Variables Normal r.v.s. are often used in signal processing and communications engineering to model noise and unpredictable distortions of signals. Example: 58

59 Standard Normal Random Variables 59

60 Standard Normal Random Variables Solution: 60

61 Standard Normal Random Variables Three important benchmarks for the Normal distribution are the probabilities of falling within one, two, and three standard deviations of the mean. The 68-95-99.7% rule tells us that these probabilities are what the name suggests. (68-95-99.7% rule). If X ∼ N(μ, σ²), then: P(|X − μ| < σ) ≈ 0.68, P(|X − μ| < 2σ) ≈ 0.95, P(|X − μ| < 3σ) ≈ 0.997. These follow by standardising. 61

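These benchmark probabilities follow directly from the standard normal CDF, since P(|X − μ| < kσ) = Φ(k) − Φ(−k) regardless of μ and σ. A short check using `math.erf`:

```python
import math

phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# P(|X - mu| < k*sigma) = Phi(k) - Phi(-k), independent of mu and sigma
for k in (1, 2, 3):
    print(k, round(phi(k) - phi(-k), 4))
# -> 1 0.6827
# -> 2 0.9545
# -> 3 0.9973
```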
63 Joint PDF of Multiple Random Variables Two continuous random variables associated with the same experiment are jointly continuous and can be described in terms of a joint PDF f_{X,Y} if f_{X,Y} is a nonnegative function that satisfies: P((X, Y) ∈ B) = ∫∫_B f_{X,Y}(x, y) dx dy for every subset B of the two-dimensional plane. The notation means that the integration is carried out over the set B. 63

64 Joint PDF of Multiple Random Variables In the particular case where B is a rectangle of the form B = {(x, y) : a ≤ x ≤ b, c ≤ y ≤ d}, we have: P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_c^d ∫_a^b f_{X,Y}(x, y) dx dy. If B is the entire two-dimensional plane, then we obtain the normalisation property: ∫_{-∞}^{∞} ∫_{-∞}^{∞} f_{X,Y}(x, y) dx dy = 1. 64

65 Joint PDF of Multiple Random Variables To interpret the joint PDF, we let δ be a small positive number and consider the probability of a small rectangle. Then we have: P(a ≤ X ≤ a + δ, c ≤ Y ≤ c + δ) ≈ f_{X,Y}(a, c) · δ², so we can view f_{X,Y}(a, c) as the probability per unit area in the vicinity of (a, c). 65

66 Joint PDF of Multiple Random Variables 66

67 Joint PDF of Multiple Random Variables The joint PDF contains all relevant probabilistic information about the random variables X and Y and their dependencies. Therefore, the joint PDF allows us to calculate the probability of any event that can be defined in terms of these two random variables. 67

68 Marginals Marginal PDF. For continuous r.v.s X and Y with joint PDF f_{X,Y}, the marginal PDF of X is: f_X(x) = ∫_{-∞}^{∞} f_{X,Y}(x, y) dy. Similarly, the marginal PDF of Y is: f_Y(y) = ∫_{-∞}^{∞} f_{X,Y}(x, y) dx. 68
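Marginalisation is just integration over the unwanted variable, and can be sketched numerically (the `marginal_x` helper and the midpoint rule inside it are illustrative). For a joint uniform density on the unit square, each marginal should come out as the Unif(0, 1) density, i.e. 1 on (0, 1):

```python
def marginal_x(joint, x, y_lo, y_hi, n=20_000):
    # f_X(x) = integral over y of f_{X,Y}(x, y), here by a midpoint rule
    h = (y_hi - y_lo) / n
    return sum(joint(x, y_lo + (i + 0.5) * h) for i in range(n)) * h

# joint uniform on the unit square: both marginals are Unif(0, 1)
joint = lambda x, y: 1.0 if (0 < x < 1 and 0 < y < 1) else 0.0
print(round(marginal_x(joint, 0.5, 0.0, 1.0), 6))  # -> 1.0
```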

69 Marginals Marginalisation works analogously with any number of variables. For example, if we have the joint PDF of X, Y, Z, W but want the joint PDF of X, W, we just have to integrate over all possible values of Y and Z: f_{X,W}(x, w) = ∫∫ f_{X,Y,Z,W}(x, y, z, w) dy dz. Conceptually this is very easy: just integrate over the unwanted variables to get the joint PDF of the wanted variables. Computing the integral, however, may or may not be difficult. 69

70 Marginals Example 1. 70

71 Marginals Example 1. 71

72 Joint CDFs If X and Y are two random variables associated with the same experiment, their joint CDF is defined by: F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y). If X and Y are described by a joint PDF f_{X,Y}, then: F_{X,Y}(x, y) = ∫_{-∞}^x ∫_{-∞}^y f_{X,Y}(s, t) dt ds. 72

73 Joint PDF of Multiple Random Variables Conversely, if X and Y are continuous with joint CDF F_{X,Y}, their joint PDF is the derivative of the joint CDF with respect to x and y: f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y)/∂x∂y. 73

74 Joint CDF of Multiple Random Variables Let X and Y be described by a uniform PDF on the unit square. The joint CDF is given by: F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y) = xy, for (x, y) in the unit square. It can be verified that: f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y)/∂x∂y = 1 for all (x, y) in the unit square. 74

75 Expectation If X and Y are jointly continuous random variables and g is some function, then Z = g(X, Y) is also a random variable. Thus the expected value rule applies: E[g(X, Y)] = ∫∫ g(x, y) f_{X,Y}(x, y) dx dy. As an important special case, for any scalars a, b, and c, we have: E[aX + bY + c] = aE[X] + bE[Y] + c. 75

76 More than Two Random Variables The joint PDF of three random variables X, Y, and Z is defined in analogy with the case of two random variables. For example: P((X, Y, Z) ∈ B) = ∫∫∫_B f_{X,Y,Z}(x, y, z) dx dy dz for any set B. We have relations such as: f_{X,Y}(x, y) = ∫ f_{X,Y,Z}(x, y, z) dz and f_X(x) = ∫∫ f_{X,Y,Z}(x, y, z) dy dz. 76

77 More than Two Random Variables The expected value rule takes the form: E[g(X, Y, Z)] = ∫∫∫ g(x, y, z) f_{X,Y,Z}(x, y, z) dx dy dz. If g is linear, of the form aX + bY + cZ, then: E[aX + bY + cZ] = aE[X] + bE[Y] + cE[Z]. 77

78 More than Two Random Variables 78

79 More than Two Random Variables 79

80 Conditioning The conditional PDF of a continuous random variable X, given an event A with P(A) > 0, is defined as a nonnegative function f_{X|A} that satisfies: P(X ∈ B | A) = ∫_B f_{X|A}(x) dx for any subset B of the real line. 80

81 Conditioning In particular, by letting B be the entire real line, we obtain the normalisation property: ∫_{-∞}^{∞} f_{X|A}(x) dx = 1, so that f_{X|A} is a legitimate PDF. 81

82 Conditioning In the important special case where we condition on an event of the form {X ∈ A}, with P(X ∈ A) > 0, the definition of conditional probabilities yields: P(X ∈ B | X ∈ A) = P(X ∈ B, X ∈ A)/P(X ∈ A). By comparing with the earlier formula, this gives: f_{X|{X∈A}}(x) = f_X(x)/P(X ∈ A) for x ∈ A, and 0 otherwise. 82

83 Conditioning 83

84 Conditioning Example 1. 84

85 Joint Conditional PDF Suppose that X and Y are jointly continuous random variables with joint PDF f_{X,Y}. If we condition on a positive-probability event of the form C = {(X, Y) ∈ A}, we have: f_{X,Y|C}(x, y) = f_{X,Y}(x, y)/P(C) for (x, y) ∈ A, and 0 otherwise. In this case, the conditional PDF of X, given this event, can be obtained from the formula: f_{X|C}(x) = ∫_{-∞}^{∞} f_{X,Y|C}(x, y) dy. 85

86 Joint Conditional PDF A version of the total probability theorem, which involves conditional PDFs, is given as follows: if the events A_1, …, A_n form a partition of the sample space, then: f_X(x) = Σ_{i=1}^n P(A_i) f_{X|A_i}(x). Using the total probability theorem: P(X ≤ x) = Σ_{i=1}^n P(A_i) P(X ≤ x | A_i). 86

87 Joint Conditional PDF Finally, the formula can be written in terms of CDFs: F_X(x) = Σ_{i=1}^n P(A_i) F_{X|A_i}(x). We then take the derivative of both sides with respect to x and obtain the desired result. 87

88 Joint Conditional PDF Example 2. 88

89 Joint Conditional PDF Example 3. 89

90 Joint Conditional PDF To interpret the conditional PDF, let us fix some small positive numbers δ₁ and δ₂, and condition on the event B = {y ≤ Y ≤ y + δ₂}. We have: P(x ≤ X ≤ x + δ₁ | y ≤ Y ≤ y + δ₂) ≈ f_{X,Y}(x, y)δ₁δ₂ / (f_Y(y)δ₂) = f_{X|Y}(x|y)δ₁. 90

91 Joint Conditional PDF Therefore, f_{X|Y}(x|y)δ₁ provides us with the probability that X belongs to a small interval [x, x + δ₁], given that Y belongs to a small interval [y, y + δ₂]. Since f_{X|Y}(x|y)δ₁ does not depend on δ₂, we can think of the limiting case where δ₂ decreases to zero and write: P(x ≤ X ≤ x + δ₁ | Y = y) ≈ f_{X|Y}(x|y)δ₁, and more generally: P(X ∈ A | Y = y) = ∫_A f_{X|Y}(x|y) dx. 91

92 Joint Conditional PDF The conditional PDF f_{X|Y}(x|y) can be seen as a description of the probability law of X, given that the event {Y = y} has occurred. As in the discrete case, the PDF f_{X|Y}, together with the marginal PDF f_Y, is sometimes used to calculate the joint PDF: f_{X,Y}(x, y) = f_Y(y) f_{X|Y}(x|y). This approach can also be used for modelling: instead of directly specifying f_{X,Y}, it is often natural to provide a probability law for Y, in terms of a PDF f_Y, and then provide a conditional PDF f_{X|Y}(x|y) for X, given any possible value y of Y. 92

93 Joint Conditional PDF Example 4. The speed of a typical vehicle that drives past a police radar is modelled as an exponentially distributed random variable X with mean 50 miles per hour. The police radar's measurement Y of the vehicle's speed has an error which is modelled as a normal random variable with zero mean and standard deviation equal to one tenth of the vehicle's speed. What is the joint PDF of X and Y? 93

94 Joint Conditional PDF Solution. We have f_X(x) = (1/50)e^{-x/50}, for x ≥ 0. Also, conditioned on X = x, the measurement Y has a normal PDF with mean x and variance x²/100. Therefore: f_{Y|X}(y|x) = (10/(x√(2π))) e^{-50(y − x)²/x²}. Thus, for all x ≥ 0 and all y: f_{X,Y}(x, y) = f_X(x) f_{Y|X}(y|x) = (1/50)e^{-x/50} · (10/(x√(2π))) e^{-50(y − x)²/x²}. 94
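The multiplication rule f_{X,Y} = f_X · f_{Y|X} translates directly into code. The sketch below implements the radar example's densities (function names are illustrative):

```python
import math

def f_x(x):
    # exponential with mean 50 miles/hour, i.e. rate 1/50
    return (1.0 / 50.0) * math.exp(-x / 50.0) if x >= 0 else 0.0

def f_y_given_x(y, x):
    # given X = x, the measurement Y is N(x, (x/10)^2)
    s = x / 10.0
    return math.exp(-(y - x) ** 2 / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

def f_xy(x, y):
    # multiplication rule: f_{X,Y}(x, y) = f_X(x) * f_{Y|X}(y | x)
    return f_x(x) * f_y_given_x(y, x) if x > 0 else 0.0

# the conditional density is symmetric about the true speed x
assert abs(f_y_given_x(45.0, 50.0) - f_y_given_x(55.0, 50.0)) < 1e-15
```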

95 Conditional PDF for More Than Two r.v.s. Conditional PDFs can also be defined for the case of more than two random variables, for example: f_{X,Y|Z}(x, y|z) = f_{X,Y,Z}(x, y, z)/f_Z(z). The analogous multiplication rule is given as: f_{X,Y,Z}(x, y, z) = f_{X|Y,Z}(x|y, z) f_{Y|Z}(y|z) f_Z(z). 95

96 Conditional Expectation For a continuous random variable X, we define the conditional expectation E[X|A] given an event A similarly to the unconditional case, except that we now use the conditional PDF f_{X|A}. Let X and Y be jointly continuous random variables, and let A be an event with P(A) > 0; then the conditional expectation of X given the event A is defined by: E[X|A] = ∫ x f_{X|A}(x) dx. 96

97 Conditional Expectation The conditional expectation of X given that Y = y is defined by: E[X|Y = y] = ∫ x f_{X|Y}(x|y) dx. The expected value rule, for a function g(X): E[g(X)|A] = ∫ g(x) f_{X|A}(x) dx and E[g(X)|Y = y] = ∫ g(x) f_{X|Y}(x|y) dx. 97

98 Conditional Expectation Total expectation theorem: Let A_1, A_2, …, A_n be disjoint events that form a partition of the sample space, and assume that P(A_i) > 0 for all i. Then: E[X] = Σ_{i=1}^n P(A_i) E[X|A_i]. Similarly: E[X] = ∫ E[X|Y = y] f_Y(y) dy. 98

99 Conditional Expectation There are natural analogues for the case of functions of several random variables. For example: E[g(X, Y)|Y = y] = ∫ g(x, y) f_{X|Y}(x|y) dx, and: E[g(X, Y)] = ∫ E[g(X, Y)|Y = y] f_Y(y) dy. 99

100 Conditional Expectation Example

101 Independence Two continuous random variables X and Y are independent if their joint PDF is the product of the marginal PDFs: f_{X,Y}(x, y) = f_X(x) f_Y(y) for all x, y. Comparing with the formula f_{X,Y}(x, y) = f_{X|Y}(x|y) f_Y(y), we see that independence is the same as the condition: f_{X|Y}(x|y) = f_X(x) whenever f_Y(y) > 0, or, symmetrically: f_{Y|X}(y|x) = f_Y(y) whenever f_X(x) > 0. 101

102 Independence For the case of more than two random variables, we say, for example, that three random variables X, Y, and Z are independent if: f_{X,Y,Z}(x, y, z) = f_X(x) f_Y(y) f_Z(z) for all x, y, z. 102

103 Independence Example. Independent Normal Random Variables. Let X and Y be independent normal random variables with means μ_x, μ_y and variances σ²_x, σ²_y, respectively. Their joint PDF is of the form: f_{X,Y}(x, y) = (1/(2πσ_xσ_y)) e^{-(x − μ_x)²/(2σ²_x) − (y − μ_y)²/(2σ²_y)}. This joint PDF has the shape of a bell centred at (μ_x, μ_y), whose width in the x and y directions is proportional to σ_x and σ_y, respectively. 103

104 Independence Additional insight into the form of the PDF can be gained by considering its contours, i.e., sets of points at which the PDF takes a constant value. These contours are described by an equation of the form: (x − μ_x)²/σ²_x + (y − μ_y)²/σ²_y = c, and are ellipses whose two axes are horizontal and vertical. If σ²_x = σ²_y, then the contours are circles. 104

105 Independence 105

106 Independence 106

107 Independence If X and Y are independent, then any two events of the form {X ∈ A} and {Y ∈ B} are independent: P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B). 107

108 Independence Independence implies that: E[XY] = E[X] E[Y]. The property: F_{X,Y}(x, y) = F_X(x) F_Y(y) for all x, y can be used to provide a general definition of independence between two random variables, e.g., if X is discrete and Y is continuous. 108

109 Independence Similarly to the discrete case, if X and Y are independent, then: E[g(X)h(Y)] = E[g(X)] E[h(Y)] for any two functions g and h. The variance of the sum of independent random variables is equal to the sum of their variances: var(X + Y) = var(X) + var(Y). 109
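Both consequences of independence can be seen empirically. The sketch below draws two samples independently (a uniform and an exponential, chosen only for illustration) and checks the factorisation of E[XY] and the additivity of variances, up to Monte Carlo error:

```python
import random

random.seed(0)
n = 200_000
xs = [random.uniform(0, 1) for _ in range(n)]     # X ~ Unif(0, 1)
ys = [random.expovariate(2.0) for _ in range(n)]  # Y ~ Exp(2), drawn independently

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((t - m) ** 2 for t in v) / len(v)

# E[XY] factorises into E[X] E[Y] for independent X and Y
assert abs(mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)) < 0.01

# var(X + Y) = var(X) + var(Y) when X and Y are independent
s = [x + y for x, y in zip(xs, ys)]
assert abs(var(s) - (var(xs) + var(ys))) < 0.01
```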

110 Summary of Independence 110

111 Summary of Independence 111

112 The continuous Bayes rule Inference problem: We have an unobserved random variable X with known PDF f_X, and we obtain a measurement Y according to a conditional PDF f_{Y|X}. Given an observed value y of Y, the inference problem is to evaluate the conditional PDF f_{X|Y}(x|y). 112

113 The continuous Bayes rule Thus, whatever information is provided by the event {Y = y} is captured by the conditional PDF f_{X|Y}(x|y). It thus suffices to evaluate this PDF. From the formula f_X f_{Y|X} = f_{X,Y} = f_Y f_{X|Y}, it follows that: f_{X|Y}(x|y) = f_X(x) f_{Y|X}(y|x) / f_Y(y). 113

114 The continuous Bayes rule Based on the normalisation property ∫ f_{X|Y}(x|y) dx = 1, an equivalent expression is: f_{X|Y}(x|y) = f_X(x) f_{Y|X}(y|x) / ∫_{-∞}^{∞} f_X(t) f_{Y|X}(y|t) dt. 114
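The rule can be sketched numerically by evaluating prior × likelihood on a grid and normalising; the grid sum plays the role of the denominator ∫ f_X(t) f_{Y|X}(y|t) dt. The model below (X ~ Exp(1), Y | X = x ~ N(x, 1)) and the helper name are assumptions chosen only for illustration:

```python
import math

def posterior_on_grid(prior, likelihood, y, xs):
    # discretised continuous Bayes rule:
    # f_{X|Y}(x | y) = f_X(x) f_{Y|X}(y | x) / f_Y(y),
    # with the denominator computed as the normalising integral on the grid
    h = xs[1] - xs[0]
    num = [prior(x) * likelihood(y, x) for x in xs]
    z = sum(num) * h          # approximates f_Y(y)
    return [v / z for v in num]

# toy model (assumed): prior X ~ Exp(1), measurement Y | X = x is N(x, 1)
prior = lambda x: math.exp(-x) if x >= 0 else 0.0
lik = lambda y, x: math.exp(-(y - x) ** 2 / 2.0) / math.sqrt(2.0 * math.pi)

xs = [i * 0.01 for i in range(1, 1200)]
post = posterior_on_grid(prior, lik, 2.0, xs)

# the posterior is a legitimate PDF: it integrates to 1 on the grid
assert abs(sum(post) * 0.01 - 1.0) < 1e-9
```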

115 The continuous Bayes rule 115

116 The continuous Bayes rule 116

117 Sums of Independent Random Variables Convolution Let Z = X + Y, where X and Y are independent integer-valued random variables with PMFs p_X and p_Y, respectively. Then, for any integer z: p_Z(z) = Σ_x p_X(x) p_Y(z − x). The resulting PMF p_Z is called the convolution of the PMFs of X and Y. 117
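The convolution formula is a double loop over the two supports. A short sketch (helper name is illustrative) computes the PMF of the sum of two fair dice:

```python
from collections import defaultdict

def convolve_pmf(p_x, p_y):
    # p_Z(z) = sum over k of p_X(k) * p_Y(z - k), for independent X and Y
    p_z = defaultdict(float)
    for i, pi in p_x.items():
        for j, pj in p_y.items():
            p_z[i + j] += pi * pj
    return dict(p_z)

die = {k: 1.0 / 6.0 for k in range(1, 7)}
total = convolve_pmf(die, die)   # PMF of the sum of two fair dice
print(round(total[7], 4))  # -> 0.1667
```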

118 Covariance and Correlation The covariance of two random variables X and Y, denoted by cov(X, Y), is defined as: cov(X, Y) = E[(X − E[X])(Y − E[Y])]. When cov(X, Y) = 0, we say X and Y are uncorrelated. A positive or negative covariance indicates that the values of X − E[X] and Y − E[Y] obtained in a single experiment tend to have the same or the opposite sign, respectively. 118

119 Covariance and Correlation 119

120 Covariance and Correlation Multiplying this out and using linearity, we have an equivalent expression: cov(X, Y) = E[XY] − E[X]E[Y]. Covariance has the following key properties: 1. cov(X, X) = var(X). 2. cov(X, Y) = cov(Y, X). 3. cov(X, c) = 0 for any constant c. 4. cov(aX, Y) = a·cov(X, Y) for any constant a. 120

121 Covariance and Correlation 5. cov(X + Y, Z) = cov(X, Z) + cov(Y, Z). 6. cov(X + Y, Z + W) = cov(X, Z) + cov(X, W) + cov(Y, Z) + cov(Y, W). 7. var(X + Y) = var(X) + var(Y) + 2cov(X, Y). For n r.v.s X_1, …, X_n: var(X_1 + … + X_n) = Σ_i var(X_i) + 2 Σ_{i<j} cov(X_i, X_j). 121
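Property 7 holds exactly for empirical moments as well, since it is an algebraic identity. A small sketch with arbitrary paired data (the `cov` helper is an illustrative population-style covariance):

```python
def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    # empirical covariance: average of (u_i - mean(u)) * (v_i - mean(v))
    mu, mv = mean(u), mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

x = [1.0, 2.0, 4.0, 7.0]
y = [3.0, 1.0, 5.0, 2.0]
s = [a + b for a, b in zip(x, y)]

# property 7: var(X + Y) = var(X) + var(Y) + 2 cov(X, Y)
lhs = cov(s, s)
rhs = cov(x, x) + cov(y, y) + 2.0 * cov(x, y)
print(abs(lhs - rhs) < 1e-9)  # -> True
```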

122 Covariance and Correlation The correlation coefficient ρ(X, Y) of two random variables X and Y that have nonzero variances is defined as: ρ(X, Y) = cov(X, Y) / (σ_X σ_Y). It may be viewed as a normalised version of the covariance cov(X, Y); ρ ranges from −1 to 1. 122

123 Covariance and Correlation If ρ > 0 (or ρ < 0), then the values of X − E[X] and Y − E[Y] tend to have the same (or opposite, respectively) sign. The size of |ρ| provides a normalised measure of the extent to which this is true. Always assuming that X and Y have positive variances, it can be shown that ρ = 1 (or ρ = −1) if and only if there exists a positive (or negative, respectively) constant c such that: Y − E[Y] = c(X − E[X]). 123

124 Covariance and Correlation 124

125 Covariance and Correlation 125

126 The Weak Law of Large Numbers 126

127 The Central Limit Theorem 127

128 The Strong Law of Large Numbers 128


More information

Probability, Random Processes and Inference
Instituto Politécnico Nacional, Centro de Investigación en Computación, Laboratorio de Ciberseguridad
Dr. Ponciano Jorge Escamilla Ambrosio (pescamilla@cic.ipn.mx)