2 (Statistics) Random variables
References: DeGroot and Schervish, chapters 3, 4 and 5; Stirzaker, chapters 4, 5 and 6.

We will now study the main tools used for modelling experiments with unknown outcomes in statistics: random variables and their distributions.

2.1 Single random variables and distributions

Consider an experiment in a sample space S. A random variable is a function that maps the sample space into the real line R, assigning a real value to each possible outcome of the experiment. In general we denote it by X(s), s \in S, or simply X when there is no scope for confusion.

Attached to each random variable is a probability rule that measures the likelihood of a particular outcome. Suppose A is a set in R and we wish to measure the probability that X \in A. This is given by:

P(X \in A) = P(\{s \in S : X(s) \in A\})

A probability rule is generally described by a function, the cumulative distribution function, abbreviated as cdf. For any real value x, the cdf F_X(x) is defined as follows:

F_X(x) = P(X \le x)

For a function F_X defined on all of R to be a cdf it must satisfy the following conditions:

- F_X is non-decreasing;
- F_X(+\infty) = 1 and F_X(-\infty) = 0;
- P(a < X \le b) = F_X(b) - F_X(a).

A random variable X is discrete, or follows a discrete distribution, if X can take only a countable number of different values (possibly infinitely many), \{k_1, k_2, \ldots, k_n, \ldots\}. A random variable X is continuous, or follows a continuous distribution, if it can take every value in a real interval.
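These definitions can be made concrete with a short sketch (Python, not part of the original notes): a fair-die experiment, an indicator random variable defined on it, and the probability rule and cdf it induces. The die example and indicator variable are illustrative assumptions.

```python
from fractions import Fraction

# Sample space of a fair die; each outcome has probability 1/6.
S = [1, 2, 3, 4, 5, 6]
p = {s: Fraction(1, 6) for s in S}

# A random variable X maps outcomes to the real line; here X(s) = 1 if the
# roll is even and 0 otherwise (an indicator variable).
X = lambda s: 1 if s % 2 == 0 else 0

# P(X in A) = P({s in S : X(s) in A})
def prob_X_in(A):
    return sum(p[s] for s in S if X(s) in A)

# cdf: F_X(x) = P(X <= x)
def F_X(x):
    return sum(p[s] for s in S if X(s) <= x)

print(prob_X_in({1}))   # 1/2
print(F_X(0))           # 1/2: F_X is 0 below 0, 1/2 on [0, 1), and 1 above
```

Note how F_X satisfies the three cdf conditions: it is non-decreasing, 0 far to the left, and 1 far to the right.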
2.1.1 Discrete distributions

A discrete random variable takes values only on a discrete set \{k_1, k_2, \ldots\}. For these variables we can define a function f_X, called the probability density function and abbreviated as pdf:

f_X(x) = P(X = x)

The pdf measures the likelihood of each particular outcome x. Notice:

- If x is not one of the possible outcomes of the experiment, \{k_1, k_2, \ldots\}, then f_X(x) = 0. At points in the space of possible outcomes the pdf is strictly positive; these are called mass points of the distribution. Thus f_X is always non-negative.
- Since it measures the probability of each possible event, it cannot be bigger than 1.

The pdf of X defines the cdf of X as follows:

F_X(x) = P(X \le x) = \sum_{i: k_i \le x} f_X(k_i)

Since F_X is a cdf, it must be that:

- F_X is non-decreasing: if x_2 > x_1, then all values k_i that are no larger than x_1 are also no larger than x_2, and thus F_X(x_2) must be at least equal to F_X(x_1);
- F_X(+\infty) = \sum_{i} f_X(k_i) = 1;
- F_X(-\infty) = 0;
- P(a < X \le b) = \sum_{i: a < k_i \le b} f_X(k_i) = \sum_{i: k_i \le b} f_X(k_i) - \sum_{i: k_i \le a} f_X(k_i) = F_X(b) - F_X(a).

As a consequence of its definition, the cdf F_X is a step function.

Examples of discrete distributions: discrete uniform, Bernoulli, binomial.

2.1.2 Continuous distributions

A continuous random variable X can assume any value within an interval, which may be bounded or not.
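A small sketch of the pdf-to-cdf construction for one of the examples above, the binomial (the parameters n = 4, p = 1/2 are illustrative choices, not from the notes):

```python
from math import comb

# pdf of a Binomial(n, p) variable: the mass points are k = 0, ..., n.
n, p = 4, 0.5
def f(k):
    return comb(n, k) * p**k * (1 - p)**(n - k) if 0 <= k <= n else 0.0

# cdf as the running sum of the pdf over mass points k_i <= x
def F(x):
    return sum(f(k) for k in range(n + 1) if k <= x)

assert abs(sum(f(k) for k in range(n + 1)) - 1) < 1e-12  # F(+inf) = 1
assert F(2.5) == F(2)    # step function: flat between mass points
print(F(2))              # P(X <= 2) = (1 + 4 + 6)/16 = 0.6875
```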
In this case the cdf of X, F_X, may be continuous over the whole of R or, at least, continuous over intervals of R; it may have some discontinuity points.

To start with, suppose that the cdf of X is continuous and differentiable over R. Then we can define the function f_X as follows:

f_X(x) = \frac{dF_X(x)}{dx} = F'_X(x)    (1)

The function f_X defined above is called the probability density function, abbreviated pdf. It measures the marginal change (increase) in F_X as x changes infinitesimally. A consequence of this definition is that f_X(x) is always non-negative.

The reverse of (1) is that the cdf can be defined as the function F_X that satisfies the following condition:

F_X(x) = \int_{-\infty}^{x} f_X(t) dt

meaning that it measures the area below the curve of f_X. This definition of the cdf can be extended to random variables that follow a continuous distribution on all of R except possibly for a finite number of points. In this case f_X(x) = F'_X(x) for all x where the derivative exists.

Notice that from our definition the following properties follow:

- \int_{-\infty}^{+\infty} f_X(x) dx = 1;
- P(X > x) = \int_{x}^{+\infty} f_X(t) dt = 1 - F_X(x);
- P(a < X \le b) = F_X(b) - F_X(a) = \int_{a}^{b} f_X(x) dx;
- P(a < X \le b) = P(a \le X < b) = P(a < X < b) = P(a \le X \le b);
- the two properties above then imply that at a point a where the distribution of X is continuous:

P(X = a) = P(a \le X \le a) = \int_{a}^{a} f_X(x) dx = 0

meaning that the likelihood of the realisation of a particular value in the (continuous) distribution of X is zero.

But then the function f_X is non-unique: it can be changed at a discrete (finite or countably infinite) number of points and still yield the same cdf. We resolve the ambiguity by always using the continuous version of f_X unless there are reasons to do otherwise.

Examples: uniform distribution, normal distribution.
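A numerical sketch of the pdf/cdf relationship (the particular distribution, F_X(x) = x^2 on [0, 1] with pdf f_X(x) = 2x, is an assumed example, not from the notes):

```python
# Assumed example: F_X(x) = x^2 on [0, 1], so f_X(x) = dF/dx = 2x there.
def F(x):
    return 0.0 if x < 0 else (1.0 if x > 1 else x * x)

def f(x):
    return 2 * x if 0 <= x <= 1 else 0.0

# Check F(b) - F(a) equals the area under f between a and b (trapezoid rule).
def integral(g, a, b, n=10_000):
    h = (b - a) / n
    return h * (g(a) / 2 + sum(g(a + i * h) for i in range(1, n)) + g(b) / 2)

a, b = 0.2, 0.7
assert abs(integral(f, a, b) - (F(b) - F(a))) < 1e-8
# P(X = a) is the integral from a to a: zero for a continuous distribution
assert integral(f, a, a) == 0.0
```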
2.1.3 Functions of a random variable

Consider a discrete random variable first. Suppose the random variable X is defined on the space \{k_1, k_2, \ldots\}, following a discrete distribution with pdf f_X, so that the probability of x is P(X = x) = f_X(x). Now consider a transformation of X through a function h to form a new random variable Y = h(X). Y follows a new probability rule f_Y, which is defined as follows:

f_Y(y) = P(Y = y) = P(h(X) = y) = \sum_{x: h(x) = y} f_X(x)

Now consider a continuous random variable: X is a continuous random variable with pdf f_X. Consider again a transformation of X through a function h; the resulting random variable is Y = h(X). The cdf of Y can now be defined as:

F_Y(y) = P(Y \le y) = P(h(X) \le y) = \int_{x: h(x) \le y} f_X(x) dx

Now suppose that h is strictly monotonic (either increasing or decreasing). Then h is invertible and we can write X = h^{-1}(Y). In this case, the pdf of Y is:

f_Y(y) = f_X(h^{-1}(y)) \left| \frac{dh^{-1}(y)}{dy} \right|

2.1.4 Moments

The distribution of a random variable contains all the information about it. However, it is often cumbersome and difficult to present. Instead, some functions of the random variable that summarise the distribution are often presented. The most commonly used such functions are the moments of the random variable.

Expected value: a measure of the centre of the distribution.
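A sketch of the change-of-variables formula for a strictly monotonic h (the choice X ~ Uniform(0, 1) with Y = -ln(X), which yields the Exponential(1) density, is an assumed example):

```python
from math import exp

# X ~ Uniform(0,1), so f_X(x) = 1 on (0,1); take Y = h(X) = -ln(X),
# strictly decreasing, with inverse h^{-1}(y) = e^{-y}.
def f_X(x):
    return 1.0 if 0 < x < 1 else 0.0

h_inv = lambda y: exp(-y)
abs_dh_inv = lambda y: exp(-y)   # |d h^{-1}(y) / dy|

# f_Y(y) = f_X(h^{-1}(y)) |d h^{-1}(y)/dy|
def f_Y(y):
    return f_X(h_inv(y)) * abs_dh_inv(y)

# The formula recovers the Exponential(1) density e^{-y} for y > 0.
for y in (0.5, 1.0, 3.0):
    assert abs(f_Y(y) - exp(-y)) < 1e-12
```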
For a discrete random variable with possible realisations \{k_1, k_2, \ldots, k_n\}:

E_X(X) = \sum_{i=1}^{n} k_i f_X(k_i)

For a continuous random variable:

E_X(X) = \int_{-\infty}^{+\infty} x f_X(x) dx

What is the expected value of Z = h(X), where X is a continuous random variable? It is E_Z(Z) = \int_{-\infty}^{+\infty} h(x) f_X(x) dx. The expected value may or may not lie at the centre of the distribution of X.

Some properties of the expected value:

- E(c) = c, where c is a constant;
- E(a + bX) = a + bE(X), where a and b are scalars;
- if g(X) = g_1(X) + g_2(X) is a function, then E(g(X)) = E(g_1(X)) + E(g_2(X));
- if g is a non-linear function, then E(g(X)) is generally different from g(E(X)).

Variance: measures the dispersion of the distribution. The variance of a distribution is given by:

V(X) = E[(X - E(X))^2] = E(X^2) - E(X)^2

We also define the standard deviation to be:

sd(X) = \sqrt{V(X)}

Some properties of the variance:

- V(c) = 0, where c is a constant;
- V(aX) = a^2 V(X), where a is a scalar;
- V(aX + b) = a^2 V(X), where a and b are scalars.

Higher order moments: these help characterise a distribution. They may be centred or not:

- non-centred moment of order k: E(X^k);
- centred moment of order k: E[(X - E(X))^k].

Median: another measure of the centre of the distribution. It is the point m that divides the distribution into two parts, each with a probability of 0.5.
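The moment definitions and the properties listed above can be checked exactly on a small discrete example (a fair die, an assumed example; exact rationals avoid rounding):

```python
from fractions import Fraction

# Fair die: mass points 1..6, each with probability 1/6.
ks = range(1, 7)
f = Fraction(1, 6)

E  = sum(k * f for k in ks)               # E(X)   = 7/2
E2 = sum(k**2 * f for k in ks)            # E(X^2) = 91/6
V_def = sum((k - E)**2 * f for k in ks)   # E[(X - E(X))^2]

assert E == Fraction(7, 2)
assert V_def == E2 - E**2 == Fraction(35, 12)   # the two variance formulas agree

# E(a + bX) = a + bE(X) and V(aX + b) = a^2 V(X)
a, b = 3, 2
assert sum((a + b * k) * f for k in ks) == a + b * E
assert sum((a + b * k - (a + b * E))**2 * f for k in ks) == b**2 * V_def
```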
The median of the distribution of a continuous random variable X is defined as follows:

median(X) = m if P(X \le m) = P(X > m) = 0.5

The median of the distribution of a discrete random variable X is defined as the smallest value m such that:

P(X \le m) \ge 0.5

Quantile: the median is an example of a quantile, the 0.5-quantile. In general, the p-quantile of a distribution is the value x that divides the distribution into two parts, one with probability p and the other with probability 1 - p.

For a continuous random variable X, the p-quantile is defined as:

Q_p(X) = x if P(X \le x) = p

For a discrete random variable X, the p-quantile is defined as the smallest x such that:

P(X \le x) \ge p

2.2 Bivariate distributions

We may imagine cases where more than one random variable is required to describe an experiment. We will now study how to deal with more than one random variable simultaneously.

Let (X, Y) be a pair of random variables. We now want to characterise their joint distribution.

Discrete case: if both X and Y are discrete random variables defined on the space S, the joint pdf is:

f_{XY}(x, y) = P(X = x, Y = y)

Again, f_{XY} is always non-negative and satisfies:

\sum_{(x, y) \in S} f_{XY}(x, y) = 1

The cdf is now:

F_{XY}(x, y) = \sum_{x_i \le x} \sum_{y_j \le y} f_{XY}(x_i, y_j)
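A sketch of the discrete joint pdf and cdf (two fair dice, an assumed example, with the uniform joint pdf f_XY(x, y) = 1/36):

```python
from fractions import Fraction
from itertools import product

# Joint pdf of two fair dice: f_XY(x, y) = 1/36 on {1,...,6} x {1,...,6}.
f_XY = {(x, y): Fraction(1, 36) for x, y in product(range(1, 7), repeat=2)}

# Total probability over the support is 1.
assert sum(f_XY.values()) == 1

# Joint cdf: F_XY(x, y) = sum of f_XY over mass points x_i <= x and y_j <= y.
def F_XY(x, y):
    return sum(p for (xi, yj), p in f_XY.items() if xi <= x and yj <= y)

assert F_XY(2, 3) == Fraction(6, 36)   # 2 values of x times 3 values of y
assert F_XY(6, 6) == 1
```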
Continuous case: if X and Y are continuous random variables, the joint cdf is:

F_{XY}(x, y) = P(X \le x, Y \le y)

This is a non-decreasing function in both arguments such that:

F_{XY}(-\infty, -\infty) = 0 and F_{XY}(+\infty, +\infty) = 1

We can now define the pdf to be:

f_{XY}(x, y) = \frac{\partial^2 F_{XY}(x, y)}{\partial x \partial y}

and thus:

P(a_x < X \le b_x, a_y < Y \le b_y) = \int_{a_x}^{b_x} \int_{a_y}^{b_y} f_{XY}(x, y) dy dx

and

P(X \le a, Y \le b) = \int_{-\infty}^{a} \int_{-\infty}^{b} f_{XY}(x, y) dy dx = F_{XY}(a, b)

Marginal distribution

Consider again the case of two random variables (X, Y). If the joint cdf is known, then the cdf of each variable can be derived. In the discrete case, this amounts to summing over all the possible values of the other variable. Let S be the support of (X, Y) (the set of possible values that (X, Y) may assume) and suppose there are M_X and M_Y different possible values that X and Y can assume, respectively. The marginal distribution of X is defined by its marginal pdf, f_X, as follows:

f_X(x) = P(X = x) = \sum_{j=1}^{M_Y} f_{XY}(x, y_j)

and similarly for Y:

f_Y(y) = P(Y = y) = \sum_{i=1}^{M_X} f_{XY}(x_i, y)
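A sketch of marginalisation in the discrete case (the joint pdf values on {0,1} x {0,1} below are assumed for illustration). It also shows that the marginals alone do not pin down the joint distribution:

```python
from fractions import Fraction

# A small joint pdf (assumed values) on {0,1} x {0,1}:
f_XY = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
        (1, 0): Fraction(3, 8), (1, 1): Fraction(1, 8)}

# Marginal pdfs: sum the joint pdf over the other variable.
def f_X(x):
    return sum(p for (xi, _), p in f_XY.items() if xi == x)

def f_Y(y):
    return sum(p for (_, yj), p in f_XY.items() if yj == y)

assert f_X(0) == f_X(1) == Fraction(1, 2)
assert f_Y(0) == f_Y(1) == Fraction(1, 2)
# Each marginal is a fair coin, yet X and Y are not independent:
assert f_XY[(0, 0)] != f_X(0) * f_Y(0)
```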
In the continuous case we need to integrate over one of the variables to obtain the cdf of the other:

F_X(x) = \int_{-\infty}^{x} \int_{-\infty}^{+\infty} f_{XY}(t, y) dy dt

F_Y(y) = \int_{-\infty}^{+\infty} \int_{-\infty}^{y} f_{XY}(x, t) dt dx

The marginal pdfs can now be obtained as the first derivatives of the marginal cdfs:

f_X(x) = \int_{-\infty}^{+\infty} f_{XY}(x, y) dy

f_Y(y) = \int_{-\infty}^{+\infty} f_{XY}(x, y) dx

Conditional distribution

We have encountered the concept of conditional probability before. We can now apply it to distribution functions. Suppose we have a pair of random variables (X, Y) and wish to determine the probability of some realisation of Y given that we have some information about X. In particular, we can derive:

P(Y \le y | X \le x) = \frac{P(Y \le y, X \le x)}{P(X \le x)} = \frac{F_{XY}(x, y)}{F_X(x)}

This is true for both discrete and continuous random variables.

For discrete random variables we can immediately write the pdf in a similar way:

f_{Y|X}(y|x) = P(Y = y | X = x) = \frac{P(Y = y, X = x)}{P(X = x)} = \frac{f_{XY}(x, y)}{f_X(x)}

and the cdf:

F_{Y|X}(y|x) = P(Y \le y | X = x) = \frac{P(Y \le y, X = x)}{P(X = x)}
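A sketch of the discrete conditional pdf f_{Y|X}(y|x) = f_XY(x, y) / f_X(x) (the joint pdf values below are assumed for illustration):

```python
from fractions import Fraction

# Assumed joint pdf on {0,1} x {0,1}:
f_XY = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
        (1, 0): Fraction(3, 8), (1, 1): Fraction(1, 8)}

def f_X(x):
    return sum(p for (xi, _), p in f_XY.items() if xi == x)

# Conditional pdf: f_{Y|X}(y|x) = f_XY(x, y) / f_X(x)
def f_Y_given_X(y, x):
    return f_XY[(x, y)] / f_X(x)

assert f_Y_given_X(1, 0) == Fraction(3, 4)   # = (3/8) / (1/2)
# Each conditional distribution sums to one over y:
assert f_Y_given_X(0, 0) + f_Y_given_X(1, 0) == 1
```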
For continuous random variables we need to take a small interval in X and write a similar relationship:

P(Y \le y | x < X \le x + \Delta) = \frac{P(Y \le y, X \le x + \Delta) - P(Y \le y, X \le x)}{P(X \le x + \Delta) - P(X \le x)}
= \frac{F_{XY}(x + \Delta, y) - F_{XY}(x, y)}{F_X(x + \Delta) - F_X(x)}
= \frac{[F_{XY}(x + \Delta, y) - F_{XY}(x, y)]/\Delta}{[F_X(x + \Delta) - F_X(x)]/\Delta}

Taking the limit as \Delta approaches 0 yields:

F_{Y|X}(y|x) = P(Y \le y | X = x) = \frac{\partial F_{XY}(x, y)/\partial x}{dF_X(x)/dx} = \frac{\partial F_{XY}(x, y)/\partial x}{f_X(x)}

and we can now take the derivative with respect to y to obtain, in both the continuous and discrete cases:

f_{Y|X}(y|x) = \frac{\partial F_{Y|X}(y|x)}{\partial y} = \frac{f_{XY}(x, y)}{f_X(x)}

Thus:

f_{XY}(x, y) = f_{Y|X}(y|x) f_X(x) = f_{X|Y}(x|y) f_Y(y)

and therefore:

f_{X|Y}(x|y) = \frac{f_{Y|X}(y|x) f_X(x)}{f_Y(y)}

Moments

Expected value: let g(X, Y) be a function of the two random variables (X, Y). Then:

E_{XY}(g(X, Y)) = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} g(x, y) f_{XY}(x, y) dy dx for continuous random variables

E_{XY}(g(X, Y)) = \sum_{i=1}^{M_X} \sum_{j=1}^{M_Y} g(x_i, y_j) f_{XY}(x_i, y_j) for discrete random variables

But then:

E_{XY}(g_1(X, Y) + g_2(X, Y)) = E_{XY}(g_1(X, Y)) + E_{XY}(g_2(X, Y))
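The limiting argument above can be checked numerically on an assumed continuous joint distribution, f_XY(x, y) = x + y on the unit square (not from the notes): the finite-difference ratio converges to the derivative ratio as Delta shrinks.

```python
# Assumed joint: f_XY(x, y) = x + y on [0,1] x [0,1].
def F_XY(x, y):
    return x**2 * y / 2 + x * y**2 / 2   # integral of (s + t) over [0,x] x [0,y]

def F_X(x):
    return F_XY(x, 1.0)                  # marginal cdf of X

# [F_XY(x + d, y) - F_XY(x, y)] / [F_X(x + d) - F_X(x)] for a small d
def cond_cdf_limit(x, y, d=1e-6):
    return (F_XY(x + d, y) - F_XY(x, y)) / (F_X(x + d) - F_X(x))

x, y = 0.3, 0.5
# Exact limit: [dF_XY/dx] / f_X(x) = (x*y + y^2/2) / (x + 1/2)
exact = (x * y + y**2 / 2) / (x + 0.5)
assert abs(cond_cdf_limit(x, y) - exact) < 1e-5
```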
It is also true that:

E_{XY}(g(X)) = E_X(g(X))

Covariance:

cov(X, Y) = E(XY) - E(X)E(Y)

Correlation:

corr(X, Y) = \frac{cov(X, Y)}{\sqrt{V(X)V(Y)}}

Independence

Two random variables (X, Y) are independent if:

F_{XY}(x, y) = F_X(x) F_Y(y)

This implies that:

f_{XY}(x, y) = f_X(x) f_Y(y) and f_{X|Y}(x|y) = f_X(x)

which means that knowing one variable does not say anything about the other.

In this case we have some results for the moments. If X and Y are independent then:

- E(XY) = E(X)E(Y);
- V(aX + bY) = a^2 V(X) + b^2 V(Y);
- E(X|Y) = E(X) and E(Y|X) = E(Y), where E(Y|X = x) = \int y f_{Y|X}(y|x) dy.

Iterated expectations

This is a very useful result. It states that:

E_X(X) = E_Y[E_{X|Y}(X|Y)]

Based on this result we can prove that, for example:

- if E(Y|X) = 0 then E(XY) = 0;
- if E(Y|X) = 0 then E(Y) = 0.
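The law of iterated expectations and the covariance formula can be verified exactly on a discrete joint pdf (assumed values; the identities hold for any joint distribution):

```python
from fractions import Fraction

# Assumed joint pdf on {0,1} x {0,1}:
f_XY = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
        (1, 0): Fraction(3, 8), (1, 1): Fraction(1, 8)}

f_Y = lambda y: sum(p for (_, yj), p in f_XY.items() if yj == y)

# E_{X|Y}(X|Y = y) = sum_x x f_{X|Y}(x|y)
def E_X_given_Y(y):
    return sum(x * p for (x, yj), p in f_XY.items() if yj == y) / f_Y(y)

# E_X(X) = E_Y[ E_{X|Y}(X|Y) ]
E_X_direct   = sum(x * p for (x, _), p in f_XY.items())
E_X_iterated = sum(E_X_given_Y(y) * f_Y(y) for y in (0, 1))
assert E_X_direct == E_X_iterated == Fraction(1, 2)

# cov(X, Y) = E(XY) - E(X)E(Y); here it is -1/8: X and Y are negatively related.
E_Y_val = sum(y * p for (_, y), p in f_XY.items())
E_XY    = sum(x * y * p for (x, y), p in f_XY.items())
assert E_XY - E_X_direct * E_Y_val == Fraction(-1, 8)
```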
2.3 Many random variables

The above results extend simply to the case where there are many random variables. In such cases they are generally arranged in vectors. Let X = [X_1 X_2 \ldots X_n]' be an n \times 1 vector of random variables. The joint distribution function is:

F_X(x) = P(X \le x) = P(X_1 \le x_1, X_2 \le x_2, \ldots, X_n \le x_n)

The joint pdf is:

f_X(x) = \frac{\partial^n F_X(x)}{\partial x_1 \partial x_2 \ldots \partial x_n}

Some of the most important moments are the following.

Expected value:

E_X(X) = [E_{X_1}(X_1), E_{X_2}(X_2), \ldots, E_{X_n}(X_n)]'

Each of the expectations inside the vector is taken using the corresponding marginal distribution, so for example:

E_{X_1}(X_1) = \int x_1 f_{X_1}(x_1) dx_1 = \int \cdots \int x_1 f_X(x_1, x_2, \ldots, x_n) dx_1 dx_2 \ldots dx_n

The variance is given by:

V_X(X) = E(XX') - E(X)E(X)'

An important example is:

V_X(a'X + b) = V_X(X'a) = a' V(X) a

This is a quadratic form. Since the variance of the scalar random variable a'X + b is always non-negative, it follows that a' V(X) a \ge 0 for every a. But then V(X) is positive semi-definite (psd).
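A sketch of the quadratic form V(a'X + b) = a'V(X)a, using an assumed 2 x 2 variance matrix and plain lists (no external libraries):

```python
# Assumed symmetric variance matrix of X = (X1, X2)':
V = [[2.0, 0.6],
     [0.6, 1.0]]

# Quadratic form a'Va = sum_i sum_j a_i V_ij a_j
def quad_form(a, V):
    n = len(a)
    return sum(a[i] * V[i][j] * a[j] for i in range(n) for j in range(n))

# V(a'X + b) = a'V(X)a is non-negative for every a, reflecting that V(X) is psd.
for a in ([1.0, 0.0], [0.0, 1.0], [1.0, -2.0], [-3.0, 1.5]):
    assert quad_form(a, V) >= 0.0
```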
2.4 Exercises

1. An exam consists of 100 multiple-choice questions. For each question there are four possible answers, only one of them being correct. If a candidate guesses answers at random, what is the probability of getting at least 30 questions correct?

2. Two fair dice are thrown. Let X be the number of points on the first die and Y be the number of points on the second die. Define Z = X + Y and W = XY. Find the expectations and variances of X, Y, Z and W. Also find E(X^2), E(Y^2), E(Z^2) and E(W^2).

3. The pdf of a random variable X is:

f(x) = \alpha x (2 - x) if 0 < x < 2, and f(x) = 0 otherwise.

Find \alpha, E(X) and V(X).

4. Let (X, Y, Z) be independent random variables such that:

E(X) = 1 and V(X) = 2;
E(Y) = 0 and V(Y) = 3;
E(Z) = 1 and V(Z) = 4.

Let T = 2X + Y - 3Z + 4 and U = (X + Z)(Y + Z). Find E(T), V(T), E(T^2) and E(U).
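For exercise 1, the number of correct guesses follows a Binomial(100, 1/4) distribution, so the answer is the upper tail of that distribution. A numerical check of one's pencil-and-paper answer (a sketch, not a full solution):

```python
from math import comb

# Number of correct guesses X ~ Binomial(n = 100, p = 1/4); we want P(X >= 30).
n, p = 100, 0.25
prob = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(30, n + 1))
print(round(prob, 4))   # roughly 0.15: guessing rarely yields 30+ correct answers
```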
Joint Distribution of Two or More Random Variables Sometimes more than one measurement in the form of random variable is taken on each member of the sample space. In cases like this there will be a few
More informationChapter 4. Multivariate Distributions. Obviously, the marginal distributions may be obtained easily from the joint distribution:
4.1 Bivariate Distributions. Chapter 4. Multivariate Distributions For a pair r.v.s (X,Y ), the Joint CDF is defined as F X,Y (x, y ) = P (X x,y y ). Obviously, the marginal distributions may be obtained
More information1 Review of Probability and Distributions
Random variables. A numerically valued function X of an outcome ω from a sample space Ω X : Ω R : ω X(ω) is called a random variable (r.v.), and usually determined by an experiment. We conventionally denote
More informationStat 5101 Notes: Algorithms
Stat 5101 Notes: Algorithms Charles J. Geyer January 22, 2016 Contents 1 Calculating an Expectation or a Probability 3 1.1 From a PMF........................... 3 1.2 From a PDF...........................
More informationChapter 3. Chapter 3 sections
sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional Distributions 3.7 Multivariate Distributions
More informationChapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University
Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real
More informationMA 575 Linear Models: Cedric E. Ginestet, Boston University Revision: Probability and Linear Algebra Week 1, Lecture 2
MA 575 Linear Models: Cedric E Ginestet, Boston University Revision: Probability and Linear Algebra Week 1, Lecture 2 1 Revision: Probability Theory 11 Random Variables A real-valued random variable is
More information1 Probability theory. 2 Random variables and probability theory.
Probability theory Here we summarize some of the probability theory we need. If this is totally unfamiliar to you, you should look at one of the sources given in the readings. In essence, for the major
More informationStat 5101 Notes: Algorithms (thru 2nd midterm)
Stat 5101 Notes: Algorithms (thru 2nd midterm) Charles J. Geyer October 18, 2012 Contents 1 Calculating an Expectation or a Probability 2 1.1 From a PMF........................... 2 1.2 From a PDF...........................
More informationData Analysis and Monte Carlo Methods
Lecturer: Allen Caldwell, Max Planck Institute for Physics & TUM Recitation Instructor: Oleksander (Alex) Volynets, MPP & TUM General Information: - Lectures will be held in English, Mondays 16-18:00 -
More informationMAS113 Introduction to Probability and Statistics. Proofs of theorems
MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a
More informationGeneral Random Variables
1/65 Chia-Ping Chen Professor Department of Computer Science and Engineering National Sun Yat-sen University Probability A general random variable is discrete, continuous, or mixed. A discrete random variable
More informationLecture 4: Proofs for Expectation, Variance, and Covariance Formula
Lecture 4: Proofs for Expectation, Variance, and Covariance Formula by Hiro Kasahara Vancouver School of Economics University of British Columbia Hiro Kasahara (UBC) Econ 325 1 / 28 Discrete Random Variables:
More informationBivariate Distributions
Bivariate Distributions EGR 260 R. Van Til Industrial & Systems Engineering Dept. Copyright 2013. Robert P. Van Til. All rights reserved. 1 What s It All About? Many random processes produce Examples.»
More informationActuarial Science Exam 1/P
Actuarial Science Exam /P Ville A. Satopää December 5, 2009 Contents Review of Algebra and Calculus 2 2 Basic Probability Concepts 3 3 Conditional Probability and Independence 4 4 Combinatorial Principles,
More informationLecture 1: August 28
36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random
More informationAnalysis of Engineering and Scientific Data. Semester
Analysis of Engineering and Scientific Data Semester 1 2019 Sabrina Streipert s.streipert@uq.edu.au Example: Draw a random number from the interval of real numbers [1, 3]. Let X represent the number. Each
More informationECE 4400:693 - Information Theory
ECE 4400:693 - Information Theory Dr. Nghi Tran Lecture 8: Differential Entropy Dr. Nghi Tran (ECE-University of Akron) ECE 4400:693 Lecture 1 / 43 Outline 1 Review: Entropy of discrete RVs 2 Differential
More informationExpectation and Variance
Expectation and Variance August 22, 2017 STAT 151 Class 3 Slide 1 Outline of Topics 1 Motivation 2 Expectation - discrete 3 Transformations 4 Variance - discrete 5 Continuous variables 6 Covariance STAT
More informationECON 5350 Class Notes Review of Probability and Distribution Theory
ECON 535 Class Notes Review of Probability and Distribution Theory 1 Random Variables Definition. Let c represent an element of the sample space C of a random eperiment, c C. A random variable is a one-to-one
More informationSTT 441 Final Exam Fall 2013
STT 441 Final Exam Fall 2013 (12:45-2:45pm, Thursday, Dec. 12, 2013) NAME: ID: 1. No textbooks or class notes are allowed in this exam. 2. Be sure to show all of your work to receive credit. Credits are
More information3-1. all x all y. [Figure 3.1]
- Chapter. Multivariate Distributions. All of the most interesting problems in statistics involve looking at more than a single measurement at a time, at relationships among measurements and comparisons
More informationECE353: Probability and Random Processes. Lecture 7 -Continuous Random Variable
ECE353: Probability and Random Processes Lecture 7 -Continuous Random Variable Xiao Fu School of Electrical Engineering and Computer Science Oregon State University E-mail: xiao.fu@oregonstate.edu Continuous
More informationWeek 12-13: Discrete Probability
Week 12-13: Discrete Probability November 21, 2018 1 Probability Space There are many problems about chances or possibilities, called probability in mathematics. When we roll two dice there are possible
More informationSummary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016
8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying
More informationPreliminary Statistics Lecture 3: Probability Models and Distributions (Outline) prelimsoas.webs.com
1 School of Oriental and African Studies September 2015 Department of Economics Preliminary Statistics Lecture 3: Probability Models and Distributions (Outline) prelimsoas.webs.com Gujarati D. Basic Econometrics,
More informationProbability Models. 4. What is the definition of the expectation of a discrete random variable?
1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions
More informationProblem 1. Problem 2. Problem 3. Problem 4
Problem Let A be the event that the fungus is present, and B the event that the staph-bacteria is present. We have P A = 4, P B = 9, P B A =. We wish to find P AB, to do this we use the multiplication
More informationSTAT 430/510 Probability Lecture 7: Random Variable and Expectation
STAT 430/510 Probability Lecture 7: Random Variable and Expectation Pengyuan (Penelope) Wang June 2, 2011 Review Properties of Probability Conditional Probability The Law of Total Probability Bayes Formula
More information