Distributions of linear combinations
1 Distributions of linear combinations CE 311S
2 MORE THAN TWO RANDOM VARIABLES
3 The same concepts used for two random variables can be applied to three or more random variables, but they are harder to visualize (triple integrals, triple sums, and so on). One common thing to do with multiple random variables is to calculate linear combinations of them.
4 DISTRIBUTIONS OF LINEAR COMBINATIONS
5 A linear combination of the random variables X_1, ..., X_n has the form a_1 X_1 + a_2 X_2 + ... + a_n X_n. That is, we multiply each random variable by a constant coefficient, and add them up. Examples: X_1 + X_2; X_1 − X_2; 5X_1 − X_2 + 3X_3. Distributions of Linear Combinations
6 To calculate the total of n random variables, we have a linear combination with a_1 = a_2 = ... = a_n = 1. To calculate the difference between 2 random variables, we have a linear combination with a_1 = 1 and a_2 = −1. Suppose I want to calculate the toll revenue on SH-130 today. If X_1 is the number of cars and X_2 the number of semi trucks, the revenue is a_1 X_1 + a_2 X_2, where a_1 and a_2 are the tolls charged to each car and truck.
7 We have the following formulas. For any random variables X_1, ..., X_n:

E[a_1 X_1 + a_2 X_2 + ... + a_n X_n] = a_1 E[X_1] + a_2 E[X_2] + ... + a_n E[X_n]

V[a_1 X_1 + a_2 X_2 + ... + a_n X_n] = Σ_{i=1}^n Σ_{j=1}^n a_i a_j Cov(X_i, X_j)

If X_1, ..., X_n are independent, the formula for variance simplifies to

V[a_1 X_1 + a_2 X_2 + ... + a_n X_n] = a_1² V[X_1] + a_2² V[X_2] + ... + a_n² V[X_n]
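These formulas can be checked exactly for a small case. The sketch below (a Python illustration, not from the slides) enumerates all 36 equally likely outcomes of two independent fair dice and compares the mean and variance of 2X_1 + 3X_2, computed by brute force, with the formula values; the coefficients 2 and 3 are arbitrary choices.

```python
from itertools import product

# Two independent fair dice X1, X2; check E and V of a1*X1 + a2*X2
# against the formulas above by enumerating all 36 outcomes.
a1, a2 = 2, 3
outcomes = list(product(range(1, 7), repeat=2))

mean = sum(a1 * x1 + a2 * x2 for x1, x2 in outcomes) / 36
var = sum((a1 * x1 + a2 * x2 - mean) ** 2 for x1, x2 in outcomes) / 36

# Single-die moments: E[X] = 3.5, V[X] = 35/12
EX, VX = 3.5, 35 / 12
print(mean, a1 * EX + a2 * EX)            # both 17.5
print(var, a1**2 * VX + a2**2 * VX)       # both 455/12, about 37.92
```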
8 Example: The toll on SH-130 is $1.50 for cars and $4.50 for trucks. The mean and variance of the number of cars are 15,000 and 250,000; the mean and variance of the number of trucks are 5,000 and 10,000. The numbers of cars and trucks are correlated, with covariance 50,000. What are the mean and standard deviation of the daily toll revenue?

The revenue is 1.50 X_1 + 4.50 X_2, where X_1 and X_2 are the numbers of cars and trucks.

E[1.50 X_1 + 4.50 X_2] = 1.50 E[X_1] + 4.50 E[X_2] = 45,000 dollars

V[1.50 X_1 + 4.50 X_2] = (1.50)² Cov(X_1, X_1) + (1.50)(4.50) Cov(X_1, X_2) + (4.50)(1.50) Cov(X_2, X_1) + (4.50)² Cov(X_2, X_2)
= 2.25 V[X_1] + 13.50 Cov(X_1, X_2) + 20.25 V[X_2] = 1,440,000

so the standard deviation is √1,440,000 = 1,200 dollars.
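The arithmetic in this example is easy to mistype by hand; a short Python check of the same computation:

```python
import math

# Toll example: a1 = $1.50 per car, a2 = $4.50 per truck.
a1, a2 = 1.50, 4.50
var_cars, var_trucks, cov = 250_000, 10_000, 50_000
mean_cars, mean_trucks = 15_000, 5_000

mean_revenue = a1 * mean_cars + a2 * mean_trucks
var_revenue = a1**2 * var_cars + 2 * a1 * a2 * cov + a2**2 * var_trucks
sd_revenue = math.sqrt(var_revenue)

print(mean_revenue)   # 45000.0
print(var_revenue)    # 1440000.0
print(sd_revenue)     # 1200.0
```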
9 Example: I run a business where my daily revenue has a mean of 1500 and a standard deviation of 400, while my daily costs have a mean of 1000 and a standard deviation of 300. What are the mean and standard deviation of my daily profit, assuming my daily revenue and costs are independent?

Π = R − X, so E[Π] = E[R] − E[X] = 500. V[R − X] = V[R] + V[X] = 160,000 + 90,000 = 250,000, so σ_{R−X} = 500.

This is one reason why we use variance even though standard deviation is easier to interpret: variances add nicely, standard deviations do not.
10 Example: I run a business where my daily revenue has a mean of 1500 and a standard deviation of 400, while my daily costs have a mean of 1000 and a standard deviation of 300. What are the mean and standard deviation of my daily profit, assuming my daily revenue and costs have a correlation coefficient of +0.5?

Π = R − X, so E[Π] = E[R] − E[X] = 500. V[R − X] = V[R] + V[X] − 2 Cov(R, X) = 160,000 + 90,000 − 2(0.5)(300)(400) = 130,000, so σ_{R−X} ≈ 360.

If revenue and costs were negatively correlated, would my daily profit have a higher or lower standard deviation?
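A simulation sketch of this example (not from the slides): the slide specifies only means, standard deviations, and the correlation, so the joint-normal model below is an added assumption used purely to generate correlated draws.

```python
import math
import random

# Simulate correlated revenue R and cost C with correlation rho = 0.5,
# assuming joint normality (an assumption; the slide does not specify
# the distributions). Correlation is induced by sharing the z1 draw.
random.seed(0)
rho = 0.5
profits = []
for _ in range(200_000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    r = 1500 + 400 * z1
    c = 1000 + 300 * (rho * z1 + math.sqrt(1 - rho**2) * z2)
    profits.append(r - c)

mean = sum(profits) / len(profits)
sd = math.sqrt(sum((p - mean) ** 2 for p in profits) / len(profits))
print(round(mean), round(sd))  # close to 500 and 360
```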
11 Let's try to derive these formulas with n = 2 to keep the notation manageable:

E[a_1 X_1 + a_2 X_2] = Σ_{x_1} Σ_{x_2} (a_1 x_1 + a_2 x_2) p(x_1, x_2)
= a_1 Σ_{x_1} Σ_{x_2} x_1 p(x_1, x_2) + a_2 Σ_{x_1} Σ_{x_2} x_2 p(x_1, x_2)
= a_1 Σ_{x_1} x_1 Σ_{x_2} p(x_1, x_2) + a_2 Σ_{x_2} x_2 Σ_{x_1} p(x_1, x_2)
= a_1 Σ_{x_1} x_1 p_{X_1}(x_1) + a_2 Σ_{x_2} x_2 p_{X_2}(x_2)
= a_1 E[X_1] + a_2 E[X_2]

(the inner sums are just the marginal PMFs). Notice that we did not have to assume independence to derive this formula.
12 What about the variance?

V[a_1 X_1 + a_2 X_2] = E[(a_1 X_1 + a_2 X_2 − μ_{a_1 X_1 + a_2 X_2})²]
= E[(a_1 (X_1 − μ_1) + a_2 (X_2 − μ_2))²]
= E[a_1² (X_1 − μ_1)² + a_1 a_2 (X_1 − μ_1)(X_2 − μ_2) + a_2 a_1 (X_2 − μ_2)(X_1 − μ_1) + a_2² (X_2 − μ_2)²]
= a_1 a_1 E[(X_1 − μ_1)(X_1 − μ_1)] + a_1 a_2 E[(X_1 − μ_1)(X_2 − μ_2)] + a_2 a_1 E[(X_2 − μ_2)(X_1 − μ_1)] + a_2 a_2 E[(X_2 − μ_2)(X_2 − μ_2)]
= Σ_{i=1}^2 Σ_{j=1}^2 a_i a_j Cov(X_i, X_j)
13 If X_1 and X_2 are independent, their covariance is zero, so the formula simplifies to a_1² Cov(X_1, X_1) + a_2² Cov(X_2, X_2), or simply a_1² V[X_1] + a_2² V[X_2].
14 WHAT ARE STATISTICS?
15 Remember the measures of location and variability from Chapter 1? What was the purpose of these?
16 We wanted to use a single number to describe the data set in some way. (This is the definition of a statistic.) In mathematical terms: consider a sample of n elements, and let X_i describe the variable of the i-th member of the sample. A statistic is a random variable Y which is determined from the random variables X_1, ..., X_n. Examples: sample mean Y = (Σ_{i=1}^n X_i)/n; maximum value Y = max_{i=1,...,n} X_i; total Y = Σ_{i=1}^n X_i.
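The three example statistics are one-liners in code. The sample values below are a hypothetical illustration (five die rolls), not data from the slides:

```python
# Three example statistics computed on one hypothetical sample.
sample = [6, 6, 3, 1, 1]  # e.g. five die rolls

sample_mean = sum(sample) / len(sample)
maximum = max(sample)
total = sum(sample)
print(sample_mean, maximum, total)  # 3.4 6 17
```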
17 The important thing to notice is that statistics are random variables themselves. Let's say I roll a die five times and take the average of the values. Each sample of five rolls (for instance 6, 6, 3, 1, 1) could have a different mean, so the sample means form a random variable (taking values such as 3.4, 2.8, 4.0, 3.4, 3.6, and so on). Can we say anything meaningful about its distribution?
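Repeating the experiment above many times makes the point concrete. A simulation sketch (seed and sample count are arbitrary choices):

```python
import random

# Many samples of five die rolls, each reduced to its sample mean.
# The resulting list "means" is itself draws of a random variable.
random.seed(1)
means = [sum(random.randint(1, 6) for _ in range(5)) / 5
         for _ in range(10_000)]

print(means[:5])                # a few realized sample means
print(sum(means) / len(means))  # close to the die's mean of 3.5
```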
18 Yes, we can. In fact, we will shortly see that if n is large, the sample mean has approximately a normal distribution no matter what the distribution of the X_i is. Furthermore, its mean is simply the mean of the individual random variables, and its variance is the variance of the individual random variables, divided by n.
19 To begin, we need to make some assumptions about the X_i. The random variables X_1, ..., X_n are a random sample if they are independent and identically distributed. (This is often abbreviated as the iid property.)
20 Assume that X_1, ..., X_n are a random sample, and let X̄ represent the sample mean: X̄ = (Σ_{i=1}^n X_i)/n. What is E[X̄]? In the dice example above, this is asking for the average of the average of five dice rolls. This is conceptually different from asking for the average of each individual roll, although we might suspect the answers should be the same.
21 Let's try it with n = 2 to keep the numbers manageable:

E[X̄] = E[(Σ_{i=1}^n X_i)/n] = E[(X_1 + X_2)/2]
= Σ_{x_1} Σ_{x_2} ((x_1 + x_2)/2) p(x_1, x_2)
= Σ_{x_1} Σ_{x_2} ((x_1 + x_2)/2) p_{X_1}(x_1) p_{X_2}(x_2)   (by independence)
= Σ_{x_1} Σ_{x_2} (x_1/2) p_{X_1}(x_1) p_{X_2}(x_2) + Σ_{x_1} Σ_{x_2} (x_2/2) p_{X_1}(x_1) p_{X_2}(x_2)
22 = Σ_{x_1} (x_1/2) p_{X_1}(x_1) Σ_{x_2} p_{X_2}(x_2) + Σ_{x_2} (x_2/2) p_{X_2}(x_2) Σ_{x_1} p_{X_1}(x_1)   (rearranging the sums; why is this allowed?)
= Σ_{x_1} (x_1/2) p_{X_1}(x_1) + Σ_{x_2} (x_2/2) p_{X_2}(x_2)
= E[X_1]/2 + E[X_2]/2
23 However, since X_1 and X_2 are identically distributed, they have the same expected value (3.5 in the dice example), which we can write as E[X]:

E[X_1]/2 + E[X_2]/2 = E[X]/2 + E[X]/2 = E[X] = 3.5

So all this just shows the intuitive result E[X̄] = E[X]: the expected value of the sample mean is the expected value of each observation.
24 What about the variance? V[X̄] = E[(X̄ − E[X])²], because E[X̄] = E[X]. Writing μ = E[X] to save space, we have

V[X̄] = E[((X_1 + X_2)/2 − μ)²] = (1/4) E[((X_1 + X_2) − 2μ)²]
= (1/4) E[{(X_1 − μ) + (X_2 − μ)}²]
= (1/4) E[(X_1 − μ)² + (X_2 − μ)² + 2(X_1 − μ)(X_2 − μ)]
= (1/4) (E[(X_1 − μ)²] + E[(X_2 − μ)²] + 2 E[(X_1 − μ)(X_2 − μ)])
= (1/4)(V[X] + V[X] + 0) = V[X]/2

since the cross term is the covariance, which is zero by independence.
25 The same results hold for larger n: if X_1, ..., X_n are a random sample (with each X_i having mean μ and variance σ²), then E[X̄] = μ and V[X̄] = σ²/n.
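For n = 2 and a fair die, these two results can be verified exactly by enumeration rather than simulation. A Python sketch (not from the slides):

```python
from itertools import product

# Exact check of E[Xbar] = mu and V[Xbar] = sigma^2 / n for the mean
# of n = 2 fair dice, enumerating all 36 equally likely outcomes.
means = [(x1 + x2) / 2 for x1, x2 in product(range(1, 7), repeat=2)]

E_xbar = sum(means) / 36
V_xbar = sum((m - E_xbar) ** 2 for m in means) / 36

mu, sigma2 = 3.5, 35 / 12        # single-die mean and variance
print(E_xbar, mu)                # both 3.5
print(V_xbar, sigma2 / 2)        # both 35/24, about 1.458
```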
26 The central limit theorem goes one step further and specifies what type of distribution the sample mean has: let X_1, ..., X_n be a random sample. Then if n is sufficiently large, X̄ has approximately a normal distribution, with mean and variance given on the previous slide. This is true no matter what distribution the X_i are drawn from. As a practical rule of thumb, if n > 30 it is safe to use the central limit theorem.
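A simulation sketch of the theorem's claim about the mean and spread of X̄, using a deliberately skewed source distribution (exponential with mean 1; the choice of distribution, seed, and counts are illustrative assumptions):

```python
import math
import random

# Sample means of n = 30 draws from a skewed exponential distribution
# with mean 1 and SD 1. The CLT says these means are approximately
# normal with mean mu = 1 and standard deviation sigma/sqrt(n).
random.seed(2)
n, reps = 30, 20_000
xbars = [sum(random.expovariate(1.0) for _ in range(n)) / n
         for _ in range(reps)]

m = sum(xbars) / reps
s = math.sqrt(sum((x - m) ** 2 for x in xbars) / reps)
print(m, s)  # close to 1 and 1/sqrt(30), about 0.183
```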
27 Example I flip a coin 49 times, and calculate the proportion of flips which were heads. What is the probability that this proportion is between 0.49 and 0.51?
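The slide leaves this as an exercise; a worked sketch of the CLT approach follows (an approximation that ignores the discreteness of the head count). Each flip has mean p = 0.5 and variance p(1 − p) = 0.25, so the sample proportion is approximately normal with mean 0.5 and standard deviation √(0.25/49) = 1/14:

```python
import math

def Phi(z):
    """Standard normal CDF, written via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

sd = math.sqrt(0.25 / 49)     # 1/14, the SD of the sample proportion
z = (0.51 - 0.50) / sd        # 0.14 standard deviations
prob = Phi(z) - Phi(-z)       # P(0.49 < proportion < 0.51)
print(round(prob, 3))         # about 0.111
```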
28 Visualizing the central limit theorem This is the PMF for a random variable:
29 Visualizing the central limit theorem This is the PMF of the average of two independent draws of the same random variable:
30 Visualizing the central limit theorem This is the PMF of the average of three independent draws of the same random variable:
31 Visualizing the central limit theorem This is the PMF of the average of thirty independent draws of the same random variable:
32 Visualizing the central limit theorem This is the PDF for a random variable:
33 Visualizing the central limit theorem This is the PDF of the average of two independent draws of the same random variable:
34 Visualizing the central limit theorem This is the PDF of the average of three independent draws of the same random variable:
35 Visualizing the central limit theorem This is the PDF of the average of thirty independent draws of the same random variable:
What does independence look like? Independence S AB A Independence Definition 1: P (AB) =P (A)P (B) AB S = A S B S B Independence Definition 2: P (A B) =P (A) AB B = A S Independence? S A Independence
More informationTerminology. Experiment = Prior = Posterior =
Review: probability RVs, events, sample space! Measures, distributions disjoint union property (law of total probability book calls this sum rule ) Sample v. population Law of large numbers Marginals,
More informationChapter 14. From Randomness to Probability. Copyright 2012, 2008, 2005 Pearson Education, Inc.
Chapter 14 From Randomness to Probability Copyright 2012, 2008, 2005 Pearson Education, Inc. Dealing with Random Phenomena A random phenomenon is a situation in which we know what outcomes could happen,
More informationLecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019
Lecture 10: Probability distributions DANIEL WELLER TUESDAY, FEBRUARY 19, 2019 Agenda What is probability? (again) Describing probabilities (distributions) Understanding probabilities (expectation) Partial
More informationDiscrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 16. Random Variables: Distribution and Expectation
CS 70 Discrete Mathematics and Probability Theory Spring 206 Rao and Walrand Note 6 Random Variables: Distribution and Expectation Example: Coin Flips Recall our setup of a probabilistic experiment as
More informationCommon ontinuous random variables
Common ontinuous random variables CE 311S Earlier, we saw a number of distribution families Binomial Negative binomial Hypergeometric Poisson These were useful because they represented common situations:
More informationProbability and Independence Terri Bittner, Ph.D.
Probability and Independence Terri Bittner, Ph.D. The concept of independence is often confusing for students. This brief paper will cover the basics, and will explain the difference between independent
More informationMATH/STAT 3360, Probability
MATH/STAT 3360, Probability Sample Final Examination This Sample examination has more questions than the actual final, in order to cover a wider range of questions. Estimated times are provided after each
More information2. Probability. Chris Piech and Mehran Sahami. Oct 2017
2. Probability Chris Piech and Mehran Sahami Oct 2017 1 Introduction It is that time in the quarter (it is still week one) when we get to talk about probability. Again we are going to build up from first
More information1 Basic continuous random variable problems
Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and
More informationCentral Limit Theorem and the Law of Large Numbers Class 6, Jeremy Orloff and Jonathan Bloom
Central Limit Theorem and the Law of Large Numbers Class 6, 8.5 Jeremy Orloff and Jonathan Bloom Learning Goals. Understand the statement of the law of large numbers. 2. Understand the statement of the
More information