ORF 245 Fundamentals of Statistics Great Expectations
1 ORF 245 Fundamentals of Statistics: Great Expectations
Robert Vanderbei, Fall 2015. Slides last edited on November 16. http://www.princeton.edu/~rvdb
2 Definition
The expectation of a random variable is a measure of its average value.
Discrete case: E(X) = \sum_i x_i p(x_i)
Caveat: if it is an infinite sum and the x_i's are both positive and negative, then the sum can fail to converge. We restrict our attention to cases where the sum converges absolutely: \sum_i |x_i| p(x_i) < \infty. Otherwise, we say that the expectation is undefined.
Continuous case: E(X) = \int x f(x)\,dx
Corresponding caveat: if \int |x| f(x)\,dx = \infty, we say that the expectation is undefined.
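As a quick numeric illustration of the two formulas (a Python sketch, not part of the original slides): the discrete sum can be computed exactly for a fair die, and the continuous integral can be approximated on a grid for the Exponential(1) density.

```python
import math

# Discrete case: E(X) = sum_i x_i p(x_i), for a fair six-sided die.
faces = [1, 2, 3, 4, 5, 6]
E_die = sum(x * (1 / 6) for x in faces)
print(round(E_die, 6))  # 3.5

# Continuous case: E(X) = integral of x f(x) dx, approximated by a
# Riemann sum for the Exponential(1) density f(x) = exp(-x), x >= 0.
dx = 0.001
E_exp = sum(i * dx * math.exp(-i * dx) * dx for i in range(1, 20000))
print(round(E_exp, 2))  # close to the exact value 1
```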
3 Geometric Random Variable
Recall that a geometric random variable takes on positive integer values 1, 2, ..., and that p(k) = P(X = k) = q^{k-1} p, where q = 1 - p. We compute:
E(X) = \sum_{k=1}^{\infty} k p q^{k-1} = p \sum_{k=1}^{\infty} k q^{k-1} = p \sum_{k=1}^{\infty} \frac{d}{dq} q^k = p \frac{d}{dq} \sum_{k=1}^{\infty} q^k = p \frac{d}{dq} \frac{q}{1-q} = p \frac{(1-q)(1) - q(-1)}{(1-q)^2} = \frac{p}{(1-q)^2} = \frac{1}{p}
(Isn't calculus fun!)
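The series manipulation can be sanity-checked numerically; this short Python sketch (an illustration, not from the slides) truncates the defining sum and compares it with 1/p.

```python
# E(X) = sum_{k>=1} k * q^(k-1) * p for a geometric r.v.; the slide's
# calculus argument says this equals 1/p. Truncating the series at a
# large K makes the tail negligible.
p = 0.3
q = 1 - p
E_trunc = sum(k * q ** (k - 1) * p for k in range(1, 1000))
print(E_trunc)  # ~ 3.3333, matching 1/p = 1/0.3
```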
4 Poisson Random Variable
Recall that a Poisson random variable takes on nonnegative integer values 0, 1, 2, ..., and that p(k) = P(X = k) = \frac{\lambda^k}{k!} e^{-\lambda}, where \lambda is some positive real number. We compute:
E(X) = \sum_{k=0}^{\infty} k \frac{\lambda^k}{k!} e^{-\lambda} = \lambda e^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^{k-1}}{(k-1)!} = \lambda e^{-\lambda} \sum_{k=0}^{\infty} \frac{\lambda^k}{k!} = \lambda e^{-\lambda} e^{\lambda} = \lambda
We now see that \lambda is the mean.
5 Exponential Random Variable
Recall that an exponential random variable is a continuous random variable with f(x) = \lambda e^{-\lambda x}, x \ge 0, where \lambda > 0 is a fixed parameter. We compute:
E(X) = \int_0^{\infty} x \lambda e^{-\lambda x}\,dx = \frac{1}{\lambda} \int_0^{\infty} u e^{-u}\,du = \frac{1}{\lambda}
(the last integral being done using integration by parts).
6 Normal Random Variable
Recall that a normal random variable is a continuous random variable with
f(x) = \frac{1}{\sqrt{2\pi}\sigma} e^{-(x-\mu)^2/2\sigma^2}
We compute (substituting u = x - \mu):
E(X) = \int x \frac{1}{\sqrt{2\pi}\sigma} e^{-(x-\mu)^2/2\sigma^2}\,dx = \int (u + \mu) \frac{1}{\sqrt{2\pi}\sigma} e^{-u^2/2\sigma^2}\,du = \int u \frac{1}{\sqrt{2\pi}\sigma} e^{-u^2/2\sigma^2}\,du + \mu \int \frac{1}{\sqrt{2\pi}\sigma} e^{-u^2/2\sigma^2}\,du = 0 + \mu \cdot 1 = \mu
The expected value of X is the mean \mu.
7 Theorem
Let g(\cdot) be some given function.
Discrete case: E(g(X)) = \sum_j g(x_j) p(x_j)
Continuous case: E(g(X)) = \int g(x) f(x)\,dx
Derivation (discrete case): Let Y = g(X). Then
E(g(X)) = E(Y) = \sum_i y_i p_Y(y_i)
Let A_i = \{x_j : g(x_j) = y_i\}. Then p_Y(y_i) = \sum_{x_j \in A_i} p(x_j), and so
E(Y) = \sum_i y_i \sum_{x_j \in A_i} p(x_j) = \sum_i \sum_{x_j \in A_i} y_i p(x_j) = \sum_i \sum_{x_j \in A_i} g(x_j) p(x_j) = \sum_j g(x_j) p(x_j)
Note: Usually E(g(X)) \ne g(E(X)).
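The closing note, that E(g(X)) and g(E(X)) generally differ, is easy to see numerically; here is an illustrative Python sketch with g(x) = x^2 and a fair die:

```python
# LOTUS: E(g(X)) = sum_j g(x_j) p(x_j).  With g(x) = x^2 and X a fair
# die, E(g(X)) = 91/6 while g(E(X)) = 3.5^2 = 12.25: not equal.
faces = [1, 2, 3, 4, 5, 6]
E_X = sum(x / 6 for x in faces)
E_gX = sum(x ** 2 / 6 for x in faces)
print(E_gX, E_X ** 2)  # about 15.17 vs 12.25
```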
8 Theorem
Theorem: E\left(a + \sum_{i=1}^n b_i X_i\right) = a + \sum_{i=1}^n b_i E(X_i)
Proof: We give the proof for the continuous case with n = 2. Other cases are similar.
E(Y) = \int\int (a + b_1 x_1 + b_2 x_2) f(x_1, x_2)\,dx_1\,dx_2
= a \int\int f(x_1, x_2)\,dx_1\,dx_2 + b_1 \int\int x_1 f(x_1, x_2)\,dx_1\,dx_2 + b_2 \int\int x_2 f(x_1, x_2)\,dx_1\,dx_2
= a + b_1 \int x_1 \left( \int f(x_1, x_2)\,dx_2 \right) dx_1 + b_2 \int x_2 \left( \int f(x_1, x_2)\,dx_1 \right) dx_2
= a + b_1 \int x_1 f_{X_1}(x_1)\,dx_1 + b_2 \int x_2 f_{X_2}(x_2)\,dx_2
= a + b_1 E(X_1) + b_2 E(X_2)
NOTE: In this class, an integral without limits is an integral from -\infty to \infty. It's not an indefinite integral.
9 Example
Consider a binomial random variable Y representing the number of successes in n independent trials where each trial has success probability p. Its expectation is defined in terms of the probability mass function as
E(Y) = \sum_{k=0}^n k \binom{n}{k} p^k (1-p)^{n-k}
This sum is tricky to simplify. Here's an easier way. Let X_i denote the Bernoulli random variable that takes the value 1 if the i-th trial is a success and 0 otherwise. Then
Y = \sum_{i=1}^n X_i
and so
E(Y) = \sum_{i=1}^n E(X_i) = \sum_{i=1}^n p = np
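A small simulation (an illustration, not part of the slides) agrees with the linearity argument: averaging many Binomial(n, p) draws, each built as a sum of Bernoulli trials, lands near np.

```python
import random

# Build Binomial(n, p) samples as sums of n Bernoulli(p) trials and
# average them; by linearity the average should be close to n*p.
random.seed(0)
n, p = 20, 0.25
trials = 100_000
total = 0
for _ in range(trials):
    total += sum(1 for _ in range(n) if random.random() < p)
avg = total / trials
print(avg)  # close to n*p = 5.0
```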
10 Variance and Standard Deviation
Definition: The variance of a random variable X is defined as
\sigma^2 := Var(X) := E(X - E(X))^2
The standard deviation, denoted by \sigma, is simply the square root of the variance.
Theorem: If Y = a + bX, then Var(Y) = b^2 Var(X).
Proof:
E(Y - E(Y))^2 = E(a + bX - E(a + bX))^2 = E(a + bX - a - bE(X))^2 = E(bX - bE(X))^2 = b^2 E(X - E(X))^2 = b^2 Var(X)
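The scaling rule can be checked directly on a small pmf; a Python sketch (the distribution and constants are chosen arbitrarily for illustration):

```python
# Check Var(a + bX) = b^2 Var(X) on a toy distribution:
# X takes values 0, 1, 2 with probabilities 0.2, 0.5, 0.3.
xs, ps = [0, 1, 2], [0.2, 0.5, 0.3]

def variance(values, probs):
    mean = sum(v * p for v, p in zip(values, probs))
    return sum((v - mean) ** 2 * p for v, p in zip(values, probs))

a, b = 7, -3
vX = variance(xs, ps)
vY = variance([a + b * x for x in xs], ps)
print(vX, vY)  # vY equals b^2 * vX = 9 * vX
```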
11 Bernoulli Distribution
Recall that q = 1 - p and E(X) = 0 \cdot q + 1 \cdot p = p. Hence
Var(X) = E(X - E(X))^2 = (0 - p)^2 q + (1 - p)^2 p = p^2 q + q^2 p = pq(p + q) = pq
Important Note: EX^2 means E(X^2), not (E(X))^2.
12 Normal Distribution
Recall that E(X) = \mu. Hence
Var(X) = E(X - \mu)^2 = \frac{1}{\sqrt{2\pi}\sigma} \int (x - \mu)^2 e^{-(x-\mu)^2/2\sigma^2}\,dx
Make a change of variables z = (x - \mu)/\sigma to get
Var(X) = \frac{\sigma^2}{\sqrt{2\pi}} \int z^2 e^{-z^2/2}\,dz
This last integral evaluates to \sqrt{2\pi}, a fact that can be checked using integration by parts with u = z and dv = everything else. Hence
Var(X) = \sigma^2
13 An Equivalent Alternate Formula for Variance
Var(X) = E(X^2) - (E(X))^2
Let \mu denote the expected value of X: \mu = E(X). Then
Var(X) = E(X - \mu)^2 = E(X^2 - 2\mu X + \mu^2) = E(X^2) - 2\mu E(X) + \mu^2 = E(X^2) - 2\mu^2 + \mu^2 = E(X^2) - \mu^2
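Both formulas give the same number, as a quick Python check on an arbitrary pmf (illustrative values) confirms:

```python
# E(X - mu)^2 and E(X^2) - mu^2 agree, per the algebra above.
xs, ps = [1, 2, 3, 4], [0.1, 0.2, 0.3, 0.4]
mu = sum(x * p for x, p in zip(xs, ps))
var_def = sum((x - mu) ** 2 * p for x, p in zip(xs, ps))
var_alt = sum(x * x * p for x, p in zip(xs, ps)) - mu ** 2
print(var_def, var_alt)  # both equal 1.0 up to rounding
```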
14 Standard and Poor's 500 Daily Returns
Raw data: R_j, j = 1, 2, ..., n
[Figure: plot of the real S&P 500 daily return data; x-axis "Days from start", y-axis "Return".]
15 Standard and Poor's 500 Daily Returns
\mu = E(R_i) \approx \sum_j R_j / n, \quad \sigma = \sqrt{Var(R_i)} \approx \sqrt{\sum_j (R_j - \mu)^2 / n}
[Figure: empirical cumulative distribution function for the S&P 500 returns plotted against the Normal(\mu, \sigma) cdf; x-axis "x", y-axis "F(x)".]
16 Standard and Poor's 500 Value Over Time
[Figure: value of investment over time, showing the S&P 500 path together with paths simulated from the same distribution; x-axis "Days from start", y-axis "Current Value".]
17 Standard and Poor's 500 Bootstrap
[Figure: bootstrap histogram; x-axis "Value After One Year", y-axis "Frequency".]
18 Standard and Poor's 500 Matlab Code

load -ascii 'sp500.txt'
n = length(sp500);
R = sp500;
mu = sum(R)/n
sigma = std(R)

figure(1);
plot(R);
xlabel('Days from start'); ylabel('Return');
title('Real data from S&P500');

figure(3);
P = cumprod(1+R);
plot(P,'r-');
hold on;
for i=1:4
    RR = R(randi(n,[n 1]));
    PP = cumprod(1+RR);
    plot(PP,'k-');
end
xlabel('Days from start'); ylabel('Current Value');
title('Value of Investment over Time');
legend('S&P500', 'Simulated from Same Distribution');
hold off;

figure(2);
Rsort = sort(R);
x = (-400:400)/10000;
y = cdf('norm', x, mu, sigma);
plot(Rsort, (1:n)/n, 'r-');
hold on;
plot(x,y,'k-');
hold off;
xlabel('x'); ylabel('F(x)');
title('Cumulative Dist. Func. for S&P500');
legend('S&P500', 'Normal(\mu,\sigma)');

figure(4);
P = prod(1+R);
for i=1:10000
    RR = R(randi(n,[n 1]));
    PP(i) = prod(1+RR);
end
histogram(PP);
xlabel('Value After One Year'); ylabel('Frequency');
title('Bootstrap Histogram');
19 Standard and Poor's 500 The Data
The data file is called sp500.txt. It is 250 lines of plain text. Each line contains one number R_i.
[The slide shows the first 15 lines of the file.]
20 Covariance
Given two random variables X and Y, let \mu_X = E(X) and \mu_Y = E(Y). The covariance between X and Y is defined as:
Cov(X, Y) = E((X - \mu_X)(Y - \mu_Y)) = E(XY) - \mu_X \mu_Y
Proof of equality:
E((X - \mu_X)(Y - \mu_Y)) = E(XY - X\mu_Y - \mu_X Y + \mu_X \mu_Y) = E(XY) - \mu_X \mu_Y - \mu_X \mu_Y + \mu_X \mu_Y = E(XY) - \mu_X \mu_Y
Comment: If X and Y are independent, then E(XY) = E(X)E(Y) and so Cov(X, Y) = 0. The converse is not true.
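Here is an illustrative Python check of the shortcut formula on a made-up 2x2 joint pmf (the table values are hypothetical):

```python
# Cov(X, Y) = E(XY) - E(X)E(Y), computed from a joint pmf on {0,1}^2.
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}
EX = sum(x * p for (x, y), p in joint.items())
EY = sum(y * p for (x, y), p in joint.items())
EXY = sum(x * y * p for (x, y), p in joint.items())
cov = EXY - EX * EY
print(cov)  # 0.4 - 0.5 * 0.6 = 0.1, up to rounding
```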
21 Covariance/Variance of Linear Combinations
If U = a + \sum_{i=1}^n b_i X_i and V = c + \sum_{j=1}^m d_j Y_j, then
Cov(U, V) = \sum_{i=1}^n \sum_{j=1}^m b_i d_j Cov(X_i, Y_j)
If the X_i's are independent, then Cov(X_i, X_j) = 0 for i \ne j and so
Var\left(\sum_i X_i\right) = Cov\left(\sum_i X_i, \sum_j X_j\right) = \sum_i Cov(X_i, X_i) = \sum_i Var(X_i)
22 Variance of a Binomial RV
Recall our representation of a binomial random variable Y as a sum of independent Bernoullis:
Y = \sum_{i=1}^n X_i
From this we see that Var(Y) = \sum_i Var(X_i) = np(1-p).
23 Correlation Coefficient
The correlation coefficient between two random variables X and Y is denoted by \rho and defined as
\rho = \frac{Cov(X, Y)}{\sqrt{Var(X) Var(Y)}} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}
Let's talk about units. Suppose that X represents a random spatial length measured in meters (m) and that Y represents a random time interval measured in seconds (s). Then the units of Cov(X, Y) are meter-seconds, Var(X) is measured in meters-squared, and Var(Y) has units of seconds-squared. Hence, \rho is unitless: the units in the numerator cancel with the units in the denominator.
One can show that -1 \le \rho \le 1 always holds.
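The unitless claim and the bound can be illustrated with a toy joint pmf (hypothetical values), in Python:

```python
import math

# rho = Cov(X, Y) / (sigma_X * sigma_Y) for a toy joint pmf.
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

def E(f):
    return sum(f(x, y) * p for (x, y), p in joint.items())

mx, my = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - mx * my
sx = math.sqrt(E(lambda x, y: x * x) - mx ** 2)
sy = math.sqrt(E(lambda x, y: y * y) - my ** 2)
rho = cov / (sx * sy)
print(rho)  # a pure number, guaranteed to lie in [-1, 1]
```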
24 Conditional Expectation
The following formulas seem self-explanatory...
Discrete case: E(Y | X = x) = \sum_y y \, p_{Y|X}(y|x)
Continuous case: E(Y | X = x) = \int y f_{Y|X}(y|x)\,dy
Arbitrary function of Y: E(h(Y) | X = x) = \int h(y) f_{Y|X}(y|x)\,dy
25 Prediction
Let Y be a random variable. We'd like to give a single deterministic number to represent where this random variable sits on the real line. The expected value E(Y) is one choice that is quite reasonable if the distribution of Y is symmetric about this mean value. But many distributions are skewed, and in such cases the expected value might not be the best choice. The real question is: how do we quantify what we mean by best choice?
One answer to that question involves the mean squared error (MSE):
MSE(\alpha) = E(Y - \alpha)^2
To find a good estimator, pick the value of \alpha that minimizes the MSE. To find this minimizer, we differentiate and set the derivative to zero:
\frac{d}{d\alpha} MSE(\alpha) = \frac{d}{d\alpha} E(Y - \alpha)^2 = E\left( \frac{d}{d\alpha} (Y - \alpha)^2 \right) = E(2(Y - \alpha)(-1))
Hence, we pick \alpha such that 0 = E(\alpha - Y) = \alpha - E(Y), i.e., \alpha = E(Y).
Conclusion: the mean minimizes the mean squared error.
26 Least Squares
Suppose we know from some underlying fundamental principle (say physics, for example) that some parameter y is related linearly to another parameter x:
y = \alpha + \beta x
but we don't know \alpha and \beta. We'd like to do experiments to determine them. A probabilistic model of the experiment has X and Y as random variables. Let's imagine we do the experiment over and over many times and have a good sense of the joint distribution of X and Y. We want to pick \alpha and \beta so as to minimize
E(Y - \alpha - \beta X)^2
Again, we take derivatives and set them to zero. This time we have two derivatives:
\frac{\partial}{\partial \alpha} E(Y - \alpha - \beta X)^2 = E\left( \frac{\partial}{\partial \alpha} (Y - \alpha - \beta X)^2 \right) = -2 E(Y - \alpha - \beta X) = -2(\mu_Y - \alpha - \beta \mu_X) = 0
and
\frac{\partial}{\partial \beta} E(Y - \alpha - \beta X)^2 = E\left( \frac{\partial}{\partial \beta} (Y - \alpha - \beta X)^2 \right) = -2 E((Y - \alpha - \beta X)X) = -2\left( E(XY) - \alpha E(X) - \beta E(X^2) \right) = 0
27 Least Squares Continued
We get two linear equations in two unknowns:
\alpha + \beta \mu_X = \mu_Y
\alpha \mu_X + \beta E(X^2) = E(XY)
Multiplying the first equation by \mu_X and subtracting it from the second equation, we get
\beta E(X^2) - \beta \mu_X^2 = E(XY) - \mu_X \mu_Y
This equation simplifies to \beta \sigma_X^2 = \sigma_{XY}, and so
\beta = \frac{\sigma_{XY}}{\sigma_X^2} = \rho \frac{\sigma_Y}{\sigma_X}
Finally, substituting this expression into the first equation, we get
\alpha = \mu_Y - \rho \frac{\sigma_Y}{\sigma_X} \mu_X
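The moment formulas translate directly into code; this illustrative Python sketch fits \alpha and \beta using sample moments in place of the expectations (the data values are made up):

```python
# beta = sigma_XY / sigma_X^2 and alpha = mu_Y - beta * mu_X, with
# sample moments standing in for the population quantities.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.9]   # roughly y = 2x
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
sxx = sum((x - mx) ** 2 for x in xs) / n
beta = sxy / sxx
alpha = my - beta * mx
print(alpha, beta)  # beta close to 2, alpha close to 0
```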
28 Regression to the Mean
Suppose that a large statistics class has two midterms. Let X denote the score that a random student gets on the first midterm and let Y denote the same student's score on the second midterm. Based on prior use of these two exams, the instructor has figured out how to grade them so that the average and variance of the scores are the same:
\mu_X = \mu_Y = \mu, \quad \sigma_X = \sigma_Y = \sigma
But those students who do well on the first midterm tend to do well on the second midterm, which is reflected in the fact that \rho > 0. From the calculations on the previous slide, we can estimate how a student will do on the second midterm based on his/her performance on the first one. Our estimate, denoted \hat{Y}, is
\hat{Y} = \mu - \rho \mu + \rho X
We can rewrite this as
\hat{Y} - \mu = \rho(X - \mu)
In words, we expect the performance of the student on the second midterm to be closer by a factor of \rho to the average than was his/her score on the first midterm. This is a famous effect called regression to the mean.
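A simulation makes the effect visible; in this illustrative Python sketch (all parameters invented), students who scored more than one standard deviation above the mean on the first exam average noticeably closer to the mean on the second:

```python
import math
import random

# Simulate correlated (exam1, exam2) scores with common mean and sd,
# then look at exam-2 averages for students who excelled on exam 1.
random.seed(1)
mu, sigma, rho = 70.0, 10.0, 0.6
second_scores = []
for _ in range(50_000):
    z1 = random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho * rho) * random.gauss(0, 1)
    if mu + sigma * z1 > mu + sigma:          # top group on exam 1
        second_scores.append(mu + sigma * z2)

avg_second = sum(second_scores) / len(second_scores)
print(avg_second)  # above mu, but pulled back toward it from mu+sigma
```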
29 Sum of Poissons
Let X and Y be independent Poisson random variables with parameters \lambda and \mu, respectively. Let Z = X + Y. Let's compute the probability mass function:
P(Z = n) = P(X + Y = n) = \sum_{k=0}^n P(X = k, Y = n - k) = \sum_{k=0}^n P(X = k) P(Y = n - k)
= \sum_{k=0}^n \frac{\lambda^k}{k!} e^{-\lambda} \frac{\mu^{n-k}}{(n-k)!} e^{-\mu} = e^{-(\lambda+\mu)} \sum_{k=0}^n \frac{\lambda^k}{k!} \frac{\mu^{n-k}}{(n-k)!} = e^{-(\lambda+\mu)} \frac{1}{n!} \sum_{k=0}^n \binom{n}{k} \lambda^k \mu^{n-k} = e^{-(\lambda+\mu)} \frac{(\lambda + \mu)^n}{n!}
Conclusion: The sum is Poisson with parameter \lambda + \mu. The result can be extended to a sum of any number of independent Poisson random variables:
X_k \sim Poisson(\lambda_k) \implies \sum_k X_k \sim Poisson\left(\sum_k \lambda_k\right)
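The convolution identity can be verified for particular values; this Python sketch (\lambda and \mu chosen arbitrarily) compares the convolution sum against the Poisson(\lambda + \mu) pmf:

```python
import math

# P(Z = n) computed two ways: by the convolution sum on the slide and
# directly from the Poisson(lam + mu) pmf.
lam, mu = 2.0, 3.0

def pois(k, rate):
    return rate ** k / math.factorial(k) * math.exp(-rate)

n = 4
conv = sum(pois(k, lam) * pois(n - k, mu) for k in range(n + 1))
direct = pois(n, lam + mu)
print(conv, direct)  # the two numbers agree up to rounding
```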
30 Sum of Normals
Let X and Y be independent Normal(0,1) random variables and Z = X + Y. Compute Z's cdf:
P(Z \le z) = P(X + Y \le z) = \int f(x) P(Y \le z - x)\,dx = \frac{1}{2\pi} \int_{-\infty}^{\infty} \int_{-\infty}^{z-x} e^{-x^2/2} e^{-y^2/2}\,dy\,dx
Differentiating, we compute the density function for Z:
f_Z(z) = \frac{1}{2\pi} \int e^{-x^2/2} e^{-(z-x)^2/2}\,dx = \frac{1}{2\pi} \int e^{-x^2 + xz - z^2/2}\,dx = \frac{1}{2\pi} e^{-z^2/4} \int e^{-(x - z/2)^2}\,dx = \frac{1}{2\sqrt{\pi}} e^{-z^2/4}
Conclusion: The sum is Normal with mean 0 and variance 2. The result can be extended to a sum of any number of independent Normal random variables:
X_k \sim Normal(\mu_k, \sigma_k^2) \implies \sum_k X_k \sim Normal\left(\sum_k \mu_k, \sum_k \sigma_k^2\right)
More informationTopic 2: Probability & Distributions. Road Map Probability & Distributions. ECO220Y5Y: Quantitative Methods in Economics. Dr.
Topic 2: Probability & Distributions ECO220Y5Y: Quantitative Methods in Economics Dr. Nick Zammit University of Toronto Department of Economics Room KN3272 n.zammit utoronto.ca November 21, 2017 Dr. Nick
More informationSet Theory Digression
1 Introduction to Probability 1.1 Basic Rules of Probability Set Theory Digression A set is defined as any collection of objects, which are called points or elements. The biggest possible collection of
More informationBell-shaped curves, variance
November 7, 2017 Pop-in lunch on Wednesday Pop-in lunch tomorrow, November 8, at high noon. Please join our group at the Faculty Club for lunch. Means If X is a random variable with PDF equal to f (x),
More informationBivariate distributions
Bivariate distributions 3 th October 017 lecture based on Hogg Tanis Zimmerman: Probability and Statistical Inference (9th ed.) Bivariate Distributions of the Discrete Type The Correlation Coefficient
More informationECE Lecture #9 Part 2 Overview
ECE 450 - Lecture #9 Part Overview Bivariate Moments Mean or Expected Value of Z = g(x, Y) Correlation and Covariance of RV s Functions of RV s: Z = g(x, Y); finding f Z (z) Method : First find F(z), by
More informationProbability Models. 4. What is the definition of the expectation of a discrete random variable?
1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions
More informationExpectation of Random Variables
1 / 19 Expectation of Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 13, 2015 2 / 19 Expectation of Discrete
More informationSystem Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models
System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 20, 2012 Introduction Introduction The world of the model-builder
More informationTopic 3: The Expectation of a Random Variable
Topic 3: The Expectation of a Random Variable Course 003, 2016 Page 0 Expectation of a discrete random variable Definition: The expected value of a discrete random variable exists, and is defined by EX
More information2. Variance and Covariance: We will now derive some classic properties of variance and covariance. Assume real-valued random variables X and Y.
CS450 Final Review Problems Fall 08 Solutions or worked answers provided Problems -6 are based on the midterm review Identical problems are marked recap] Please consult previous recitations and textbook
More informationMA 575 Linear Models: Cedric E. Ginestet, Boston University Revision: Probability and Linear Algebra Week 1, Lecture 2
MA 575 Linear Models: Cedric E Ginestet, Boston University Revision: Probability and Linear Algebra Week 1, Lecture 2 1 Revision: Probability Theory 11 Random Variables A real-valued random variable is
More informationSTOR Lecture 16. Properties of Expectation - I
STOR 435.001 Lecture 16 Properties of Expectation - I Jan Hannig UNC Chapel Hill 1 / 22 Motivation Recall we found joint distributions to be pretty complicated objects. Need various tools from combinatorics
More informationHomework 10 (due December 2, 2009)
Homework (due December, 9) Problem. Let X and Y be independent binomial random variables with parameters (n, p) and (n, p) respectively. Prove that X + Y is a binomial random variable with parameters (n
More informationExam 3, Math Fall 2016 October 19, 2016
Exam 3, Math 500- Fall 06 October 9, 06 This is a 50-minute exam. You may use your textbook, as well as a calculator, but your work must be completely yours. The exam is made of 5 questions in 5 pages,
More informationProbability and Statistics Notes
Probability and Statistics Notes Chapter Five Jesse Crawford Department of Mathematics Tarleton State University Spring 2011 (Tarleton State University) Chapter Five Notes Spring 2011 1 / 37 Outline 1
More informationTom Salisbury
MATH 2030 3.00MW Elementary Probability Course Notes Part V: Independence of Random Variables, Law of Large Numbers, Central Limit Theorem, Poisson distribution Geometric & Exponential distributions Tom
More informationTheory of probability and mathematical statistics
Theory of probability and mathematical statistics Tomáš Mrkvička Bibliography [1] J. [2] J. Andďż l: Matematickďż statistika, SNTL/ALFA, Praha 1978 Andďż l: Statistickďż metody, Matfyzpress, Praha 1998
More information1.6 Families of Distributions
Your text 1.6. FAMILIES OF DISTRIBUTIONS 15 F(x) 0.20 1.0 0.15 0.8 0.6 Density 0.10 cdf 0.4 0.05 0.2 0.00 a b c 0.0 x Figure 1.1: N(4.5, 2) Distribution Function and Cumulative Distribution Function for
More informationIntroduction to Computational Finance and Financial Econometrics Probability Review - Part 2
You can t see this text! Introduction to Computational Finance and Financial Econometrics Probability Review - Part 2 Eric Zivot Spring 2015 Eric Zivot (Copyright 2015) Probability Review - Part 2 1 /
More informationProbability- the good parts version. I. Random variables and their distributions; continuous random variables.
Probability- the good arts version I. Random variables and their distributions; continuous random variables. A random variable (r.v) X is continuous if its distribution is given by a robability density
More informationProbability Review. Gonzalo Mateos
Probability Review Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ September 11, 2018 Introduction
More information3 Multiple Discrete Random Variables
3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f
More information