Lecture 19 - Covariance, Conditioning
THEORY OF PROBABILITY
VLADIMIR KOBZAR

Date: August 8, 2016.

Review.

Proposition 1. (Ross, 6.3.2) If $X_1, \ldots, X_n$ are independent normal RVs with respective parameters $\mu_i, \sigma_i^2$ for $i = 1, \ldots, n$, then $\sum_{i=1}^n X_i$ is $N(\mu_1 + \cdots + \mu_n,\ \sigma_1^2 + \cdots + \sigma_n^2)$. (Beware that we add the squares of the $\sigma_i$'s!)

Proof.
Step 1: Let $n = 2$, $X_1$ be $N(0, 1)$ and $X_2$ be $N(0, \sigma^2)$.
Step 2: Get the result for any two independent normals by scaling and translating.
Step 3: Induction on $n$.
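Proposition 1 is easy to sanity-check numerically. Here is a minimal Monte Carlo sketch (my own, not from Ross; the parameter values are arbitrary illustrative choices):

```python
import numpy as np

# Monte Carlo check of Proposition 1: a sum of independent N(mu_i, sigma_i^2)
# RVs should have mean sum(mu_i) and variance sum(sigma_i^2).
rng = np.random.default_rng(0)
mus = np.array([1.0, -2.0, 0.5])        # arbitrary illustrative parameters
sigmas = np.array([0.5, 1.0, 2.0])

total = sum(rng.normal(mu, sig, size=1_000_000) for mu, sig in zip(mus, sigmas))

print(total.mean(), mus.sum())           # both close to -0.5
print(total.var(), (sigmas ** 2).sum())  # both close to 5.25 (add the squares!)
```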
More on sums: variance and covariance (Ross, Secs. 7.3 and 7.4):

Proposition 2. (Ross, Prop. 7.4.1) If $X$ and $Y$ are independent, and $g$ and $h$ are any functions from reals to reals, then
$$E[g(X)h(Y)] = E[g(X)]\,E[h(Y)].$$
(Not true without independence!)

Proof (idea). Split the two-variable integral into a product of single-variable integrals.

Now recall: the variance of a RV $X$ is
$$\mathrm{Var}(X) = E[(X - \mu)^2], \quad \text{where } \mu = E[X].$$
It gives a useful measure of how spread out $X$ is. We can generalize it to two RVs $X$ and $Y$:

Definition 1. Let $X$ and $Y$ be RVs and let $\mu_X = E[X]$ and $\mu_Y = E[Y]$. The covariance of $X$ and $Y$ is
$$\mathrm{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)],$$
provided that this expectation converges (whether the RVs are discrete, continuous, or otherwise).

First properties.

1. Symmetry: $\mathrm{Cov}(X, Y) = \mathrm{Cov}(Y, X)$.

2. Applying the definition with $Y = X$, we see that Cov generalizes Var: $\mathrm{Cov}(X, X) = \mathrm{Var}(X)$.

3. Like Var, Cov has a useful alternative formula: $\mathrm{Cov}(X, Y) = E[XY] - E[X]\,E[Y]$.

4. If $X$ and $Y$ are independent, then by Proposition 2 (applied with $g(x) = x - \mu_X$ and $h(y) = y - \mu_Y$),
$$\mathrm{Cov}(X, Y) = E[X - \mu_X]\,E[Y - \mu_Y] = 0.$$

Example 1. (Ross, 7.4d) Let $A$ and $B$ be events, and let $I_A$ and $I_B$ be their indicator variables. Since $E[I_A] = P(A)$, $E[I_B] = P(B)$, and $E[I_A I_B] = P(A \cap B)$,
$$\mathrm{Cov}(I_A, I_B) = P(A \cap B) - P(A)P(B) = P(B)\,[P(A \mid B) - P(A)].$$
Therefore, Cov can be positive, zero, or negative depending on whether $P(A \mid B)$ is, respectively, greater than, equal to, or less than $P(A)$. Also, by properties 2 and 3 (and since $I_A^2 = I_A$),
$$\mathrm{Var}(I_A) = E[I_A^2] - (E[I_A])^2 = P(A) - P(A)^2 = P(A)(1 - P(A)).$$

Here is what makes covariance really useful: how it transforms under scalar multiples and sums:

Proposition 3. (Ross, Prop. 7.4.2)

1. For any RVs $X$ and $Y$, and any real value $a$, we have
$$\mathrm{Cov}(aX, Y) = \mathrm{Cov}(X, aY) = a\,\mathrm{Cov}(X, Y)$$
(consistency check: $\mathrm{Var}(aX) = \mathrm{Cov}(aX, aX) = a^2\,\mathrm{Cov}(X, X) = a^2\,\mathrm{Var}(X)$).

2. For any RVs $X_1, \ldots, X_n$ and $Y_1, \ldots, Y_m$, we have
$$\mathrm{Cov}\Big(\sum_{i=1}^n X_i,\ \sum_{j=1}^m Y_j\Big) = \sum_{i=1}^n \sum_{j=1}^m \mathrm{Cov}(X_i, Y_j)$$
(so it behaves just like multiplying out a product of sums of numbers).

In particular, if $m = n$ and $Y_i = X_i$ in part 2 above, we get (Ross, eqn. (7.4.1))
$$\mathrm{Var}\Big(\sum_{i=1}^n X_i\Big) = \sum_{i=1}^n \mathrm{Var}(X_i) + 2 \sum_{1 \le i < j \le n} \mathrm{Cov}(X_i, X_j).$$
If $X_1, \ldots, X_n$ are independent, then $\mathrm{Cov}(X_i, X_j) = 0$ whenever $i < j$, so in this case we're left with $\mathrm{Var}(\sum_{i=1}^n X_i) = \sum_{i=1}^n \mathrm{Var}(X_i)$.

Example 2. (Ross, 7.4b) If $X$ is binom$(n, p)$, then $\mathrm{Var}(X) = np(1-p)$.

For each of the independent Bernoulli RVs $X_i$ (which satisfy $X_i^2 = X_i$),
$$\mathrm{Var}(X_i) = E[X_i^2] - (E[X_i])^2 = p - p^2 = p(1-p).$$
Now recall that $X = X_1 + \cdots + X_n$, a sum of $n$ independent Bernoulli variables, so $\mathrm{Var}(X) = np(1-p)$.

Note that this is a third proof in Ross of the variance of a binomial random variable. In an earlier lecture, we reviewed the proof in Ross that uses the recursive expression for $E[X^k]$. Example 7.3a (Ross, p. 299) uses moments of a binomial random variable, which we will study tomorrow.

Example 3. (special case of Ross, 7.4f) An experiment has three possible outcomes with respective probabilities $p_1, p_2, p_3$. After $n$ independent trials are performed, we write $X_i$ for the number of times outcome $i$ occurred, $i = 1, 2, 3$. Find $\mathrm{Cov}(X_1, X_2)$.

Note that $X_1 = \sum_{k=1}^n I_1(k)$ and $X_2 = \sum_{k=1}^n I_2(k)$, where
$$I_1(k) = \begin{cases} 1 & \text{if trial } k \text{ results in outcome 1} \\ 0 & \text{otherwise,} \end{cases} \qquad I_2(k) = \begin{cases} 1 & \text{if trial } k \text{ results in outcome 2} \\ 0 & \text{otherwise,} \end{cases}$$
so that
$$\mathrm{Cov}(X_1, X_2) = \sum_{l=1}^n \sum_{k=1}^n \mathrm{Cov}(I_1(l), I_2(k)).$$
Since the trials are independent, only the diagonal terms survive. Because a single trial cannot produce both outcomes, $E[I_1(l) I_2(l)] = 0$, so
$$\mathrm{Cov}(I_1(l), I_2(l)) = E[I_1(l) I_2(l)] - E[I_1(l)]\,E[I_2(l)] = -p_1 p_2.$$
Therefore, $\mathrm{Cov}(X_1, X_2) = -n p_1 p_2$, which is negative, as expected: the more often outcome 1 occurs, the fewer trials are left for outcome 2.

Now some continuous examples...

Example 4. (Ross, Section 7.2l) (Random walk) A flea starts at the origin of the plane. Each second it jumps one inch. For each jump, it chooses the direction uniformly at random, independently of its previous jumps. Find the expected square of the distance from the origin after $n$ jumps.

Let $(X_i, Y_i)$ denote the change in position at the $i$-th step, $i = 1, \ldots, n$, where $X_i = \cos\theta_i$ and $Y_i = \sin\theta_i$ and the $\theta_i$ are independent uniform$(0, 2\pi)$ random variables. Then
$$D^2 = \Big(\sum_{i=1}^n X_i\Big)^2 + \Big(\sum_{i=1}^n Y_i\Big)^2 = \sum_{i=1}^n (X_i^2 + Y_i^2) + \sum_{i \ne j} (X_i X_j + Y_i Y_j) = n + \sum_{i \ne j} (\cos\theta_i \cos\theta_j + \sin\theta_i \sin\theta_j),$$
where we have used the fact that $\cos^2\theta_i + \sin^2\theta_i = 1$. We observe that
$$2\pi\,E[\cos\theta_i] = \int_0^{2\pi} \cos u\,du = 0, \qquad 2\pi\,E[\sin\theta_i] = \int_0^{2\pi} \sin u\,du = 0.$$
Now, taking the expectation and using the independence of $\theta_i$ and $\theta_j$ when $i \ne j$ together with the preceding result, we obtain $E[D^2] = n$.
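A short simulation (my own sketch, with an arbitrary choice of $n$) makes the answer concrete: the average of $D^2$ over many simulated walks should be close to $n$.

```python
import numpy as np

# Simulate the random walk of Example 4: n unit jumps in uniformly random
# directions; the average of D^2 over many walks should be close to n.
rng = np.random.default_rng(2)
n, trials = 20, 200_000

theta = rng.uniform(0.0, 2.0 * np.pi, size=(trials, n))
x_final = np.cos(theta).sum(axis=1)
y_final = np.sin(theta).sum(axis=1)
D2 = x_final ** 2 + y_final ** 2

print(D2.mean())   # close to n = 20
```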
Example 5. If $X_1, \ldots, X_n$ are independent normal RVs with respective parameters $\mu_i, \sigma_i^2$ for $i = 1, \ldots, n$, then
$$E\Big[\sum_{i=1}^n X_i\Big] = \sum_{i=1}^n \mu_i \quad \text{and} \quad \mathrm{Var}\Big(\sum_{i=1}^n X_i\Big) = \sum_{i=1}^n \sigma_i^2.$$
Therefore, Ross eq. (7.4.1) is consistent with our previous calculation (Ross, Prop. 6.3.2) that $\sum_{i=1}^n X_i$ is still normal with those parameters.

Gamma distributions. (Special case of Ross, Prop. 6.3.1) If $X$ and $Y$ are independent Exp$(\lambda)$ RVs, then $X + Y$ is continuous with PDF
$$f_{X+Y}(z) = \lambda^2 z e^{-\lambda z}, \quad z > 0.$$
Applying the identity (Ross, (6.3.2)), we get
$$f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(z - y) f_Y(y)\,dy = \int_0^z \lambda e^{-\lambda(z-y)}\,\lambda e^{-\lambda y}\,dy = \lambda^2 z e^{-\lambda z} \quad \text{for } z > 0,$$
where in setting the limits of integration we observed that $f_Y(y) = 0$ for $y < 0$ and $f_X(z - y) = 0$ for $y > z$.

Now assume that for $X_1, \ldots, X_{n-1}$ independent Exp$(\lambda)$ RVs, their sum has PDF
$$f_{X_1 + \cdots + X_{n-1}}(z) = \frac{\lambda^{n-1} z^{n-2} e^{-\lambda z}}{(n-2)!}, \quad z > 0.$$
Again by the identity (Ross, (6.3.2)), we get
$$f_{(X_1 + \cdots + X_{n-1}) + X_n}(z) = \int_{-\infty}^{\infty} f_{X_1 + \cdots + X_{n-1}}(y)\, f_{X_n}(z - y)\,dy = \int_0^z \frac{\lambda^{n-1} y^{n-2} e^{-\lambda y}}{(n-2)!}\,\lambda e^{-\lambda(z-y)}\,dy = \frac{\lambda^n z^{n-1} e^{-\lambda z}}{(n-1)!} \quad \text{for } z > 0,$$
where in setting the limits of integration we observed that $f_{X_1 + \cdots + X_{n-1}}(y) = 0$ for $y < 0$ and $f_{X_n}(z - y) = 0$ for $y > z$.

Any continuous RV with the PDF above is called Gamma$(n, \lambda)$; so, for instance, Gamma$(1, \lambda)$ = Exp$(\lambda)$. See Ross for more information on Gamma RVs.
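As a numerical sanity check of the Gamma$(n, \lambda)$ conclusion (my own sketch, with arbitrary $\lambda$ and $n$; note that numpy parametrizes the exponential by its mean $1/\lambda$), sums of independent exponentials can be compared against the Gamma quantiles:

```python
import numpy as np
from scipy import stats

# Check that a sum of n independent Exp(lambda) RVs behaves like Gamma(n, lambda):
# compare a few empirical quantiles of the simulated sums with the Gamma quantiles.
rng = np.random.default_rng(6)
lam, n = 1.5, 4

sums = rng.exponential(scale=1.0 / lam, size=(500_000, n)).sum(axis=1)

for q in (0.25, 0.5, 0.9):
    emp = np.quantile(sums, q)
    thy = stats.gamma.ppf(q, a=n, scale=1.0 / lam)   # Gamma(n, lambda) quantile
    print(q, round(float(emp), 3), round(float(thy), 3))
```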
Conditioning with random variables. Our next topic is the use of conditional probability in connection with random variables as well as single events. As for events, this can come up in various ways. Sometimes we know the joint distribution of RVs $X$ and $Y$, and want to compute updated probabilities for the behavior of $X$ in light of some information about $Y$. Sometimes there is a natural modeling choice for the conditional probabilities, and we must reconstruct the unconditioned probabilities from them. The tools we introduce below can give a convenient way to do a calculation even when we start and end with unconditional quantities.

Conditional distributions: discrete case (Ross, Sec. 6.4). Suppose that $X$ and $Y$ are discrete RVs with joint PMF $p$. We have various events defined in terms of $X$ and others defined in terms of $Y$. Sometimes we need conditional probabilities of $X$-events given $Y$-events and vice versa. This requires no new ideas beyond conditional probability, but some new notation can be convenient.

Definition 2. The conditional PMF of $X$ given $Y$ is the function
$$p_{X|Y}(x \mid y) = P\{X = x \mid Y = y\} = \frac{p(x, y)}{p_Y(y)}.$$
It is defined for any $x$ and any $y$ for which $p_Y(y) > 0$ (we simply don't use it for other choices of $y$).

Example 6. (Ross, 6.4a) Suppose that $X$ and $Y$ have joint PMF given by
$$p(0,0) = 0.4, \quad p(0,1) = 0.2, \quad p(1,0) = 0.1, \quad p(1,1) = 0.3.$$
Calculate the conditional PMF of $X$ given that $Y = 1$.
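For Example 6, the conditional PMF is just a renormalized slice of the joint PMF table. A small computational sketch of that calculation (the dictionary mirrors the joint PMF above):

```python
# Conditional PMF of X given Y = 1 for Example 6 (Ross, 6.4a).
p = {(0, 0): 0.4, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.3}   # joint PMF p(x, y)

y = 1
p_Y = sum(prob for (x, yy), prob in p.items() if yy == y)  # marginal p_Y(1) = 0.5

p_X_given_Y = {x: p[(x, y)] / p_Y for x in (0, 1)}
print(p_X_given_Y)   # {0: 0.4, 1: 0.6}
```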
Example 7. (Ross, 6.4b) Suppose that $X$ and $Y$ are independent Poi$(\lambda_1)$ and Poi$(\lambda_2)$ RVs. Calculate the conditional PMF of $X$ given that $X + Y = n$.

Conditioning can also involve three or more RVs. This requires care, but no new ideas.

Example 8. (Special case of Ross, 6.4c) An experiment has 3 possible outcomes with respective probabilities $p_1, p_2, p_3$. After $n$ independent trials are performed, we write $X_i$ for the number of times outcome $i$ occurred, $i = 1, 2, 3$. Find the conditional distribution (that is, the conditional joint PMF) of $(X_1, X_2)$ given that $X_3 = m$.

WHAT THE QUESTION IS ASKING FOR:
$$p_{X_1, X_2 \mid X_3}(k, l \mid m) = P\{X_1 = k,\ X_2 = l \mid X_3 = m\}$$
for all possible values of $k$, $l$ and $m$.

Another fact to note: if $X$ and $Y$ are independent, then
$$p_{X|Y}(x \mid y) = \frac{p_X(x)\,p_Y(y)}{p_Y(y)} = p_X(x).$$
So this is just like for events: if $X$ and $Y$ are independent, then knowing the value taken by $Y$ doesn't influence the probability distribution of $X$.

Conditional distributions: continuous case (Ross, Sec. 6.5). When continuous RVs are involved, we do need a new idea.

THE PROBLEM: Suppose that $Y$ is a continuous RV and $E$ is any event. Then $P\{Y = y\} = 0$ for any real value $y$, so we cannot define the conditional probability $P(E \mid \{Y = y\})$ using the usual formula. However, we can sometimes make sense of this in another way.
Instead of assuming that $Y$ takes the value $y$ exactly, let us condition on $Y$ taking a value in a tiny window around $y$:
$$P(E \mid \{y \le Y \le y + dy\}) = \frac{P(E \cap \{y \le Y \le y + dy\})}{P\{y \le Y \le y + dy\}} \approx \frac{P(E \cap \{y \le Y \le y + dy\})}{f_Y(y)\,dy},$$
where we use the infinitesimal interpretation of $f_Y(y)$. This makes sense provided $f_Y(y) > 0$. Now let $dy \to 0$. The conditional probability of $E$ given that $Y = y$ is defined to be
$$P(E \mid Y = y) = \lim_{dy \to 0} \frac{P(E \cap \{y \le Y \le y + dy\})}{f_Y(y)\,dy}.$$

Theorem 4. The above limit always exists except maybe for a negligible set of possible values of $y$, which you can safely ignore.

Negligible sets are another idea from measure theory, so we won't describe them here. Ross doesn't mention this result at all. In this course, you can always assume that the limit exists.

Example 9. (Ross, Sec. 6.5) Suppose that $X$, $Y$ have joint PDF
$$f(x, y) = \frac{e^{-x/y}\,e^{-y}}{y}, \quad 0 < x, y < \infty.$$
Find $P\{X > 1 \mid Y = y\}$.

In this example, $E$ is defined in terms of another RV, jointly continuous with $Y$. In fact, there is a useful general tool for this situation.

Definition 3. (Ross, p. 250) Suppose $X$ and $Y$ are jointly continuous with joint PDF $f$. The conditional PDF of $X$ given $Y$ is the function
$$f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)};$$
it is defined for all real values $x$ and all $y$ such that $f_Y(y) > 0$.

This is not a conditional probability, since it's a ratio of densities, not of probability values.
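Definition 3 already lets us attack Example 9 numerically. Here is a sketch (mine, not Ross's) that forms $f_{X|Y}(x \mid y) = f(x, y)/f_Y(y)$ by numerical integration and then integrates it over $x > 1$ for an arbitrary fixed value of $y$:

```python
import numpy as np
from scipy.integrate import quad

# Numerical check for Example 9: P{X > 1 | Y = y} computed from the conditional
# PDF f_{X|Y}(x|y) = f(x, y) / f_Y(y), with f(x, y) = exp(-x/y) * exp(-y) / y.
f = lambda x, y: np.exp(-x / y) * np.exp(-y) / y

y0 = 2.0                                              # arbitrary fixed value of Y
f_Y, _ = quad(lambda x: f(x, y0), 0, np.inf)          # marginal density f_Y(y0)
prob, _ = quad(lambda x: f(x, y0) / f_Y, 1, np.inf)   # P{X > 1 | Y = y0}

print(prob, np.exp(-1 / y0))   # both equal exp(-1/y0), the value obtained by hand
```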
INTUITIVE INTERPRETATION: Instead of conditioning $\{X = x\}$ on $\{Y = y\}$, let's allow a very small window around both $x$ and $y$. We find:
$$P\{x \le X \le x + dx \mid y \le Y \le y + dy\} = \frac{P\{x \le X \le x + dx,\ y \le Y \le y + dy\}}{P\{y \le Y \le y + dy\}} \approx \frac{f(x, y)\,dx\,dy}{f_Y(y)\,dy} = f_{X|Y}(x \mid y)\,dx,$$
so $f_{X|Y}(x \mid y)$ is a PDF which describes the probability distribution of $X$, given that $Y$ lands in a very small window around $y$.

Also, just as in the discrete case, if $X$ and $Y$ are independent, then $f_{X|Y}(x \mid y) = f_X(x)$, so knowing the value taken by $Y$ (to arbitrary accuracy) doesn't influence the probability distribution of $X$.

The conditional PDF enables us to compute probabilities of $X$-events given that $\{Y = y\}$, without taking a limit:

Proposition 5. (See Ross, p. 251) Let $a < b$, or $a = -\infty$ or $b = \infty$. Let $y$ be a real value such that $f_Y(y) > 0$. Then
$$P\{a \le X \le b \mid Y = y\} = \int_a^b f_{X|Y}(x \mid y)\,dx.$$

MORAL: $f_{X|Y}(x \mid y)$ is a new PDF. It describes the probabilities of $X$-events given that $Y = y$. It has all the other properties that we've already seen for PDFs. Finding a conditional PDF is just like finding a conditional PMF.

Example 10. (Ross, 6.5a) The joint PDF of $X$ and $Y$ is
$$f(x, y) = \tfrac{12}{5}\,x\,(2 - x - y), \quad 0 < x, y < 1.$$
Find $f_{X|Y}(x \mid y)$ for all $x$ and for $0 < y < 1$.

PROCEDURE: Find the marginal $f_Y$ by integrating, then plug into the formula for $f_{X|Y}$.
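The procedure for Example 10 can also be carried out symbolically; here is a sketch with sympy (the printed closed form is what the by-hand computation should reproduce):

```python
import sympy as sp

# Example 10 (Ross, 6.5a): f(x, y) = (12/5) x (2 - x - y) on 0 < x, y < 1.
x, y = sp.symbols('x y', positive=True)
f = sp.Rational(12, 5) * x * (2 - x - y)

f_Y = sp.integrate(f, (x, 0, 1))      # marginal of Y, valid for 0 < y < 1
f_X_given_Y = sp.simplify(f / f_Y)    # conditional PDF of X given Y = y

print(f_Y)            # 8/5 - 6*y/5
print(f_X_given_Y)    # equivalent to 6*x*(2 - x - y)/(4 - 3*y)
```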
In some situations it is more natural to assume a conditional PDF and then reconstruct a joint PDF. This is the same idea as the multiplication rule for conditional probabilities of events.

Example 11. A stick of unit length is broken at a uniformly random point. Then the left-hand piece is broken again at a uniformly random point. Let $Y$ be the length of the left-most piece after the second break. Find $f_Y$.

Let $X$ be the length of the left-hand piece after the first break.

Step 1: Obtain $f(x, y) = f_X(x)\,f_{Y|X}(y \mid x)$ (multiplication rule!):
$$f(x, y) = \frac{1}{x} \quad \text{if } 0 < y < x < 1.$$

Step 2: Integrate out $x$ to get $f_Y(y)$:
$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx = \int_y^1 \frac{1}{x}\,dx = -\log y \quad \text{for } 0 < y < 1.$$
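A quick simulation of the broken stick (my own sketch) confirms the answer: the histogram of $Y$ tracks the density $-\log y$ on $(0, 1)$.

```python
import numpy as np

# Example 11: break a unit stick at a uniform point X, then break the
# left-hand piece at a uniform point in (0, X); Y is the left-most piece.
# The histogram of Y should track the density -log(y) on (0, 1).
rng = np.random.default_rng(4)
X = rng.uniform(0.0, 1.0, size=1_000_000)
Y = rng.uniform(0.0, X)              # second break: uniform on (0, X), elementwise

hist, edges = np.histogram(Y, bins=20, range=(0.0, 1.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
for c, h in zip(centers[2::4], hist[2::4]):
    print(round(c, 3), round(h, 3), round(-np.log(c), 3))   # empirical vs -log(y)
```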
Definition 4. (See Ross, Example 6.5d, for the general bivariate normal, which has several other parameters.) The random vector $(X, Y)$ is a bivariate standard normal with correlation $-1 < \rho < 1$ if it is jointly continuous with joint PDF
$$f(x, y) = \frac{1}{2\pi\sqrt{1 - \rho^2}} \exp\Big(-\frac{1}{2(1 - \rho^2)}\,\big(x^2 - 2\rho x y + y^2\big)\Big)$$
for $-\infty < x, y < \infty$.

In Lecture 18, we showed that the marginal distributions of $X$ and $Y$ are standard normals. Therefore, conditioned on $\{Y = y\}$,
$$f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)} = \frac{\dfrac{1}{2\pi\sqrt{1 - \rho^2}} \exp\Big(-\dfrac{1}{2(1 - \rho^2)}\big(x^2 - 2\rho x y + y^2\big)\Big)}{\dfrac{1}{\sqrt{2\pi}}\,e^{-y^2/2}} = \frac{1}{\sqrt{2\pi}\,\sqrt{1 - \rho^2}} \exp\Big(-\frac{(x - \rho y)^2}{2(1 - \rho^2)}\Big),$$
which is the PDF of $N(\rho y,\ 1 - \rho^2)$.
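The $N(\rho y, 1 - \rho^2)$ conclusion is easy to check by simulation (a sketch with arbitrary $\rho$ and $y$): among samples whose $Y$-coordinate lands in a narrow window around $y$, the $X$-coordinates should have mean close to $\rho y$ and variance close to $1 - \rho^2$.

```python
import numpy as np

# Check that, for a standard bivariate normal with correlation rho, X conditioned
# on Y near y0 has mean approximately rho*y0 and variance approximately 1 - rho^2.
rng = np.random.default_rng(5)
rho, y0, eps = 0.7, 1.2, 0.02

cov = [[1.0, rho], [rho, 1.0]]
XY = rng.multivariate_normal([0.0, 0.0], cov, size=3_000_000)
X, Y = XY[:, 0], XY[:, 1]

sel = X[np.abs(Y - y0) < eps]     # condition on Y in a small window around y0
print(sel.mean(), rho * y0)       # both close to 0.84
print(sel.var(), 1 - rho ** 2)    # both close to 0.51
```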
Mixtures of discrete and continuous. Some situations are best modeled using both continuous RVs and discrete RVs.

Example: Suppose electronic components can be more or less durable, and this is measured by a parameter between 0 and 1. If we buy and install a component, its parameter is a continuous RV $X$ lying in that range. Then it survives each use independently with probability $X$. The number of uses before failure is a discrete RV $N$, but it depends on the continuous RV $X$. This is sometimes called a hybrid model, or said to be of mixed type. Conditional probability plays an especially important role here.

Suppose that $X$ is a RV and $E$ is any event with $P(E) > 0$. Consider any $X$-event of the form $\{a \le X \le b\}$. By definition, its conditional probability given $E$ is
$$P(a \le X \le b \mid E) = \frac{P(\{a \le X \le b\} \cap E)}{P(E)}.$$

KEY FACT:

Proposition 6. (cf. Ross, p. 255) If $X$ is continuous and $P(E) > 0$, then there is a conditional PDF $f_{X|E}$ such that
$$P(a \le X \le b \mid E) = \int_a^b f_{X|E}(x)\,dx$$
whenever $a < b$. That is, $X$ continuous $\Rightarrow$ $X$ still continuous after conditioning.

Proof. INTUITION: $P(x \le X \le x + dx \mid E) \approx f_{X|E}(x)\,dx$.

ROUGH IDEA OF THE PROOF: If $X$ is continuous and $P(E) > 0$, then we can define
$$f_{X|E}(x) = \lim_{dx \to 0} \frac{P(\{x \le X \le x + dx\} \mid E)}{dx}.$$
That is, the limit exists and it satisfies the equation for $f_{X|E}$. Ross doesn't mention this point and we will not attempt to make it rigorous.

Example 12. Let $X$ be Exp$(\lambda)$, and let $E = \{X > t\}$. Since $P(E) = e^{-\lambda t}$,
$$f_{X|E}(s) = \lim_{ds \to 0} \frac{P\{s \le X \le s + ds\}}{e^{-\lambda t}\,ds} = \frac{\lambda e^{-\lambda s}}{e^{-\lambda t}} = \lambda e^{-\lambda(s - t)} \quad \text{if } s > t,$$
and $f_{X|E}(s) = 0$ for $s \le t$. (This is the memoryless property: given $X > t$, the remaining lifetime $X - t$ is again Exp$(\lambda)$.)
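Example 12 can be checked by simulation as well (my own sketch, with arbitrary $\lambda$ and $t$): conditioned on $\{X > t\}$, the excess $X - t$ should again look like an Exp$(\lambda)$ variable.

```python
import numpy as np

# Example 12: for X ~ Exp(lambda) conditioned on E = {X > t}, the conditional
# PDF is lambda * exp(-lambda * (s - t)) for s > t, i.e. X - t given X > t is
# again Exp(lambda). Compare the mean of the excess with 1/lambda.
rng = np.random.default_rng(7)
lam, t = 0.8, 2.0

X = rng.exponential(scale=1.0 / lam, size=2_000_000)
excess = X[X > t] - t              # residual lifetime given survival past t

print(excess.mean(), 1.0 / lam)    # both close to 1.25
```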
References

[1] Austin, Theory of Probability lecture notes, tim/tofp
[2] Bernstein, Theory of Probability lecture notes, brettb/probsum205/index.html
[3] Ross, A First Course in Probability (9th ed., 2014)