- Abigail Juliet Newton
Week 5: Distributions of Functions of Random Variables

1. Introduction

Suppose X_1, X_2, ..., X_n are n random variables. In this chapter, we develop techniques that may be used to find the distribution of functions of these random variables, say Y = u(X_1, ..., X_n). The techniques we consider are:

1. The Cumulative Distribution Function (CDF) Technique
2. The Jacobian Transformation Technique
3. The Moment Generating Function (MGF) Technique

This week we also discuss:

- Distributions of Order Statistics
- Special Sampling Distributions

2. The CDF Technique

Let X be a continuous random variable with cumulative distribution function F_X(·) and density function f_X(·). Now suppose that Y = g(X) is a function of X, where g is differentiable and strictly increasing. Thus its inverse g^{-1} uniquely exists. The CDF of Y can be derived using

    F_Y(y) = Prob(Y \le y) = Prob(X \le g^{-1}(y)) = F_X(g^{-1}(y)),

and its density is given by

    f_Y(y) = \frac{d}{dy} F_Y(y) = \frac{d}{dy} F_X(g^{-1}(y)) = f_X(g^{-1}(y)) \frac{d}{dy} g^{-1}(y).

If g were strictly decreasing, then we would have

    f_Y(y) = -f_X(g^{-1}(y)) \frac{d}{dy} g^{-1}(y).

In summary, if g is a strictly monotonic function, then

    f_Y(y) = f_X(g^{-1}(y)) \left| \frac{d}{dy} g^{-1}(y) \right|.
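As a numerical sanity check of this formula (my addition, not part of the notes; a sketch using only the Python standard library), take X ~ N(0, 1) and the strictly increasing map g(x) = e^x, so that g^{-1}(y) = ln y and the CDF technique gives F_Y(y) = F_X(ln y). A Monte Carlo estimate of F_Y should agree with F_X evaluated at ln y:

```python
import math
import random

def norm_cdf(z):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

random.seed(0)
n = 200_000

# Y = g(X) = e^X with X ~ N(0, 1); g is strictly increasing with
# g^{-1}(y) = ln(y), so the CDF technique gives F_Y(y) = F_X(ln(y)).
samples = [math.exp(random.gauss(0.0, 1.0)) for _ in range(n)]

y0 = 2.0
empirical = sum(s <= y0 for s in samples) / n   # Monte Carlo estimate of F_Y(y0)
theoretical = norm_cdf(math.log(y0))            # F_X(g^{-1}(y0))
print(empirical, theoretical)
```

The two printed values should agree to about two decimal places; Y here is the standard lognormal distribution.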
3. Example - CDF Technique

Let X be a random variable with p.d.f.

    f(x) = \frac{e^{-x}}{(1 + e^{-x})^2}   for -\infty < x < \infty.

We wish to find the distribution of Y = e^{-X}. Here we have g(X) = e^{-X}, which is a strictly decreasing function. Thus g^{-1}(y) = -\ln y, so that \frac{d}{dy} g^{-1}(y) = -\frac{1}{y}, and applying the formula above, we have

    f_Y(y) = f_X(g^{-1}(y)) \left| \frac{d}{dy} g^{-1}(y) \right| = \frac{y}{(1 + y)^2} \cdot \frac{1}{y} = \frac{1}{(1 + y)^2},

where the range of y is obviously 0 < y < \infty.

4. The Jacobian Transformation Technique

To explain this technique, we consider only the case of two continuous random variables X_1 and X_2, and assume that they are mapped onto U_1 and U_2 by the transformation

    u_1 = g_1(x_1, x_2)   and   u_2 = g_2(x_1, x_2).

Suppose this transformation is one-to-one, so that we can invert it to get

    x_1 = h_1(u_1, u_2)   and   x_2 = h_2(u_1, u_2).

The Jacobian of this transformation is the determinant

    J(x_1, x_2) = \det \begin{pmatrix} \frac{\partial g_1}{\partial x_1} & \frac{\partial g_1}{\partial x_2} \\ \frac{\partial g_2}{\partial x_1} & \frac{\partial g_2}{\partial x_2} \end{pmatrix} = \frac{\partial g_1}{\partial x_1} \frac{\partial g_2}{\partial x_2} - \frac{\partial g_2}{\partial x_1} \frac{\partial g_1}{\partial x_2},

provided this is not zero. Suppose the joint density of X_1 and X_2 is denoted by f_{X_1 X_2}. Then the joint density of U_1 and U_2 is given by

    f_{U_1 U_2}(u_1, u_2) = \frac{1}{\left| J(h_1(u_1, u_2), h_2(u_1, u_2)) \right|} f_{X_1 X_2}(h_1(u_1, u_2), h_2(u_1, u_2)).

The above technique can be easily extended to several variables. See Hogg & Craig (1995).
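A small simulation sketch of the Jacobian formula (my addition, Python standard library only; the particular transformation u_1 = x_1 + x_2, u_2 = x_1 - x_2 on two independent standard normals is an illustrative choice, not from the notes). The probability that (U_1, U_2) falls in a small square, estimated by Monte Carlo, should match the integral of the transformed density over that square:

```python
import math
import random

def f_x1x2(x1, x2):
    # joint density of two independent N(0, 1) variables
    return math.exp(-(x1 * x1 + x2 * x2) / 2.0) / (2.0 * math.pi)

# Transformation u1 = x1 + x2, u2 = x1 - x2, with inverse
# x1 = (u1 + u2)/2, x2 = (u1 - u2)/2.  The Jacobian of the forward map is
# J = det[[1, 1], [1, -1]] = -2, so |J| = 2.
def f_u1u2(u1, u2):
    return f_x1x2((u1 + u2) / 2.0, (u1 - u2) / 2.0) / 2.0

random.seed(4)
n = 400_000
a, b = 0.0, 0.5                 # a small square [a, b] x [a, b] in (u1, u2) space
count = 0
for _ in range(n):
    x1, x2 = random.gauss(0, 1), random.gauss(0, 1)
    u1, u2 = x1 + x2, x1 - x2
    if a <= u1 <= b and a <= u2 <= b:
        count += 1
empirical = count / n

# Integrate the transformed density over the square (midpoint rule).
m = 50
h = (b - a) / m
theoretical = sum(f_u1u2(a + (i + 0.5) * h, a + (j + 0.5) * h) * h * h
                  for i in range(m) for j in range(m))
print(empirical, theoretical)
```

Dividing by |J| is what makes probability mass balance: the forward map stretches areas by |J|, so the density must shrink by the same factor.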
5. Example - Jacobian Technique

As an illustration of the Jacobian transformation technique, let us consider deriving the t-distribution. Suppose Z ~ N(0, 1) and V ~ \chi^2(r) are independent. Then the random variable

    T = \frac{Z}{\sqrt{V/r}}

has a t-distribution with r degrees of freedom. Define the variables

    s = v   and   t = \frac{z}{\sqrt{v/r}},

so that this forms a one-to-one transformation with the inversion

    z = t \sqrt{s/r}   and   v = s.

Its Jacobian is

    J(z, v) = \det \begin{pmatrix} \frac{\partial s}{\partial z} & \frac{\partial s}{\partial v} \\ \frac{\partial t}{\partial z} & \frac{\partial t}{\partial v} \end{pmatrix} = \det \begin{pmatrix} 0 & 1 \\ \frac{1}{\sqrt{v/r}} & -\frac{z \sqrt{r}}{2 v^{3/2}} \end{pmatrix} = -\frac{1}{\sqrt{v/r}} = -\frac{1}{\sqrt{s/r}},

so that 1/|J| = \sqrt{s/r}. Since Z and V are independent, their joint density can be written as

    f_{ZV}(z, v) = f_Z(z) f_V(v) = \frac{1}{\sqrt{2\pi}} e^{-z^2/2} \cdot \frac{1}{\Gamma(r/2)\, 2^{r/2}} v^{r/2 - 1} e^{-v/2}.

Thus, using the Jacobian transformation formula above, the joint density of (S, T) is given by

    f_{ST}(s, t) = \sqrt{s/r} \cdot \frac{1}{\sqrt{2\pi}} e^{-(t\sqrt{s/r})^2/2} \cdot \frac{1}{\Gamma(r/2)\, 2^{r/2}} s^{r/2 - 1} e^{-s/2} = \frac{s^{r/2 - 1} \sqrt{s/r}}{\sqrt{2\pi}\, \Gamma(r/2)\, 2^{r/2}} \exp\left[ -\frac{s}{2} \left( 1 + \frac{t^2}{r} \right) \right],

where we note that since 0 < v < \infty and -\infty < z < \infty, then 0 < s < \infty and -\infty < t < \infty. Therefore, the marginal density of T is given by

    f_T(t) = \int_0^\infty f_{ST}(s, t)\, ds = \int_0^\infty \frac{s^{r/2 - 1} \sqrt{s/r}}{\sqrt{2\pi}\, \Gamma(r/2)\, 2^{r/2}} \exp\left[ -\frac{s}{2} \left( 1 + \frac{t^2}{r} \right) \right] ds.

Making the transformation

    w = \frac{s}{2} \left( 1 + \frac{t^2}{r} \right),
so that

    dw = \frac{1}{2} \left( 1 + \frac{t^2}{r} \right) ds   and   s = \frac{2w}{1 + t^2/r},

we therefore have

    f_T(t) = \int_0^\infty \frac{1}{\sqrt{2\pi r}\, \Gamma(r/2)\, 2^{r/2}} \left( \frac{2w}{1 + t^2/r} \right)^{(r-1)/2} e^{-w} \frac{2}{1 + t^2/r}\, dw = \frac{\Gamma[(r+1)/2]}{\sqrt{\pi r}\, \Gamma(r/2)\, (1 + t^2/r)^{(r+1)/2}}

for -\infty < t < \infty.

6. The MGF Technique

This method can be effective in instances where we can derive a recognizable m.g.f., because when the m.g.f. exists, it is unique and it uniquely determines the distribution. Suppose we are interested in the distribution of U = g(X_1, ..., X_n), where X_1, ..., X_n have a joint density f(x_1, ..., x_n). Then we find the m.g.f. of U using

    M_U(t) = E[e^{Ut}] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} e^{g(x_1, ..., x_n) t} f(x_1, ..., x_n)\, dx_1 \cdots dx_n.

In the special case where U is the sum of the random variables, U = X_1 + \cdots + X_n, and X_1, ..., X_n are independent, we have

    M_U(t) = E\left[ e^{(X_1 + \cdots + X_n) t} \right] = E[e^{X_1 t}] \cdots E[e^{X_n t}] = M_{X_1}(t) \cdots M_{X_n}(t).

The m.g.f. of U is the product of the m.g.f.s of X_1, ..., X_n.
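The product rule for independent sums can be checked empirically (my addition, Python standard library; exponential variables are an arbitrary illustrative choice). The sample average of e^{tU} for U = X_1 + X_2 should match the product of the per-variable sample averages, and both should match the product of the exact MGFs, 1/(1 - t) for Exponential(rate 1) and 2/(2 - t) for Exponential(rate 2):

```python
import math
import random

random.seed(2)
n = 100_000
x1 = [random.expovariate(1.0) for _ in range(n)]   # X1 ~ Exponential(rate 1)
x2 = [random.expovariate(2.0) for _ in range(n)]   # X2 ~ Exponential(rate 2)

def emp_mgf(xs, t):
    # sample analogue of M_X(t) = E[e^{tX}]
    return sum(math.exp(t * x) for x in xs) / len(xs)

t = 0.3
lhs = emp_mgf([a + b for a, b in zip(x1, x2)], t)   # M_U(t) for U = X1 + X2
rhs = emp_mgf(x1, t) * emp_mgf(x2, t)               # M_{X1}(t) * M_{X2}(t)
exact = (1.0 / (1.0 - t)) * (2.0 / (2.0 - t))       # product of the exact MGFs
print(lhs, rhs, exact)
```

Note that t must stay below the smaller rate parameter (here 1) for the exponential MGFs to exist.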
7. Examples - The MGF Technique

Example (Poisson): Let X_1 ~ Poisson(\lambda_1) and X_2 ~ Poisson(\lambda_2), where X_1, X_2 are independent. Then the m.g.f. of U = X_1 + X_2 is given by

    M_U(t) = M_{X_1}(t)\, M_{X_2}(t) = e^{\lambda_1 (e^t - 1)} e^{\lambda_2 (e^t - 1)} = e^{(\lambda_1 + \lambda_2)(e^t - 1)},

which is the m.g.f. of another Poisson with parameter \lambda_1 + \lambda_2, i.e. U ~ Poisson(\lambda_1 + \lambda_2).

Example (Normal): Let X_1 ~ N(\mu_1, \sigma_1^2) and X_2 ~ N(\mu_2, \sigma_2^2), where X_1, X_2 again are independent. Then the m.g.f. of U = X_1 + X_2 is given by

    M_U(t) = M_{X_1}(t)\, M_{X_2}(t) = e^{\mu_1 t + \frac{1}{2} \sigma_1^2 t^2} e^{\mu_2 t + \frac{1}{2} \sigma_2^2 t^2} = e^{(\mu_1 + \mu_2) t + \frac{1}{2} (\sigma_1^2 + \sigma_2^2) t^2},

which is the m.g.f. of another Normal with mean \mu_1 + \mu_2 and variance \sigma_1^2 + \sigma_2^2. That is, U ~ N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2).

8. Distributions of Order Statistics

Assume X_1, X_2, ..., X_n are n independent identically distributed (i.i.d.) random variables, and let their common distribution function be F_X and density f_X. Suppose we sort these variables and denote by X_{(1)} < X_{(2)} < \cdots < X_{(n)} the order statistics. In particular, X_{(1)} = \min(X_1, ..., X_n) is the minimum and X_{(n)} = \max(X_1, ..., X_n) is the maximum. For simplicity, denote U = X_{(n)} and V = X_{(1)}.

Distribution of the Maximum. Deriving the distribution of the maximum, we have

    F_U(u) = Prob(U \le u) = Prob(X_1 \le u)\, Prob(X_2 \le u) \cdots Prob(X_n \le u) = [F(u)]^n,

and the density function is

    f_U(u) = n f(u) [F(u)]^{n-1}.

Distribution of the Minimum. We have

    F_V(v) = Prob(V \le v) = 1 - Prob(V > v) = 1 - [Prob(X_1 > v) \cdots Prob(X_n > v)] = 1 - [1 - F(v)]^n
and the corresponding density function is

    f_V(v) = n f(v) [1 - F(v)]^{n-1}.

In general, we can show that the probability density of the k-th order statistic is given by

    f_k(x) = \frac{n!}{(k-1)!\,(n-k)!} f(x) [F(x)]^{k-1} [1 - F(x)]^{n-k}.

The joint probability density of the order statistics is given by

    f_{12...n}(y_1, y_2, ..., y_n) = n!\, f(y_1) f(y_2) \cdots f(y_n)   for y_1 < y_2 < \cdots < y_n.

9. Example - Order Statistics

Consider a system with n components. Assume that the lifetimes of the components are T_1, T_2, ..., T_n, which are i.i.d. with exponential distribution with parameter \lambda.

Suppose that the components are connected in series, that is, the system fails if any one of the components fails. The lifetime V of the system is therefore the minimum of the T_k, i.e. V = \min(T_1, ..., T_n). Therefore the density of V is given by

    f_V(v) = n f(v) [1 - F(v)]^{n-1} = n \lambda e^{-\lambda v} \left( e^{-\lambda v} \right)^{n-1} = (n\lambda) e^{-(n\lambda) v},

which is exponential with parameter n\lambda.

Suppose instead that the components are connected in parallel, that is, the system fails only if all of the components fail. The lifetime U of the system is therefore the maximum of the T_k, i.e. U = \max(T_1, ..., T_n). Therefore the density of U is given by

    f_U(u) = n f(u) [F(u)]^{n-1} = n \lambda e^{-\lambda u} \left( 1 - e^{-\lambda u} \right)^{n-1}.
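A short simulation sketch of the series-system result (my addition, Python standard library only; the parameter values are arbitrary). The minimum of n i.i.d. Exponential(\lambda) lifetimes should behave as Exponential(n\lambda), so its empirical CDF should match 1 - e^{-n\lambda t}:

```python
import math
import random

random.seed(5)
lam, ncomp, n = 2.0, 5, 100_000

# Series system: lifetime V = min(T_1, ..., T_ncomp), T_k i.i.d. Exponential(lam).
v = [min(random.expovariate(lam) for _ in range(ncomp)) for _ in range(n)]

t0 = 0.1
empirical = sum(s <= t0 for s in v) / n
theoretical = 1.0 - math.exp(-ncomp * lam * t0)   # CDF of Exponential(ncomp * lam)
print(empirical, theoretical)
```

With lam = 2 and ncomp = 5 the system lifetime is Exponential(10), so a series system of many components fails much sooner than any single component.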
10. Some Special Sampling Distributions

We now consider some results regarding distributions resulting from sampling from a normal distribution.

A Single Normal and Chi-Square. Suppose Z ~ N(0, 1). Then Y = Z^2 ~ \chi^2(1) has a chi-square distribution with 1 degree of freedom. It is interesting to prove this, and it uses the CDF technique. Consider

    F_Y(y) = Prob(Z^2 \le y) = Prob(-\sqrt{y} \le Z \le \sqrt{y}) = \int_{-\sqrt{y}}^{\sqrt{y}} \frac{1}{\sqrt{2\pi}} e^{-z^2/2}\, dz = 2 \int_0^{\sqrt{y}} \frac{1}{\sqrt{2\pi}} e^{-z^2/2}\, dz,

and now apply a change of variable, say z = \sqrt{w}, so that dz = \frac{1}{2} w^{-1/2}\, dw. Therefore we have

    F_Y(y) = 2 \int_0^y \frac{1}{\sqrt{2\pi}} \cdot \frac{1}{2} w^{-1/2} e^{-w/2}\, dw.

Differentiating to get the p.d.f., we get

    f_Y(y) = \frac{1}{\sqrt{2\pi}}\, y^{-1/2} e^{-y/2} = \frac{y^{(1-2)/2} e^{-y/2}}{2^{1/2}\, \Gamma(1/2)},

which is the density of a \chi^2(1) distributed random variable.

Normal and Chi-Square. Suppose Z_1, Z_2, ..., Z_r are independent standard normal random variables. Then the random variable

    V = Z_1^2 + Z_2^2 + \cdots + Z_r^2 = \sum_{k=1}^{r} Z_k^2

has a chi-square distribution with r degrees of freedom.

t-Distribution. Suppose Z ~ N(0, 1) and V ~ \chi^2(r) are independent. Then the random variable

    T = \frac{Z}{\sqrt{V/r}}

has a t-distribution with r degrees of freedom.

F-Distribution. Suppose U ~ \chi^2(r_1) and V ~ \chi^2(r_2) are two independent chi-square distributed random variables. Then the random variable

    F = \frac{U/r_1}{V/r_2}

has an F-distribution with r_1 and r_2 degrees of freedom.

Sample Mean and Sample Variance. Suppose X_1, X_2, ..., X_n are n independent random variables with identical distribution N(\mu, \sigma^2). Define the
sample mean by

    \bar{X} = \frac{1}{n} \sum_{k=1}^{n} X_k

and the sample variance by

    S^2 = \frac{1}{n-1} \sum_{k=1}^{n} \left( X_k - \bar{X} \right)^2.

Then the following important properties can be verified:

- \bar{X} ~ N(\mu, \sigma^2/n)
- (n-1) S^2 / \sigma^2 ~ \chi^2(n-1)
- \bar{X} and S^2 are independent.

Using these results, it can further be shown that

    T = \frac{\bar{X} - \mu}{S/\sqrt{n}}

has a t-distribution with n - 1 degrees of freedom.
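The first two properties of \bar{X} and S^2 can be checked by repeated sampling (my addition, Python standard library only; the values of \mu, \sigma, and n are arbitrary). Across many replications, \bar{X} should average to \mu with variance \sigma^2/n, and S^2 should average to \sigma^2:

```python
import random
import statistics

random.seed(6)
mu, sigma, n, reps = 5.0, 2.0, 10, 50_000

xbars, s2s = [], []
for _ in range(reps):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    xbars.append(statistics.fmean(x))
    s2s.append(statistics.variance(x))   # divides by n - 1

mean_xbar = statistics.fmean(xbars)      # should be close to mu = 5
var_xbar = statistics.variance(xbars)    # should be close to sigma^2 / n = 0.4
mean_s2 = statistics.fmean(s2s)          # should be close to sigma^2 = 4
print(mean_xbar, var_xbar, mean_s2)
```

The n - 1 divisor in statistics.variance is what makes the average of S^2 land on \sigma^2 rather than below it.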
More informationE[X n ]= dn dt n M X(t). ). What is the mgf? Solution. Found this the other day in the Kernel matching exercise: 1 M X (t) =
Chapter 7 Generating functions Definition 7.. Let X be a random variable. The moment generating function is given by M X (t) =E[e tx ], provided that the expectation exists for t in some neighborhood of
More informationMATH2715: Statistical Methods
MATH275: Statistical Methods Exercises VI (based on lectre, work week 7, hand in lectre Mon 4 Nov) ALL qestions cont towards the continos assessment for this modle. Q. The random variable X has a discrete
More informationJoint p.d.f. and Independent Random Variables
1 Joint p.d.f. and Independent Random Variables Let X and Y be two discrete r.v. s and let R be the corresponding space of X and Y. The joint p.d.f. of X = x and Y = y, denoted by f(x, y) = P(X = x, Y
More informationMultiple Random Variables
Multiple Random Variables This Version: July 30, 2015 Multiple Random Variables 2 Now we consider models with more than one r.v. These are called multivariate models For instance: height and weight An
More informationSTAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS
STAT/MA 4 Answers Homework November 5, 27 Solutions by Mark Daniel Ward PROBLEMS Chapter Problems 2a. The mass p, corresponds to neither of the first two balls being white, so p, 8 7 4/39. The mass p,
More informationThree hours. To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER.
Three hours To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER EXTREME VALUES AND FINANCIAL RISK Examiner: Answer QUESTION 1, QUESTION
More informationMA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems
MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions
More informationASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata
ASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Errata Effective July 5, 3, only the latest edition of this manual will have its errata
More information