TRANSFORMATIONS OF RANDOM VARIABLES


Date: February 18, 2008.

1. INTRODUCTION

1.1. Definition. We are often interested in the probability distributions or densities of functions of one or more random variables. Suppose we have a set of random variables, X_1, X_2, X_3, ..., X_n, with a known joint probability and/or density function. We may want to know the distribution of some function of these random variables, Y = Φ(X_1, X_2, X_3, ..., X_n). Realized values of y will be related to realized values of the X's as follows

y = Φ(x_1, x_2, x_3, ..., x_n)    (1)

A simple example might be a single random variable X with transformation

y = Φ(x) = log(x)    (2)

1.2. Techniques for finding the distribution of a transformation of random variables.

1.2.1. Distribution function technique. We find the region in x_1, x_2, x_3, ..., x_n space such that Φ(x_1, x_2, ..., x_n) ≤ φ. We can then find the probability that Φ(x_1, x_2, ..., x_n) ≤ φ, i.e., P[Φ(x_1, x_2, ..., x_n) ≤ φ], by integrating the density function f(x_1, x_2, ..., x_n) over this region. Of course, F_Φ(φ) is just P[Φ ≤ φ]. Once we have F_Φ(φ), we can find the density by differentiation.

1.2.2. Method of transformations (inverse mappings). Suppose we know the density function of X. Also suppose that the function y = Φ(x) is differentiable and monotonic for values within its range for which the density f(x) ≠ 0. This means that we can solve the equation y = Φ(x) for x as a function of y. We can then use this inverse mapping to find the density function of Y. We can do a similar thing when there is more than one variable X, and then there is more than one mapping Φ.

1.2.3. Method of moment generating functions. There is a theorem (Casella [2, p. 65]) stating that if two random variables have identical moment generating functions, then they possess the same probability distribution. The procedure is to find the moment generating function for Φ and then compare it to any and all known ones to see if there is a match. This is most commonly done to see if a distribution approaches the normal distribution as the sample size goes to infinity. The theorem is presented here for completeness.

Theorem 1. Let F_X(x) and F_Y(y) be two cumulative distribution functions all of whose moments exist. Then

a: If X and Y have bounded support, then F_X(u) = F_Y(u) for all u if and only if E X^r = E Y^r for all integers r = 0, 1, 2, ....

b: If the moment generating functions exist and M_X(t) = M_Y(t) for all t in some neighborhood of 0, then F_X(u) = F_Y(u) for all u.

For further discussion, see Billingsley [1, ch. 21-22].
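To make the general problem concrete before turning to the individual techniques, the short simulation below pushes draws of X through the transformation of equation (2) and compares a histogram of Y = log(X) with a density obtained analytically. This is only an illustrative sketch, and it assumes, purely for the example, that X is uniform on (0, 1) (a choice not made in the text); for that choice the methods of the following sections give f_Y(y) = e^y for y < 0.

    import numpy as np

    # Assumed for illustration only: X ~ Uniform(0, 1), Y = log(X).
    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 1.0, size=200_000)
    y = np.log(x)

    # Compare the histogram of Y with the analytic density f_Y(y) = e^y, y < 0.
    counts, edges = np.histogram(y, bins=60, range=(-6, 0), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    print("max |empirical - e^y|:", np.abs(counts - np.exp(centers)).max())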

2. DISTRIBUTION FUNCTION TECHNIQUE

2.1. Procedure for using the Distribution Function Technique. As stated earlier, we find the region in the x_1, x_2, x_3, ..., x_n space such that Φ(x_1, x_2, ..., x_n) ≤ φ. We can then find the probability that Φ(x_1, x_2, ..., x_n) ≤ φ, i.e., P[Φ(x_1, x_2, ..., x_n) ≤ φ], by integrating the density function f(x_1, x_2, ..., x_n) over this region. Of course, F_Φ(φ) is just P[Φ ≤ φ]. Once we have F_Φ(φ), we can find the density by differentiation.

2.2. Example 1. Let the probability density function of X be given by

f(x) = 6x(1 - x),   0 < x < 1
     = 0,           otherwise    (3)

Now find the probability density of Y = X^3.

Let G(y) denote the value of the distribution function of Y at y and write

G(y) = P(Y ≤ y) = P(X^3 ≤ y) = P(X ≤ y^{1/3})
     = ∫_0^{y^{1/3}} 6x(1 - x) dx
     = ∫_0^{y^{1/3}} (6x - 6x^2) dx
     = [3x^2 - 2x^3]_0^{y^{1/3}}
     = 3y^{2/3} - 2y    (4)

Now differentiate G(y) to obtain the density function g(y)

g(y) = dG(y)/dy = d/dy (3y^{2/3} - 2y) = 2y^{-1/3} - 2 = 2(y^{-1/3} - 1),   0 < y < 1    (5)
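The density 6x(1 - x) on (0, 1) is that of a Beta(2, 2) random variable, so equation (5) can be checked by simulation. The following is only a sanity-check sketch, not part of the original notes; it compares the empirical distribution function of Y = X^3 with G(y) = 3y^{2/3} - 2y from equation (4).

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.beta(2, 2, size=200_000)   # density 6x(1 - x) on (0, 1)
    y = x ** 3

    grid = np.linspace(0.05, 0.95, 19)
    empirical = np.array([(y <= g).mean() for g in grid])
    exact = 3 * grid ** (2 / 3) - 2 * grid   # equation (4)
    print("max CDF discrepancy:", np.abs(empirical - exact).max())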

2.3. Example 2. Let the joint probability density function of X_1 and X_2 be given by

f(x_1, x_2) = 2 e^{-x_1 - 2x_2},   x_1 > 0, x_2 > 0
            = 0,                   otherwise    (6)

Now find the probability density of Y = X_1 + X_2, or equivalently X_1 = Y - X_2. Given that Y is a linear function of X_1 and X_2, we can easily find F(y) as follows.

Let F_Y(y) denote the value of the distribution function of Y at y and write

F_Y(y) = P(Y ≤ y)
       = ∫_0^y ∫_0^{y - x_2} 2 e^{-x_1 - 2x_2} dx_1 dx_2
       = ∫_0^y [-2 e^{-x_1 - 2x_2}]_{x_1 = 0}^{x_1 = y - x_2} dx_2
       = ∫_0^y [(-2 e^{-y + x_2 - 2x_2}) - (-2 e^{-2x_2})] dx_2
       = ∫_0^y (2 e^{-2x_2} - 2 e^{-y - x_2}) dx_2    (7)

Now integrate with respect to x_2 as follows

F_Y(y) = P(Y ≤ y) = ∫_0^y (2 e^{-2x_2} - 2 e^{-y - x_2}) dx_2
       = [-e^{-2x_2} + 2 e^{-y - x_2}]_0^y
       = (-e^{-2y} + 2 e^{-2y}) - (-1 + 2 e^{-y})
       = e^{-2y} - 2 e^{-y} + 1,   y > 0    (8)

Now differentiate F_Y(y) to obtain the density function f_Y(y)

f_Y(y) = dF(y)/dy = d/dy (e^{-2y} - 2 e^{-y} + 1)
       = -2 e^{-2y} + 2 e^{-y}
       = 2 e^{-y} (1 - e^{-y})    (9)
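The joint density in equation (6) factors as e^{-x_1} · 2e^{-2x_2}, i.e., X_1 and X_2 are independent exponential random variables with rates 1 and 2. That makes equation (8) easy to check by simulation; the sketch below is illustrative only and is not part of the original notes.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 200_000
    x1 = rng.exponential(scale=1.0, size=n)   # density e^{-x1}
    x2 = rng.exponential(scale=0.5, size=n)   # density 2 e^{-2 x2}
    y = x1 + x2

    grid = np.linspace(0.25, 5.0, 20)
    empirical = np.array([(y <= g).mean() for g in grid])
    exact = 1.0 - 2.0 * np.exp(-grid) + np.exp(-2.0 * grid)   # equation (8)
    print("max CDF discrepancy:", np.abs(empirical - exact).max())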

2.4. Example 3. Let the probability density function of X be given by

f_X(x) = (1 / (σ √(2π))) e^{-(1/2)((x - µ)/σ)^2}
       = (1 / √(2πσ^2)) exp[-(x - µ)^2 / (2σ^2)],   -∞ < x < ∞    (10)

Now let Y = Φ(X) = e^X. We can then find the distribution of Y by integrating the density function of X over the appropriate area that is defined as a function of y. Let F_Y(y) denote the value of the distribution function of Y at y and write

F_Y(y) = P(Y ≤ y) = P(e^X ≤ y) = P(X ≤ ln y),   y > 0
       = ∫_{-∞}^{ln y} (1 / √(2πσ^2)) exp[-(x - µ)^2 / (2σ^2)] dx,   y > 0    (11)

Now differentiate F_Y(y) to obtain the density function f_Y(y). In this case we will need the rules for differentiating under the integral sign. They are given by theorem 2, which we state below without proof.

Theorem 2. Suppose that f and ∂f/∂x are continuous in the rectangle

R = {(x, t) : a ≤ x ≤ b, c ≤ t ≤ d}

and suppose that u_0(x) and u_1(x) are continuously differentiable for a ≤ x ≤ b with the range of u_0(x) and u_1(x) in (c, d). If ψ is given by

ψ(x) = ∫_{u_0(x)}^{u_1(x)} f(x, t) dt    (12)

then

dψ/dx = d/dx ∫_{u_0(x)}^{u_1(x)} f(x, t) dt
      = f(x, u_1(x)) du_1(x)/dx - f(x, u_0(x)) du_0(x)/dx + ∫_{u_0(x)}^{u_1(x)} ∂f(x, t)/∂x dt    (13)

If one of the bounds of integration does not depend on x, then the term involving its derivative will be zero.

For a proof of theorem 2 see Protter [3, p. 425]. Applying this to equation 11, where y takes the role of x, ln y takes the role of u_1(x), and x takes the role of t in the theorem, we obtain

f_Y(y) = d/dy ∫_{-∞}^{ln y} (1 / √(2πσ^2)) exp[-(x - µ)^2 / (2σ^2)] dx
       = (1 / √(2πσ^2)) exp[-(ln y - µ)^2 / (2σ^2)] (1/y) + ∫_{-∞}^{ln y} ∂/∂y ((1 / √(2πσ^2)) exp[-(x - µ)^2 / (2σ^2)]) dx
       = (1 / (y √(2πσ^2))) exp[-(ln y - µ)^2 / (2σ^2)]    (14)

The last line of equation 14 follows because the lower limit of integration does not depend on y and

∂/∂y ((1 / √(2πσ^2)) exp[-(x - µ)^2 / (2σ^2)]) = 0
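Equation (14) is the lognormal density. As a numerical cross-check (a sketch, not part of the original notes), it can be compared with the parametrization used by scipy.stats.lognorm, in which the shape parameter s plays the role of σ and scale = e^µ.

    import numpy as np
    from scipy import stats

    mu, sigma = 0.3, 0.8
    y = np.linspace(0.05, 6.0, 200)

    # Density derived in equation (14).
    derived = np.exp(-(np.log(y) - mu) ** 2 / (2 * sigma ** 2)) / (y * np.sqrt(2 * np.pi * sigma ** 2))
    library = stats.lognorm(s=sigma, scale=np.exp(mu)).pdf(y)
    print("max |derived - scipy|:", np.abs(derived - library).max())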

3. METHOD OF TRANSFORMATIONS (SINGLE VARIABLE)

3.1. Discrete examples of the method of transformations.

3.1.1. One-to-one function. Find a formula for the probability distribution of the total number of heads obtained in four tosses of a coin where the probability of a head is 0.6.

The sample space, probabilities and the values of the random variable are given in table 1.

TABLE 1. Outcomes, Probabilities and Number of Heads from Tossing a Coin Four Times.

Element of sample space   Probability   Value of random variable X (x)
HHHH                      81/625        4
HHHT                      54/625        3
HHTH                      54/625        3
HTHH                      54/625        3
THHH                      54/625        3
HHTT                      36/625        2
HTHT                      36/625        2
HTTH                      36/625        2
THHT                      36/625        2
THTH                      36/625        2
TTHH                      36/625        2
HTTT                      24/625        1
THTT                      24/625        1
TTHT                      24/625        1
TTTH                      24/625        1
TTTT                      16/625        0

From the table we can determine the probabilities as

P(X = 0) = 16/625,  P(X = 1) = 96/625,  P(X = 2) = 216/625,  P(X = 3) = 216/625,  P(X = 4) = 81/625

We can also compute these probabilities using counting rules. The probability of one head and then three tails is

(3/5)(2/5)(2/5)(2/5) = (3/5)^1 (2/5)^3 = 24/625

The probability of 3 heads and then one tail is

(3/5)(3/5)(3/5)(2/5) = (3/5)^3 (2/5)^1 = 54/625

Of course there are other ways to obtain 1 head and three tails besides one head and then three tails. In particular there are C(4,1) = 4 ways to obtain one head, and there is C(4,0) = 1 way to obtain zero heads. Similarly, there are six ways to obtain two heads, four ways to obtain three heads and one way to obtain four heads. We can then write the probability mass function as

f(x) = C(4,x) (3/5)^x (2/5)^{4-x}   for x = 0, 1, 2, 3, 4    (15)

This, of course, is the binomial distribution. The probabilities of the various possible values of the random variable are contained in table 2.

TABLE 2. Probability of Number of Heads from Tossing a Coin Four Times.

Number of Heads x   Probability f(x)
0                   16/625
1                   96/625
2                   216/625
3                   216/625
4                   81/625
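The exact fractions in tables 1 and 2 can be reproduced directly from equation (15); the short script below (an illustrative sketch, not part of the notes) uses exact rational arithmetic so the values come out as fractions of 625.

    from fractions import Fraction
    from math import comb

    p = Fraction(3, 5)   # probability of a head
    for x in range(5):
        # Binomial pmf of equation (15): C(4, x) p^x (1 - p)^(4 - x)
        prob = comb(4, x) * p ** x * (1 - p) ** (4 - x)
        print(x, prob)
    # Expected output: 16/625, 96/625, 216/625, 216/625, 81/625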

Now consider a transformation of X in the form Y = 2X^2 + X. There are five possible outcomes for Y, i.e., 0, 3, 10, 21, 36. Given that the function is one-to-one, we can make up a table describing the probability distribution for Y.

TABLE 3. Probability of a Function of the Number of Heads from Tossing a Coin Four Times (Y = 2(# heads)^2 + # of heads).

Number of Heads x   Probability f(x)   y    g(y)
0                   16/625             0    16/625
1                   96/625             3    96/625
2                   216/625            10   216/625
3                   216/625            21   216/625
4                   81/625             36   81/625

3.1.2. Case where the transformation is not one-to-one. Now let the transformation of X be given by Z = (6 - 2X)^2. The possible values for Z are 0, 4, 16, 36. When X = 2 and when X = 4, Z = 4. We can find the probability of Z by adding the probabilities for the cases where X gives more than one value, as shown in table 4.

TABLE 4. Probability of a Function of the Number of Heads from Tossing a Coin Four Times (not one-to-one), Z = (6 - 2(# heads))^2.

z    Number of Heads x   g(z)
0    3                   216/625
4    2, 4                216/625 + 81/625 = 297/625
16   1                   96/625
36   0                   16/625
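The bookkeeping in tables 3 and 4 is just the general discrete formula of section 3.3: sum p_X(x) over every x that maps to the same image value. A small sketch (illustrative only, with a hypothetical helper name) does this for both transformations.

    from collections import defaultdict
    from fractions import Fraction

    p_X = {0: Fraction(16, 625), 1: Fraction(96, 625), 2: Fraction(216, 625),
           3: Fraction(216, 625), 4: Fraction(81, 625)}

    def pushforward(pmf, phi):
        """pmf of Phi(X): sum p_X(x) over all x with phi(x) = t."""
        out = defaultdict(Fraction)
        for x, p in pmf.items():
            out[phi(x)] += p
        return dict(out)

    print(pushforward(p_X, lambda x: 2 * x ** 2 + x))    # one-to-one: reproduces table 3
    print(pushforward(p_X, lambda x: (6 - 2 * x) ** 2))  # not one-to-one: table 4, 297/625 at z = 4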

3.2. Intuitive Idea of the Method of Transformations. The idea of a transformation is to consider the function that maps the random variable X into the random variable Y. If we can determine the values of X that lead to any particular value of Y, we can obtain the probability of Y by summing the probabilities of those values of X that map into Y. In the continuous case, to find the distribution function, we integrate the density of X over the portion of its space that is mapped into the portion of the space of Y in which we are interested. Suppose, for example, that both X and Y are defined on the real line. If we want to know G(5), we need to integrate the density of X over all values of x leading to a value of y less than five, where G(5) is the probability that Y is less than five.

3.3. General formula when the random variable is discrete. Consider a transformation defined by y = Φ(x). The function Φ defines a mapping from the sample space of the variable X to a sample space for the random variable Y. If X is discrete with frequency function p_X, then Φ(X) is discrete and has frequency function

p_{Φ(X)}(t) = Σ_{x : Φ(x) = t} p_X(x) = Σ_{x ∈ Φ^{-1}(t)} p_X(x)    (16)

The process is simple in this case. One identifies Φ^{-1}(t) for each t in the sample space of the random variable Y, and then sums the probabilities, which is what we did in section 3.1.

3.4. General change of variable or transformation formula.

Theorem 3. Let f_X(x) be the value of the probability density of the continuous random variable X at x. If the function y = Φ(x) is differentiable and either increasing or decreasing (monotonic) for all values within the range of X for which f_X(x) ≠ 0, then for these values of x the equation y = Φ(x) can be uniquely solved for x to give x = Φ^{-1}(y) = w(y), where w(·) = Φ^{-1}(·). Then, for the corresponding values of y, the probability density of Y = Φ(X) is given by

g(y) = f_Y(y) = f_X[Φ^{-1}(y)] |dΦ^{-1}(y)/dy| = f_X[w(y)] |dw(y)/dy| = f_X[w(y)] |w'(y)|   if dΦ(x)/dx ≠ 0
             = 0   otherwise    (17)

Proof. Consider the diagram in figure 1.

FIGURE 1. y = Φ(x) is an increasing function.

As can be seen from figure 1, each point on the y axis maps into a point on the x axis; that is, X must take on a value between Φ^{-1}(a) and Φ^{-1}(b) when Y takes on a value between a and b. Therefore

P(a < Y < b) = P(Φ^{-1}(a) < X < Φ^{-1}(b)) = ∫_{Φ^{-1}(a)}^{Φ^{-1}(b)} f_X(x) dx    (18)

What we would like to do is replace x in the second line with y, and Φ^{-1}(a) and Φ^{-1}(b) with a and b. To do so we need to make a change of variable. Consider how we make a u-substitution when we perform integration or use the chain rule for differentiation. For example, if u = h(x) then du = h'(x) dx. So if x = Φ^{-1}(y), then

dx = (dΦ^{-1}(y)/dy) dy

Then we can write

f_X(x) dx = f_X(Φ^{-1}(y)) (dΦ^{-1}(y)/dy) dy    (19)

For the case of a definite integral the following lemma applies.

Lemma 1. If the function u = h(x) has a continuous derivative on the closed interval [a, b] and f is continuous on the range of h, then

∫_a^b f(h(x)) h'(x) dx = ∫_{h(a)}^{h(b)} f(u) du    (20)

Using this lemma, or the intuition from equation 19, we can then rewrite equation 18 as follows

P(a < Y < b) = P(Φ^{-1}(a) < X < Φ^{-1}(b))
             = ∫_{Φ^{-1}(a)}^{Φ^{-1}(b)} f_X(x) dx
             = ∫_a^b f_X(Φ^{-1}(y)) (dΦ^{-1}(y)/dy) dy    (21)

The probability density function, f_Y(y), of a continuous random variable Y is the function f(·) that satisfies

P(a < Y < b) = F(b) - F(a) = ∫_a^b f_Y(t) dt    (22)

This then implies that the integrand in equation 21 is the density of Y, i.e., g(y), so we obtain

g(y) = f_Y(y) = f_X[Φ^{-1}(y)] (dΦ^{-1}(y)/dy)    (23)

as long as dΦ^{-1}(y)/dy exists. This proves the theorem when Φ is an increasing function. Now consider the case where Φ is a decreasing function, as in figure 2.

As can be seen from figure 2, each point on the y axis maps into a point on the x axis; that is, X must take on a value between Φ^{-1}(b) and Φ^{-1}(a) when Y takes on a value between a and b. Therefore

P(a < Y < b) = P(Φ^{-1}(b) < X < Φ^{-1}(a)) = ∫_{Φ^{-1}(b)}^{Φ^{-1}(a)} f_X(x) dx    (24)

FIGURE 2. y = Φ(x) is a decreasing function.

Making a change of variable for x = Φ^{-1}(y) as before, we can write

P(a < Y < b) = P(Φ^{-1}(b) < X < Φ^{-1}(a))
             = ∫_{Φ^{-1}(b)}^{Φ^{-1}(a)} f_X(x) dx
             = ∫_b^a f_X(Φ^{-1}(y)) (dΦ^{-1}(y)/dy) dy
             = -∫_a^b f_X(Φ^{-1}(y)) (dΦ^{-1}(y)/dy) dy    (25)

Because dΦ^{-1}(y)/dy is positive when the function y = Φ(x) is increasing and negative when y = Φ(x) is decreasing, we can combine the two cases by writing

g(y) = f_Y(y) = f_X[Φ^{-1}(y)] |dΦ^{-1}(y)/dy|    (26)
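Equation (26) translates directly into code. The helper below is a minimal sketch (the function names are hypothetical, not from the notes): it takes f_X, the inverse map Φ^{-1}, and its derivative, and returns the transformed density. It is checked against the example of section 2.4 with X standard normal, Y = e^X, Φ^{-1}(y) = ln y, and dΦ^{-1}(y)/dy = 1/y.

    import numpy as np
    from scipy import stats

    def transformed_density(f_x, phi_inv, dphi_inv):
        """Equation (26): f_Y(y) = f_X(phi_inv(y)) * |d phi_inv(y) / dy|."""
        return lambda y: f_x(phi_inv(y)) * np.abs(dphi_inv(y))

    f_y = transformed_density(stats.norm.pdf, np.log, lambda y: 1.0 / y)
    y = np.linspace(0.1, 5.0, 100)
    # Should agree with the standard lognormal density (sigma = 1, mu = 0).
    print(np.abs(f_y(y) - stats.lognorm(s=1.0).pdf(y)).max())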

3.5. Examples.

3.5.1. Example 1. Let X have the probability density function given by

f_X(x) = (1/2) x,   0 ≤ x ≤ 2
       = 0,         elsewhere    (27)

Find the density function of Y = Φ(X) = 6X - 3.

Notice that f_X(x) is positive for all x such that 0 < x ≤ 2. The function Φ is increasing for all X. We can then find the inverse function Φ^{-1} as follows

y = 6x - 3
6x = y + 3
x = (y + 3)/6 = Φ^{-1}(y)    (28)

We can then find the derivative of Φ^{-1} with respect to y as

dΦ^{-1}/dy = d/dy ((y + 3)/6) = 1/6    (29)

The density of Y is then

g(y) = f_Y(y) = f_X[Φ^{-1}(y)] |dΦ^{-1}(y)/dy| = (1/2) ((3 + y)/6) (1/6),   0 ≤ (3 + y)/6 ≤ 2    (30)

For all other values of y, g(y) = 0. Simplifying the density and the bounds we obtain

g(y) = f_Y(y) = (3 + y)/72,   -3 ≤ y ≤ 9
             = 0,             elsewhere    (31)
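A quick numerical check of equation (31) (a sketch, not part of the original notes): X can be sampled by inverting its distribution function, F_X(x) = x^2/4, so X = 2 sqrt(U) with U uniform on (0, 1); transforming to Y = 6X - 3 and comparing a histogram with (3 + y)/72 confirms the result.

    import numpy as np

    rng = np.random.default_rng(3)
    u = rng.uniform(size=200_000)
    x = 2.0 * np.sqrt(u)          # inverse-CDF draw from f_X(x) = x/2 on (0, 2]
    y = 6.0 * x - 3.0

    counts, edges = np.histogram(y, bins=48, range=(-3, 9), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    print("max |empirical - (3 + y)/72|:", np.abs(counts - (3 + centers) / 72).max())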

3.5.2. Example 2. Let X have the probability density function f_X(x). Then consider the transformation Y = Φ(X) = σX + µ, with σ > 0. The function Φ is increasing for all X. We can then find the inverse function Φ^{-1} as follows

y = σx + µ
σx = y - µ
x = (y - µ)/σ = Φ^{-1}(y)    (32)

We can then find the derivative of Φ^{-1} with respect to y as

dΦ^{-1}/dy = d/dy ((y - µ)/σ) = 1/σ    (33)

The density of Y is then

f_Y(y) = f_X[Φ^{-1}(y)] |dΦ^{-1}(y)/dy| = f_X[(y - µ)/σ] (1/σ)    (34)

3.5.3. Example 3. Let X have the probability density function given by

f_X(x) = e^{-x},   x ≥ 0
       = 0,        elsewhere    (35)

Find the density function of Y = X^{1/2}.

Notice that f_X(x) is positive for all x such that x ≥ 0. The function Φ is increasing for all X. We can then find the inverse function Φ^{-1} as follows

y = x^{1/2}
y^2 = x
x = Φ^{-1}(y) = y^2    (36)

We can then find the derivative of Φ^{-1} with respect to y as

dΦ^{-1}/dy = d/dy (y^2) = 2y    (37)

The density of Y is then

f_Y(y) = f_X[Φ^{-1}(y)] |dΦ^{-1}(y)/dy| = 2y e^{-y^2},   y ≥ 0    (38)
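Equation (38) (a Rayleigh-type density) can again be verified by simulation; the sketch below, which is not part of the original notes, draws X from the unit exponential, sets Y = sqrt(X), and compares a histogram with 2y e^{-y^2}.

    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.exponential(scale=1.0, size=200_000)   # f_X(x) = e^{-x}, x >= 0
    y = np.sqrt(x)

    counts, edges = np.histogram(y, bins=40, range=(0, 3), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    exact = 2 * centers * np.exp(-centers ** 2)     # equation (38)
    print("max |empirical - exact|:", np.abs(counts - exact).max())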

A graph of the two density functions is shown in figure 3.

FIGURE 3. The Two Density Functions, f_X(x) = e^{-x} and g(y) = f_Y(y) = 2y e^{-y^2}.

4. METHOD OF TRANSFORMATIONS (MULTIPLE VARIABLES)

4.1. General definition of a transformation. Let Φ be any function from R^k to R^m, k, m ≥ 1, such that Φ^{-1}(A) = {x ∈ R^k : Φ(x) ∈ A} ∈ B^k for every A ∈ B^m, where B^m is the smallest σ-field having all the open rectangles in R^m as members. If we write y = Φ(x), the function Φ defines a mapping from the sample space Ξ of the variable X to a sample space Ψ for the random variable Y. Specifically

Φ(x) : Ξ → Ψ    (39)

and

Φ^{-1}(A) = {x ∈ Ξ : Φ(x) ∈ A}    (40)

4.2. Transformations involving multiple functions of multiple random variables.

Theorem 4. Let f_{X1 X2}(x_1, x_2) be the value of the joint probability density of the continuous random variables X_1 and X_2 at (x_1, x_2). If the functions given by y_1 = u_1(x_1, x_2) and y_2 = u_2(x_1, x_2) are partially differentiable with respect to x_1 and x_2 and represent a one-to-one transformation for all values within the range of X_1 and X_2 for which f_{X1 X2}(x_1, x_2) ≠ 0, then, for these values of x_1 and x_2, the equations y_1 = u_1(x_1, x_2) and y_2 = u_2(x_1, x_2) can be uniquely solved for x_1 and x_2 to give x_1 = w_1(y_1, y_2) and x_2 = w_2(y_1, y_2), and for the corresponding values of y_1 and y_2 the joint probability density of Y_1 = u_1(X_1, X_2) and Y_2 = u_2(X_1, X_2) is given by

f_{Y1 Y2}(y_1, y_2) = f_{X1 X2}[w_1(y_1, y_2), w_2(y_1, y_2)] |J|    (41)

where J is the Jacobian of the transformation and is defined as the determinant

J = | ∂x_1/∂y_1   ∂x_1/∂y_2 |
    | ∂x_2/∂y_1   ∂x_2/∂y_2 |    (42)

At all other points f_{Y1 Y2}(y_1, y_2) = 0.

4.3. Example. Let the joint probability density function of X_1 and X_2 be given by

f_{X1 X2}(x_1, x_2) = e^{-(x_1 + x_2)},   x_1 ≥ 0, x_2 ≥ 0
                    = 0,                  elsewhere    (43)

Consider two random variables Y_1 and Y_2 defined in the following manner:

Y_1 = X_1 + X_2
Y_2 = X_1 / (X_1 + X_2)    (44)

To find the joint density of Y_1 and Y_2 we first need to solve the system of equations in equation 44 for X_1 and X_2.

Y_1 = X_1 + X_2,   Y_2 = X_1 / (X_1 + X_2) = X_1 / Y_1
⟹ X_1 = Y_1 Y_2
⟹ X_2 = Y_1 - X_1 = Y_1 - Y_1 Y_2 = Y_1 (1 - Y_2)    (45)

The Jacobian is given by

J = | ∂x_1/∂y_1   ∂x_1/∂y_2 |  =  |  y_2       y_1  |  =  -y_1 y_2 - y_1 (1 - y_2)  =  -y_1    (46)
    | ∂x_2/∂y_1   ∂x_2/∂y_2 |     |  1 - y_2  -y_1  |

This transformation is one-to-one and maps the domain of X (Ξ), given by x_1 > 0 and x_2 > 0 in the x_1 x_2-plane, into the domain of Y (Ψ) in the y_1 y_2-plane, given by y_1 > 0 and 0 < y_2 < 1. If we apply theorem 4 we obtain

f_{Y1 Y2}(y_1, y_2) = f_{X1 X2}[w_1(y_1, y_2), w_2(y_1, y_2)] |J|
                    = e^{-(y_1 y_2 + y_1 - y_1 y_2)} |-y_1|
                    = y_1 e^{-y_1}    (47)

Considering all possible values of y_1 and y_2 we obtain

f_{Y1 Y2}(y_1, y_2) = y_1 e^{-y_1},   y_1 ≥ 0, 0 < y_2 < 1
                    = 0,              elsewhere    (48)

We can then find the marginal density of Y_2 by integrating over y_1 as follows

f_{Y2}(y_2) = ∫_0^∞ f_{Y1 Y2}(y_1, y_2) dy_1 = ∫_0^∞ y_1 e^{-y_1} dy_1    (49)

We integrate by parts (a u-v substitution), where u, v, du, and dv are defined as

u = y_1,        dv = e^{-y_1} dy_1
du = dy_1,      v = -e^{-y_1}    (50)

This then implies

f_{Y2}(y_2) = ∫_0^∞ y_1 e^{-y_1} dy_1
            = [-y_1 e^{-y_1}]_0^∞ + ∫_0^∞ e^{-y_1} dy_1
            = 0 + [-e^{-y_1}]_0^∞
            = 1    (51)

for all y_2 such that 0 < y_2 < 1.

A graph of the joint densities and the marginal density follows. The joint density of (X_1, X_2) is shown in figure 4.

FIGURE 4. Joint Density of X_1 and X_2 (f_{X1 X2}(x_1, x_2) plotted over the x_1 x_2-plane).

The joint density of (Y_1, Y_2) is shown in figure 5.

FIGURE 5. Joint Density of Y_1 and Y_2 (f_{Y1 Y2}(y_1, y_2) plotted over the y_1 y_2-plane).

The marginal density of Y_2 is shown graphically in figure 6.

FIGURE 6. Marginal Density of Y_2 (f_{Y2}(y_2) = 1 on 0 < y_2 < 1).
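As a final check on the example of section 4.3 (an illustrative sketch, not part of the original notes), sympy can invert the transformation of equation (44), compute the Jacobian of equation (46), and recover the joint density of equation (47).

    import sympy as sp

    x1, x2, y1, y2 = sp.symbols("x1 x2 y1 y2", positive=True)

    # Invert equation (44): y1 = x1 + x2, y2 = x1 / (x1 + x2).
    sol = sp.solve([sp.Eq(y1, x1 + x2), sp.Eq(y2, x1 / (x1 + x2))], [x1, x2], dict=True)[0]
    # Expected, matching equation (45): {x1: y1*y2, x2: y1*(1 - y2)}

    # Jacobian determinant of (x1, x2) with respect to (y1, y2), as in equation (46).
    J = sp.Matrix([sol[x1], sol[x2]]).jacobian([y1, y2]).det()
    print(sp.simplify(J))                      # expected: -y1

    # Joint density of (Y1, Y2) from equation (41): f_X(w1, w2) * |J|.
    f_x = sp.exp(-(x1 + x2))
    f_y = sp.simplify(f_x.subs(sol) * sp.Abs(J))
    print(f_y)                                 # expected: y1*exp(-y1)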

REFERENCES

[1] Billingsley, P. Probability and Measure. 3rd edition. New York: Wiley, 1995.
[2] Casella, G., and R. L. Berger. Statistical Inference. Pacific Grove, CA: Duxbury, 2002.
[3] Protter, Murray H., and Charles B. Morrey, Jr. Intermediate Calculus. New York: Springer-Verlag, 1985.