Lecture 03    Copyright by Hongyun Wang, UCSC

Review of probability theory (continued)

We show that the sum of independent normal random variables is a normal random variable.

Characteristic function (CF) of a random variable

Random variable: X(ω)
Density: ρ_X(x)

The characteristic function is defined as

    φ_X(ξ) = E[exp(iξX)] = ∫ exp(iξx) ρ_X(x) dx

(all integrals here are over (−∞, +∞)). This is very similar to the Fourier transform (FT).

Fourier transform:

    Forward transform:   f̂(ξ) = ∫ exp(−i2πξx) f(x) dx
    Backward transform:  f₂(x) = ∫ exp(i2πξx) f̂(ξ) dξ

It can be shown that f₂(x) = f(x). Therefore, the backward transform is called the inverse transform.

Connection between the characteristic function (CF) and the Fourier transform (FT):

    φ_X(ξ) = ρ̂_X(ξ₁) evaluated at ξ₁ = −ξ/(2π)
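This connection can be checked numerically. The sketch below (assuming NumPy is available; the N(0, 1) density is an arbitrary example) approximates both integrals with a trapezoid rule on a truncated grid and compares the CF at ξ with the forward FT at −ξ/(2π):

```python
import numpy as np

# Density of N(0, 1), used only as a concrete example.
def rho(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

x = np.linspace(-10.0, 10.0, 20001)   # rho is negligible beyond |x| = 10
dx = x[1] - x[0]

def integrate(f):
    # Simple trapezoid rule on the fixed grid
    return np.sum((f[1:] + f[:-1]) / 2) * dx

def cf(xi):
    # phi_X(xi) = integral of exp(i xi x) rho(x) dx
    return integrate(np.exp(1j * xi * x) * rho(x))

def ft(xi):
    # Forward Fourier transform: integral of exp(-i 2 pi xi x) rho(x) dx
    return integrate(np.exp(-2j * np.pi * xi * x) * rho(x))

xi = 1.7
print(abs(cf(xi) - ft(-xi / (2 * np.pi))))   # ~ 0: CF equals FT at -xi/(2 pi)
print(abs(cf(xi) - np.exp(-xi**2 / 2)))      # ~ 0: matches the CF of N(0, 1)
```

The second printed value also previews the result derived next: for N(0, 1) the CF is exp(−ξ²/2).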
Uniqueness:

    If f̂(ξ) = ĝ(ξ), then f(x) = g(x).
    If φ_X(ξ) = φ_Y(ξ), then ρ_X(s) = ρ_Y(s).

Characteristic function (CF) of a normal random variable:

X ~ N(μ, σ²)

Density:

    ρ_X(x) = (1/√(2πσ²)) exp(−(x − μ)²/(2σ²))

Characteristic function:

    φ_X(ξ) = E[exp(iξX)] = ∫ exp(iξx) ρ_X(x) dx
           = ∫ (1/√(2πσ²)) exp(−[(x − μ)² − i2σ²ξx]/(2σ²)) dx

Completing the square in the exponent,

    (x − μ)² − i2σ²ξx = (x − μ − iσ²ξ)² + σ⁴ξ² − i2σ²μξ

so

    φ_X(ξ) = exp(iμξ − σ²ξ²/2) ∫ (1/√(2πσ²)) exp(−(x − μ − iσ²ξ)²/(2σ²)) dx
           = exp(iμξ − σ²ξ²/2) ∫ (1/√(2πσ²)) exp(−s²/(2σ²)) ds
             (substituting s = x − μ − iσ²ξ; shifting the contour does not change the Gaussian integral)
           = exp(iμξ − σ²ξ²/2)

We obtain:

    X ~ N(μ, σ²)  ==>  φ_X(ξ) = exp(iμξ − σ²ξ²/2)

Since we can map the characteristic function back to the probability density function, we have the theorem below.
Theorem: X ~ N(μ, σ²) if and only if

    φ_X(ξ) = exp(iμξ − σ²ξ²/2)

Next, we look at the characteristic function of the sum of two independent random variables. It is simply the product of the two individual characteristic functions.

Theorem: If X and Y are independent, then

    φ_(X+Y)(ξ) = φ_X(ξ) φ_Y(ξ)

Proof:

    φ_(X+Y)(ξ) = E[exp(iξ(X + Y))] = E[exp(iξX) exp(iξY)]
               = E[exp(iξX)] E[exp(iξY)]    (using the independence)
               = φ_X(ξ) φ_Y(ξ)

Applying this theorem to two independent normal random variables, we conclude:

Theorem: Suppose X ~ N(μ1, σ1²) and Y ~ N(μ2, σ2²), and suppose X and Y are independent. Then we have (X + Y) ~ N(μ1 + μ2, σ1² + σ2²). That is, the sum of independent normals is normal.

Proof:

    φ_(X+Y)(ξ) = φ_X(ξ) φ_Y(ξ)
               = exp(iμ1ξ − σ1²ξ²/2) exp(iμ2ξ − σ2²ξ²/2)
               = exp(i(μ1 + μ2)ξ − (σ1² + σ2²)ξ²/2)
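A Monte Carlo sanity check of this conclusion (a sketch, assuming NumPy; the parameter values are arbitrary): we sample X + Y and compare its empirical characteristic function, mean, and variance against N(μ1 + μ2, σ1² + σ2²).

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

# Arbitrary example parameters
mu1, s1 = 1.0, 2.0    # X ~ N(1, 2^2)
mu2, s2 = -0.5, 1.5   # Y ~ N(-0.5, 1.5^2)

n = 1_000_000
z = rng.normal(mu1, s1, n) + rng.normal(mu2, s2, n)  # samples of X + Y, independent

# Empirical characteristic function at one frequency vs. the CF of
# N(mu1 + mu2, s1^2 + s2^2)
xi = 0.8
emp_cf = np.mean(np.exp(1j * xi * z))
theory = np.exp(1j * (mu1 + mu2) * xi - (s1**2 + s2**2) * xi**2 / 2)
print(abs(emp_cf - theory))   # ~ 0 up to Monte Carlo error, O(1/sqrt(n))
print(z.mean(), z.var())      # ~ 0.5 and ~ 6.25
```

Agreement at every ξ (not just one) is what the uniqueness theorem above requires; checking a single frequency is only a spot check.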
Before we start the discussion of stochastic differential equations, let us look at another example on probability to demonstrate the importance of the framework of repeated experiments.

Monty Hall's game:

Your group and Mike's group will do summer camping together. But each group has a different itinerary in mind. To decide on an itinerary, you and Mike play a game ONCE, with Mike hosting.

Specifications of the game:
1) The host (Mike) puts a card in one of the 3 boxes without you looking (so he knows which box has the card but you don't).
2) You select a box.
3) If the box you pick contains the card, you win (and you will have priority in the itinerary planning). Otherwise, you lose (and Mike will have priority in the itinerary planning).
4) At the end, all boxes are opened to verify that the host is not cheating.

The incident: After you select box #1, Mike says "Let us play it the Monty Hall style." He opens box #2, which is empty, and offers you the option of switching to box #3.

Question: Should you switch?

Answer: The behavior of the host (Mike) is incompletely specified. There are many possible ways the game can be repeated.

Version 1: Monty Hall's game

This is the mathematicians' definition of Monty Hall's game. The real game show hosted by Monty Hall did not actually follow these rules.
*) Upon your initial selection, the host must open a box that you did not pick.
*) The host must open an empty box.
*) The host must offer you the option of switching.

For this game, we have
Pr(winning | not switching) = 1/3
Pr(winning | switching) = 2/3
See Appendix A1 for derivation.
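Version 1 is easy to simulate, which makes the asymmetry between the two strategies concrete. A minimal sketch (standard library only):

```python
import random

random.seed(1)  # fixed seed so the estimates are reproducible

def play(switch):
    """One round of Version 1: the host always opens an empty, unchosen box."""
    card = random.randrange(3)   # box hiding the card
    pick = random.randrange(3)   # your initial selection
    # Host opens some empty box you did not pick (if two qualify, the
    # lowest-numbered one; that choice does not affect the win probability
    # for the always-stay / always-switch strategies simulated here).
    opened = next(b for b in range(3) if b != pick and b != card)
    if switch:
        # Switch to the remaining unopened box
        pick = next(b for b in range(3) if b != pick and b != opened)
    return pick == card

n = 100_000
p_stay = sum(play(False) for _ in range(n)) / n
p_switch = sum(play(True) for _ in range(n)) / n
print(p_stay, p_switch)  # ~ 1/3 and ~ 2/3
```

Note that the simulation itself is a model of repeated games; it answers a question about Version 1's rules, not about the single game you play with Mike.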
Version 2: (The greedy host)

The greedy host wants to lure you away from the correct box.
*) Upon your initial selection, the host has the option of opening a box that you did not pick. The greedy host will open a box if and only if your initial selection is correct.
*) If the host opens a box, he must open an empty box.
*) If the host opens a box, he must offer you the option of switching.

For this game, we have
Pr(winning | not switching if offered) = 1/3
Pr(winning | switching if offered) = 0
See Appendix A2 for derivation.

Version 3: (The less greedy host)

The less greedy host still wants to lure you away from the correct box. But he wants to avoid this behavior being easily recognized in repeated games.
*) Upon your initial selection, the host has the option of opening a box that you did not pick. The less greedy host adds some randomness to the decision on whether or not to open a box:
Pr(opening a box | your initial selection is incorrect) = p1
Pr(opening a box | your initial selection is correct) = p2
*) If the host opens a box, he must open an empty box.
*) If the host opens a box, he must offer you the option of switching.

Version 2 is a special case of Version 3 with p1 = 0 and p2 = 1.

For this game, we have
Pr(winning | not switching if offered) = 1/3
Pr(winning | switching if offered) = (1 + 2p1 − p2)/3
For example, for p1 = 0.25 and p2 = 0.75,
Pr(winning | switching if offered) = 0.25
See Appendix A3 for derivation.

Key observation: When you encounter an incompletely specified game only once, you have to make a model of how the game would be repeated. The model is subjective.
Stochastic differential equation

    dX(t) = b(X(t), t) dt + a(X(t), t) dW(t)

Or in a more concise form

    dX = b(X, t) dt + a(X, t) dW

We need to introduce W(t).

The Wiener process (Brownian motion)

Definition 1: The Wiener process, denoted by W(t), satisfies
1) W(0) = 0
2) For t ≥ 0, W(t) ~ N(0, t)
3) For t4 ≥ t3 ≥ t2 ≥ t1 ≥ 0, the increments W(t2) − W(t1) and W(t4) − W(t3) are independent.

Definition 2:
1) W(0) = 0
2) For t2 ≥ t1 ≥ 0, W(t2) − W(t1) ~ N(0, t2 − t1)
3) For t4 ≥ t3 ≥ t2 ≥ t1 ≥ 0, the increments W(t2) − W(t1) and W(t4) − W(t3) are independent.

Definition 2 appears to be stronger than Definition 1.

Question: Are these two definitions equivalent?
Answer: Yes.

Theorem: Suppose X and Y are independent. Suppose X ~ N(μ1, σ1²) and (X + Y) ~ N(μ2, σ2²). Then we have Y ~ N(μ2 − μ1, σ2² − σ1²).

Proof: Homework problem.

Using this theorem, we show that Definition 1 is as strong as Definition 2.
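One route for the homework proof is to divide characteristic functions: since φ_X(ξ) φ_Y(ξ) = φ_(X+Y)(ξ), we get φ_Y(ξ) = φ_(X+Y)(ξ)/φ_X(ξ), which should be exactly the CF of N(μ2 − μ1, σ2² − σ1²). A minimal numerical check of that identity (a sketch; the parameter values are arbitrary, chosen with σ2² > σ1²):

```python
import cmath

def cf_normal(mu, var, xi):
    # Characteristic function of N(mu, var): exp(i*mu*xi - var*xi^2/2)
    return cmath.exp(1j * mu * xi - var * xi**2 / 2)

# Arbitrary example parameters with sigma2^2 > sigma1^2
mu1, v1 = 1.0, 2.0   # X ~ N(mu1, v1)
mu2, v2 = 3.0, 5.0   # X + Y ~ N(mu2, v2)

xi = 0.9
lhs = cf_normal(mu2, v2, xi) / cf_normal(mu1, v1, xi)  # phi_Y = phi_{X+Y} / phi_X
rhs = cf_normal(mu2 - mu1, v2 - v1, xi)                # CF of N(mu2-mu1, v2-v1)
print(abs(lhs - rhs))  # ~ 0
```

By the uniqueness theorem of the previous pages, matching the CF at every ξ pins down the density of Y; the proof still has to justify that the division is legitimate, which the code does not address.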
We start with Definition 1 and derive Definition 2.

Definition 1 ==> W(t1) ~ N(0, t1) and W(t2) ~ N(0, t2)

We write W(t2) as a sum:

    W(t2) = W(t1) + (W(t2) − W(t1))

For t2 ≥ t1 ≥ 0, W(t1) and (W(t2) − W(t1)) are independent (note W(t1) = W(t1) − W(0), so item 3 of Definition 1 applies). Applying the theorem above, we conclude

    W(t2) − W(t1) ~ N(0, t2 − t1)

which is Definition 2.

Appendix A1

Suppose you never switch.
Pr(winning | not switching) = Pr(your initial selection is correct) = 1/3

Suppose you always switch. Recall that the host must open an empty box. When your initial selection is incorrect and you switch, you will switch to the correct box.
Pr(winning | switching) = Pr(your initial selection is incorrect) = 2/3

Appendix A2

Suppose you never switch if offered.
Pr(winning | not switching if offered) = Pr(your initial selection is correct) = 1/3

Suppose you always switch if offered. Let
A = "your initial selection is correct"
B = "the host opens an empty box"
S = "switching if offered"
W = "winning"

The greedy host opens a box and offers you the option of switching if and only if your initial selection is correct:

    B <==> A

We use the law of total probability.

    Pr(winning | switching if offered)
    = Pr(W | A and S) Pr(A) + Pr(W | A^C and S) Pr(A^C)
    = 0·(1/3) + 0·(2/3) = 0

Appendix A3

Suppose you never switch if offered.
Pr(winning | not switching if offered) = Pr(your initial selection is correct) = 1/3

Suppose you always switch if offered. Let
A = "your initial selection is correct"
B = "the host opens an empty box"
S = "switching if offered"
W = "winning"

The host decides whether or not to open a box with probabilities

    Pr(B | A^C) = p1
    Pr(B | A) = p2

We use the law of total probability.

    Pr(winning | switching if offered)
    = Pr(W | A and B and S) Pr(A and B) + Pr(W | A and B^C and S) Pr(A and B^C)
      + Pr(W | A^C and B and S) Pr(A^C and B) + Pr(W | A^C and B^C and S) Pr(A^C and B^C)

We first calculate the various terms used in the law of total probability.

    Pr(A and B) = Pr(B | A) Pr(A) = p2·(1/3)
    Pr(A and B^C) = Pr(B^C | A) Pr(A) = (1 − p2)·(1/3)
    Pr(A^C and B) = Pr(B | A^C) Pr(A^C) = p1·(2/3)
    Pr(A^C and B^C) = Pr(B^C | A^C) Pr(A^C) = (1 − p1)·(2/3)

    Pr(W | A and B and S) = 0
    Pr(W | A and B^C and S) = Pr(W | A^C and B and S) = 1
    Pr(W | A^C and B^C and S) = 0

Substituting these terms into the law of total probability, we obtain

    Pr(winning | switching if offered)
    = 0 + (1 − p2)·(1/3) + p1·(2/3) + 0
    = (1 + 2p1 − p2)/3

==> Pr(winning | switching if offered) = (1 + 2p1 − p2)/3