Stochastic Process Notes. Xing Wang
Stochastic Process Notes. Xing Wang. April 2014
Contents

0.1 σ-algebra
    Monotone Class Theorem
0.2 Measure and Expectation
    Monotone Convergence Theorem
    Fatou's Lemma
    Other Convergence Theorems
0.3 Probability Measure Space
0.4 Conditional Expectation
    Expectation
    Graph Illustration
    Kolmogorov's Conditional Expectation Theorem
    Further Thinking
0.5 Stochastic Process and Martingale
    Stochastic Process
    Martingale
0.6 Gamble
    Special Case with Initial 5,000 and Target 20,000
    Tricks
    Reference
    Martingale Problem Formulation
0.7 Knowledge Graph

There are several topics of interest related to the final exam:

σ-algebra. A σ-algebra is a way of organizing information: given a random variable measurable with respect to a σ-algebra, we get a rough picture (the range) of the events that are possible given an outcome of the r.v. Another important feature of the σ-algebra is that we can use a p-system to generate it, so it often suffices to prove that a property holds for a p-system that generates the σ-algebra. One important theorem here is the Monotone Class Theorem. TODO: see how this is related to the other concepts.

Measure, probability and expectation. Monotonicity and countable additivity are the key properties used to prove the Monotone Convergence Theorem. In the old days we regarded the expectation as the mass center in mechanics, as the best summary of a random variable. Now we take the expectation as the integral of the random variable with respect to a measure. First, we must make sure the integral is valid, which means the result of the integral should be bounded. There are also other theorems, such as Fatou's lemma,
and the bounded convergence theorem. And... just can't recall the rest.

Also, the probability law: we cannot observe the real probability measure on the σ-algebra. What we can get are observations of an r.v. We get a distribution of the r.v., which is called its probability law. How much information we can get from one r.v. depends on the properties of that r.v.

Conditional expectation. We take another view of conditional expectation: Kolmogorov's conditional expectation theorem. There are two ways to prove it — TODO, both should be mastered. One is via Radon–Nikodym, the other is the projection view. The projection view is very interesting.

Stochastic process and martingale. From conditional expectation we can build relations as time evolves. As time goes on, the σ-algebra grows. The growth of the σ-algebra means we can pin down smaller possible subsets, which means we gain information. We call the growing list of σ-algebras a filtration. The random variables are also affected as time goes on: the list of r.v.s is adapted to the filtration.

The martingale transform: we need to prove that no strategy can break the martingale property. The optional stopping theorem: TODO, need to understand it. The martingale convergence theorem: TODO, what is it used for? How to gamble: still need to go on with the proofs.
0.1 σ-algebra

Monotone Class Theorem. A d-system that contains a p-system also contains the σ-algebra generated by that p-system.

Proof. Suppose D is the minimal d-system that contains the p-system C, i.e. the intersection of all d-systems that contain C. If we can prove that D is a σ-algebra, the theorem follows; for that, we only need to prove that D is a p-system.

Given an element B ∈ C, we know B ∈ D. Let D_1 = {A ∈ D : A ∩ B ∈ D}. Every element of C is in D_1, and D_1 is a d-system by the lemma, so D ⊆ D_1. Hence

A ∩ B ∈ D for all A ∈ D, B ∈ C. (1)

Now fix an element A ∈ D and let D_2 = {B ∈ D : A ∩ B ∈ D}. From (1), every element of C is in D_2, so D ⊆ D_2. Hence D is closed under intersection, which means D is a p-system, proving the theorem.

TODO: still have a question about the minimality comparison. Is it possible that there is no global minimal d-system, only local minimal ones? This comes down to one proposition: is the intersection of two d-systems also a d-system? (It is — indeed any intersection of d-systems is a d-system, so the minimal one is well defined.)
0.2 Measure and Expectation

Monotone Convergence Theorem. Let {X_n} be an increasing sequence of non-negative functions that converges to X. Then E(X) = E(lim X_n) = lim E(X_n).

Proof 1. For an indicator function 1_A, E(1_A) = µ(A). For a simple non-negative function f = Σ a_i 1_{A_i}, E f = Σ a_i µ(A_i). For a non-negative function X, we have the definition EX = sup{Eg : g simple non-negative, g ≤ X pointwise}.

First, EX ≥ lim E(X_n), since X_n ≤ X for all n. The next thing we want to prove is that lim E(X_n) ≥ EX. For a ∈ (0,1), let ϕ be a simple function with 0 ≤ ϕ ≤ X. Define the increasing sets E_n = {X_n ≥ aϕ}; then lim E_n = E (the whole space). Based on the definition of E_n, we have

E(X_n) ≥ E(X_n 1_{E_n}) ≥ a E(ϕ 1_{E_n}). (2)

We can also prove that the mapping A ↦ E(ϕ 1_A) satisfies the properties of a measure. Thus we can use the continuity property of measures to get lim E(ϕ 1_{E_n}) = E(ϕ 1_E) = E(ϕ). Applying this to (2), we have

lim E(X_n) ≥ a E(ϕ). (3)

By taking a → 1, we have lim E(X_n) ≥ E(ϕ) for every simple ϕ ≤ X. From the definition of the expectation (integral), lim E(X_n) ≥ E(X). Thus lim E(X_n) = E(X).

Proof 2. TODO.

Fatou's Lemma. Let X_n ≥ 0. Then E(liminf X_n) ≤ liminf E(X_n).

Proof: Let g_m = inf_{n≥m} X_n; then liminf X_n is the limit of the increasing sequence of non-negative functions g_m. By the monotone convergence theorem, we have

E(liminf X_n) = lim E(g_m). (4)

By the definition of infimum, g_m ≤ X_n for all n ≥ m, so E(g_m) ≤ E(X_n) for n ≥ m by monotonicity of integration, and thus E(g_m) ≤ inf_{n≥m} E(X_n). Then we have

lim E(g_m) ≤ liminf E(X_n). (5)

Other Convergence Theorems.
Dominated convergence theorem: if |X_n(ω)| ≤ Y(ω) for all (n, ω), where E(Y) < ∞, and X_n → X, then E(|X_n − X|) → 0, so that E(X_n) → E(X).
Bounded convergence theorem: if for some finite constant K, |X_n(ω)| ≤ K for all (n, ω), and X_n → X, then E(|X_n − X|) → 0.
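As a small numerical illustration of the monotone convergence theorem (my sketch, not from the notes): take a discrete law and the truncations X_n = min(X, n), which increase to X; the expectations E(X_n) increase to E(X).

```python
# Monotone convergence on a discrete probability space (illustration only).
# X takes value k with probability proportional to 2^-k for k = 1..30.
weights = [2.0 ** -k for k in range(1, 31)]
total = sum(weights)
probs = [w / total for w in weights]
values = list(range(1, 31))

def expect(f):
    """E[f(X)] under the discrete law above."""
    return sum(p * f(v) for p, v in zip(probs, values))

E_X = expect(lambda v: v)
# X_n = min(X, n) is an increasing sequence of non-negative functions with limit X.
E_Xn = [expect(lambda v, n=n: min(v, n)) for n in range(1, 31)]

assert all(a <= b + 1e-12 for a, b in zip(E_Xn, E_Xn[1:]))  # E(X_n) is increasing
print(E_X, E_Xn[-1])  # the two agree: lim E(X_n) = E(X)
```

The truncation X ∧ n is also exactly the device used later in the proof of the Schwarz inequality.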
0.3 Probability Measure Space

Monotone Convergence Theorem: TODO.
0.4 Conditional Expectation

Expectation. Expectation is the mean value, or mass center, of the experiment. Of course we want to get the exact result, but we cannot control the quality of the outcomes, so we accept the expectation with small variance. It is meaningful only when we do the same experiment independently, multiple times.

Graph Illustration. We have a probability measure space (Ω, F, P) with Ω = {ω_1, ω_2, ω_3}, P(ω_1) = 0.2, P(ω_2) = 0.3, P(ω_3) = 0.5, and two random variables X : Ω → R and Y : Ω → R. TODO: the σ-algebra generated by a random variable. X is measurable with respect to the σ-algebra F_X and Y is measurable with respect to the σ-algebra F_Y.

Condition 1: F_Y is a sub-σ-algebra of F_X.

F_Y = {∅, {ω_3}, {ω_1, ω_2}, {ω_1, ω_2, ω_3}} (6)
F_X = {∅, {ω_1}, {ω_2}, {ω_3}, {ω_1, ω_2}, {ω_1, ω_3}, {ω_2, ω_3}, {ω_1, ω_2, ω_3}} (7)

Here X(ω_1) = 1, X(ω_2) = 2, X(ω_3) = 3 and Y(ω_1) = Y(ω_2) = 1, Y(ω_3) = 2. Based on these random variables, we can get the following conditional probabilities:

P(X=1 | Y=1) = 0.2/0.5 = 0.4, P(X=2 | Y=1) = 0.3/0.5 = 0.6, P(X=3 | Y=1) = 0
P(X=1 | Y=2) = 0, P(X=2 | Y=2) = 0, P(X=3 | Y=2) = 1

Based on the conditional probabilities, we can calculate the conditional expectations:

E(X | Y=1) = 1·0.4 + 2·0.6 = 1.6, E(X | Y=2) = 3·1 = 3.

We can regard the conditional expectation as a function (which can also be called a random variable) g(y) = E(X | Y = y); this function g is measurable with respect to F_Y. We also find one interesting property of this random variable: E(g) = E(X).
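The three-point example can be checked directly in code (a small sketch; the variable names are mine):

```python
# The toy space from the notes: Omega = {w1, w2, w3} with P = 0.2, 0.3, 0.5,
# X = (1, 2, 3) and Y = (1, 1, 2), so sigma(Y) lumps w1 and w2 together.
P = {"w1": 0.2, "w2": 0.3, "w3": 0.5}
X = {"w1": 1, "w2": 2, "w3": 3}
Y = {"w1": 1, "w2": 1, "w3": 2}

def cond_exp_given_Y(y):
    """E[X | Y = y]: integrate X over {Y = y}, then divide by P(Y = y)."""
    atoms = [w for w in P if Y[w] == y]
    mass = sum(P[w] for w in atoms)
    return sum(P[w] * X[w] for w in atoms) / mass

g = {w: cond_exp_given_Y(Y[w]) for w in P}   # g = E[X | sigma(Y)], itself a r.v.
E_X = sum(P[w] * X[w] for w in P)
E_g = sum(P[w] * g[w] for w in P)
print(cond_exp_given_Y(1), cond_exp_given_Y(2), E_X, E_g)
# E(X|Y=1) = 1.6, E(X|Y=2) = 3, and the tower property E(g) = E(X) = 2.3
```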
E(X) = 1·0.2 + 2·0.3 + 3·0.5 = 2.3, E(g) = 1.6·0.5 + 3·0.5 = 2.3.

Some interpretations: comparing σ(X) in (7) and σ(Y) in (6), we can see that the σ-algebra represents information. The larger the σ-algebra, the more detail we can get when given a value of the random variable. For example, we know the outcome is ω_1 given X = 1, but we can only know the outcome is ω_1 or ω_2 given Y = 1. The conditional expectation is also a random variable; we get this advantage because we view a random variable as a mapping. This view might also be related to real analysis or functional analysis, because we treat functions as elements, and the transformation from a function to its conditional expectation as a mapping. E(X) = E(g) gives a brief understanding of Kolmogorov's conditional expectation theorem.

Condition 2: F_Y is not a sub-σ-algebra of F_X. We can just swap X and Y of Condition 1 above. The conditional probabilities are P(Y=1 | X=1) = 1, P(Y=1 | X=2) = 1, P(Y=2 | X=3) = 1. Under these probabilities, we have the conditional expectations E(Y | X=1) = 1, E(Y | X=2) = 1, E(Y | X=3) = 2. These are deterministic; this is the reason why we require σ(Y) to be a sub-σ-algebra of σ(X).

Condition 3: neither contains the other. We can think of two examples here: F_1 = {∅, {1}, {2,3}, E}, F_2 = {∅, {2}, {1,3}, E}. TODO: need to think more about this.

Kolmogorov's Conditional Expectation Theorem. Given a probability space (Ω, F, P), let G be a sub-σ-algebra of F. X is an F-measurable function with E(|X|) < ∞. There exists a random variable Y such that:
1. Y is G-measurable;
2. E(|Y|) < ∞;
3. for every G ∈ G (equivalently, for every G in some p-system which contains Ω and generates G), E(X 1_G) = E(Y 1_G).
Moreover, if Ŷ is another r.v. with these properties, then Ŷ = Y almost surely: P(Ŷ = Y) = 1. A random variable Y with properties 1–3 is called a version of the conditional expectation E(X | G) of X given G, and we write Y = E(X | G) almost surely.
Proof 1: via the Radon–Nikodym theorem.

Preparatory knowledge 1: absolute continuity. Let µ and ν be measures on a measurable space (E, E). Then ν is said to be absolutely continuous with respect to µ if, for every set A in E, µ(A) = 0 implies ν(A) = 0. This relation gives us some information about the relation between the two measures.

Preparatory knowledge 2: Radon–Nikodym theorem. Suppose that µ is σ-finite (there is a measurable partition (E_n) with each µ(E_n) finite), and ν is absolutely continuous with respect to µ. Then there exists a positive E-measurable function p such that

∫_E ν(dx) f(x) = ∫_E µ(dx) p(x) f(x), for all f ∈ E_+. (8)

Moreover, p is unique up to equivalence: if (8) holds for another p̃ in E_+, then p̃(x) = p(x) for µ-almost every x in E.

Proof of the RN theorem: TODO.

Using RN to prove the conditional expectation theorem. Let X ∈ H_+ and let F be a sub-σ-algebra of H. Then E_F X exists and is unique up to equivalence.

Proof. For each event H in F, define Q(H) = ∫_H P(dω) X(ω). On the measurable space (Ω, F), P (restricted to F) is a probability measure, and Q is a measure that is absolutely continuous with respect to P. Hence, using the RN theorem, there exists X̄ in F_+ such that ∫_Ω Q(dω) V(ω) = ∫_Ω P(dω) X̄(ω) V(ω) for every V in F_+. This shows that X̄ is a version of E_F X.

Proof 2.

Preparatory knowledge 1: L^p spaces.

‖X‖_p = (E|X|^p)^{1/p}, p ∈ [1, ∞) (9)
‖X‖_∞ = inf{b ∈ R_+ : |X| ≤ b a.s.} (10)

For each p in [1, ∞], L^p denotes the collection of all real-valued random variables X with ‖X‖_p < ∞.

Preparatory knowledge 2: the Schwarz inequality, for the orthogonal projection. If X and Y are in L², then XY ∈ L¹ and

|E(XY)| ≤ E(|XY|) ≤ ‖X‖_2 ‖Y‖_2. (11)

We further get the triangle inequality in L²:

‖X + Y‖_2 ≤ ‖X‖_2 + ‖Y‖_2. (12)

Proof: First, let us clear up (11). |E(XY)| ≤ E(|XY|) is immediate. The next thing is to prove that E(|XY|) ≤ ‖X‖_2 ‖Y‖_2.
To prove the second inequality, we can assume that X and Y are non-negative, because of the absolute values. Then we propose a sequence of random variables X_n := X ∧ n, Y_n := Y ∧ n, so that X_n and Y_n are bounded. For any a, b ∈ R,

0 ≤ E[(aX_n + bY_n)²] (13)
  = a² E(X_n²) + 2ab E(X_n Y_n) + b² E(Y_n²). (14)

Since this quadratic form in (a, b) is non-negative for all a and b, its discriminant must be non-positive, which forces

{2E(X_n Y_n)}² ≤ 4E(X_n²)E(Y_n²) ≤ 4E(X²)E(Y²). (15)

(If this inequality were not satisfied, we could choose a, b making (14) negative.) Using the monotone convergence theorem, we have

lim_n E(X_n Y_n) = E(lim_n X_n Y_n) = E(XY). (16)

By combining (16) and (15), we get E(|XY|) ≤ ‖X‖_2 ‖Y‖_2.

Then let us see the problem of (12). This is immediate from (11):

E(XY) ≤ ‖X‖_2 ‖Y‖_2 (17)
E(X² + 2XY + Y²) ≤ E(X²) + E(Y²) + 2‖X‖_2 ‖Y‖_2 (18)
E((X + Y)²) ≤ (‖X‖_2 + ‖Y‖_2)² (19)
‖X + Y‖_2 ≤ ‖X‖_2 + ‖Y‖_2. (20)

Preparatory knowledge 3: inner product and the parallelogram law. For U, V ∈ L², we define the inner product

⟨U, V⟩ := E(UV). (21)

The parallelogram law is

‖U + V‖_2² + ‖U − V‖_2² = ⟨U + V, U + V⟩ + ⟨U − V, U − V⟩ (22)
                        = 2‖U‖_2² + 2‖V‖_2². (23)

Preparatory knowledge 4: orthogonal projection, statement. For short, every ‖·‖_2 is written ‖·‖. Let κ be a vector subspace of L² which is complete, in the sense that whenever (V_n) is a sequence in κ which has the Cauchy property

sup_{r,s ≥ k} ‖V_r − V_s‖ → 0 (k → ∞), (24)

then there exists a V in κ such that

‖V_n − V‖ → 0 (n → ∞). (25)

Then, given X in L², there exists Y in κ such that

‖X − Y‖ = Δ := inf{‖X − W‖ : W ∈ κ}, and ⟨X − Y, Z⟩ = 0 for all Z ∈ κ.

Preparatory knowledge 4: proof of the orthogonal projection theorem. Proof: Let Δ := inf{‖X − W‖ : W ∈ κ}, so we can choose a sequence (Y_n) in κ such that

‖X − Y_n‖ → Δ. (26)

From the parallelogram law (23), we have

‖X − Y_r‖² + ‖X − Y_s‖² = 2‖X − ½(Y_r + Y_s)‖² + 2‖½(Y_r − Y_s)‖². (27)
Since ½(Y_r + Y_s) ∈ κ, we have ‖X − ½(Y_r + Y_s)‖² ≥ Δ². From this, we can see that the sequence (Y_n) has the Cauchy property, so there exists a Y in κ such that

‖Y_n − Y‖ → 0. (28)

Also, from the triangle inequality (12), we get ‖X − Y‖ ≤ ‖X − Y_n‖ + ‖Y_n − Y‖. By plugging in (26) and (28), we get ‖X − Y‖ ≤ Δ; and ‖X − Y‖ ≥ Δ by the definition of Δ. So ‖X − Y‖ = Δ, which proves the existence of such a Y.

Then we prove the orthogonality property. For any Z ∈ κ, we have Y + tZ ∈ κ for all t ∈ R. By the definition of Y, we have ‖X − Y − tZ‖² ≥ ‖X − Y‖², which implies t²‖Z‖² ≥ 2t⟨Z, X − Y⟩ for all t. This can only be satisfied for all t when ⟨Z, X − Y⟩ = 0.

Final proof of the conditional expectation theorem: TODO.

The relation between L^p spaces and σ-algebras: what is the meaning when we say that κ is a vector subspace of L²?

Further Thinking

The probability law is the usual probability we use; the essence behind it is the measure on the σ-algebra. The conditional probability is also a type of probability law, on a sub-σ-algebra. Why are we using a sub-σ-algebra? Because, if it were a super-σ-algebra, it would be meaningless: the conditional probability would be zero or one, so there would be no need for the expectation, because we would know everything with probability 1. TODO: with a super-σ-algebra, will there be the possibility of some sets that are not measurable?
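Before moving on, the projection view can be illustrated on the earlier three-point example (my sketch; the grid of candidates stands in for "all of κ"): E[X | σ(Y)] is the σ(Y)-measurable random variable minimizing the L² distance E[(X − Z)²], and on each atom of σ(Y) the minimizing constant is the conditional mean.

```python
# Orthogonal-projection view of E[X | sigma(Y)] on the toy space
# Omega = {w1, w2, w3}, P = (0.2, 0.3, 0.5), X = (1, 2, 3).
# A sigma(Y)-measurable Z is constant on each atom of sigma(Y): Z = (a, a, b).
P = [0.2, 0.3, 0.5]
X = [1.0, 2.0, 3.0]
atoms = [[0, 1], [2]]              # atoms of sigma(Y): {w1, w2} and {w3}

def l2_dist2(Z):
    """Squared L2 distance E[(X - Z)^2]."""
    return sum(p * (x - z) ** 2 for p, x, z in zip(P, X, Z))

# Projection: on each atom, the minimizing constant is the conditional mean.
proj = [0.0] * 3
for atom in atoms:
    mass = sum(P[i] for i in atom)
    mean = sum(P[i] * X[i] for i in atom) / mass
    for i in atom:
        proj[i] = mean

# The projection beats every other sigma(Y)-measurable candidate on a grid.
for a in (1.0, 1.5, 1.6, 2.0):
    for b in (2.5, 3.0, 3.5):
        assert l2_dist2(proj) <= l2_dist2([a, a, b]) + 1e-12
print(proj)  # [1.6, 1.6, 3.0] -- the same g as from the conditional probabilities
```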
0.5 Stochastic Process and Martingale

Stochastic Process

TODO: it seems that this was not mentioned... This section follows from conditional expectation.

Martingale

Outline by the teacher:
1. Are there continuous-time martingales? Yes. First, we need to define the right-continuous filtration H_{t+} = ∩_{ε>0} H_{t+ε}, with E[X(t) | H_s] = X(s), s < t.
2. Are sample paths well behaved? How do we examine level crossings? This is related to the upcrossing theorem, in which the number of crossings is bounded in terms of the final value of the process.
3. What type of stochastic process can be decomposed into (in part) a martingale? First, the process should be in L¹; second, the residual term besides the martingale is a previsible process null at 0.
4. Convergence properties? The Martingale Convergence Theorem, which requires X to be a supermartingale bounded in L¹.
5. What happens at random times? The concept of a stopping time is proposed, which is useful for describing the end state of the system.

You cannot beat the system

Notation. Treat X_n − X_{n−1} as the net winnings per unit stake in game n (n ≥ 1). We have
E[X_n − X_{n−1} | F_{n−1}] = 0: a fair game;
E[X_n − X_{n−1} | F_{n−1}] ≤ 0: an unfavourable game.

We call a process C = (C_n : n ∈ N) previsible if C_n is F_{n−1}-measurable (n ≥ 1). This concept is useful for describing a betting strategy, because the bet at time n can only be based on the information we have before time n. So we have the total winnings up to time n as

Y_n = Σ_{1≤k≤n} C_k(X_k − X_{k−1}) =: (C · X)_n. (29)

The expression C · X, the martingale transform of X by C, is the discrete analogue of the stochastic integral ∫ C dX.

Principle, statement. Let C be a bounded non-negative previsible process, so that for some K ∈ [0, ∞), |C_n(ω)| ≤ K for every n and every ω. Let X be a supermartingale [respectively martingale]. Then C · X is a supermartingale [martingale] null at 0. The boundedness condition on C may be replaced by C_n ∈ L² for all n, provided we also insist that X_n ∈ L² for all n.
Proof for the supermartingale case: because C_n is bounded, non-negative, and F_{n−1}-measurable,

E[Y_n − Y_{n−1} | F_{n−1}] = E[C_n(X_n − X_{n−1}) | F_{n−1}] (30)
                          = C_n E[X_n − X_{n−1} | F_{n−1}] (31)
                          ≤ 0. (32)
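The "you cannot beat the system" principle can be checked exactly for small horizons by enumerating all coin paths (my sketch; the "double after a loss" rule is an arbitrary previsible strategy, chosen for illustration): for a fair ±1 game, E[(C · X)_n] = 0 regardless of the strategy.

```python
from itertools import product

# Fair game: X_k - X_{k-1} = +1 or -1, each with probability 1/2.
# Previsible toy strategy: stake 1, double after each loss, reset after a win.
# C_k depends only on the first k-1 outcomes, so C is previsible.
n = 10
expected = 0.0
for path in product((+1, -1), repeat=n):
    prob = 0.5 ** n
    winnings, stake = 0.0, 1.0
    for step in path:
        winnings += stake * step          # increment of (C . X)_k
        stake = 2 * stake if step < 0 else 1.0
    expected += prob * winnings

print(expected)  # 0.0 -- the transform of a martingale is a martingale
```

All quantities are dyadic rationals, so the enumeration is exact in floating point.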
Explanation of the condition: the second version of the condition (C_n, X_n ∈ L² for all n) guarantees that the expectation exists.

Stopping Time

A stopping time is a very interesting r.v. that is useful for describing the target of our algorithm: the time to stop playing the game. A map T : Ω → {0, 1, 2, ...; ∞} is called a stopping time if

{T ≤ n} = {ω : T(ω) ≤ n} ∈ F_n for all n,

or, equivalently,

{T = n} = {ω : T(ω) = n} ∈ F_n for all n.

Stopped Process

I find in the book that the stake for the stopped process is bounded by 1. Is this true in the real world, and will this limit the application?

Doob's Optional Stopping Theorem

Problem with the stopping time as T goes to ∞. Let X be a simple random walk on Z, starting at 0. Then X is a martingale. Let T be the stopping time T := inf{n : X_n = 1}. It is well known that P(T < ∞) = 1. However, even though E(X_{T∧n}) = E(X_0) for every n, we have 1 = E(X_T) ≠ E(X_0) = 0. TODO: write some simulation here.

Statement of the theorem. Let T be a stopping time and X a supermartingale. Then the stopped variable X_T is integrable and E(X_T) ≤ E(X_0) in each of the following situations:
1. T is bounded;
2. X is bounded, and T is a.s. finite;
3. E(T) < ∞ and, for some K ∈ R_+, |X_n(ω) − X_{n−1}(ω)| ≤ K for all (n, ω).

Proof. First, because X is a supermartingale, we have E(X_{T∧n} − X_0) ≤ 0.
1. T bounded means there exists N ∈ N with T(ω) ≤ N for all ω. Then we can just take n = N, which proves it.
2. For situations 2 and 3, T need not be bounded; we use conclusions from the convergence theorems. If X is bounded, we can use the bounded convergence theorem to get E(X_{T∧n}) → E(X_T).
3. |X_{T∧n} − X_0| = |Σ_{k=1}^{T∧n}(X_k − X_{k−1})| ≤ KT, thus |X_{T∧n}| ≤ KT + |X_0|. Using the dominated convergence theorem, we get E(X_{T∧n}) → E(X_T), thus proving the theorem.

Relation between the three conditions. From the viewpoint of T: "T a.s. finite" requires T(ω) < ∞ outside a null set; E(T) < ∞ is strictly stronger — it implies T < ∞ a.s., but not conversely (the hitting time above is a.s. finite yet has E(T) = ∞).
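The "write some simulation here" TODO above can be filled with a small exact computation (my sketch): evolve the distribution of the walk stopped at T = inf{n : X_n = 1}. The stopped mean stays exactly 0 for every n, yet mass steadily accumulates at 1, which is why E(X_T) = 1 ≠ 0 in the limit.

```python
from collections import defaultdict
from fractions import Fraction

# Simple random walk from 0, absorbed at the first hit of +1.
# dist maps (stopped) position -> probability; Fractions keep it exact.
dist = {0: Fraction(1)}
half = Fraction(1, 2)
for _ in range(30):
    new = defaultdict(Fraction)
    for pos, p in dist.items():
        if pos == 1:                  # already stopped: T has happened
            new[1] += p
        else:
            new[pos + 1] += p * half
            new[pos - 1] += p * half
    dist = dict(new)

mean = sum(pos * p for pos, p in dist.items())
p_stopped = dist[1]                   # P(T <= 30)
print(mean, float(p_stopped))
# mean is exactly 0 (the stopped process is still a martingale),
# while P(T <= 30) is already above 0.8
```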
TODO: an example of an a.s.-finite variable with infinite expectation — such as the absolute value of an inverse function, which takes large values near its zero points.

Martingale Convergence Theorem

Preparatory knowledge 1: the Upcrossing Theorem.

Upcrossing. "Upcrossing" is short for the number U_N[a,b](ω) of upcrossings of [a,b] made by n ↦ X_n(ω) by time N. It is defined to be the largest k such that we can find 0 ≤ s_1 < t_1 < s_2 < t_2 < ... < s_k < t_k ≤ N with X_{s_i} < a, X_{t_i} > b (1 ≤ i ≤ k).
Upcrossing inequality:

Y_N(ω) ≥ (b − a)U_N[a,b](ω) − [X_N(ω) − a]⁻. (33)

Doob's Upcrossing Lemma. Let X be a supermartingale and U_N[a,b] the number of upcrossings of [a,b] by time N. Then (b − a)E(U_N[a,b]) ≤ E[(X_N − a)⁻].

Proof. Because X is a supermartingale, the martingale transform Y = C · X by a previsible, bounded process C is also a supermartingale, so E(Y_N) ≤ 0; together with (33) this proves the lemma.

Preparatory knowledge 2: corollary — infinitely many crossings are impossible for a supermartingale bounded in L¹. Let X be a supermartingale bounded in L¹, in that sup_n E(|X_n|) < ∞. Let a, b ∈ R, a < b. Then, with U_∞[a,b] := lim_N U_N[a,b],

(b − a)E(U_∞[a,b]) ≤ |a| + sup_n E(|X_n|) < ∞,

so that P(U_∞[a,b] = ∞) = 0.

Proof: Based on Doob's upcrossing lemma, we have (b − a)E(U_N[a,b]) ≤ |a| + E(|X_N|) ≤ |a| + sup_n E(|X_n|). Let N → ∞ and use the MCT to prove the corollary.

Doob's Forward Convergence Theorem. Main trunk of the theorem: let X be a supermartingale bounded in L¹, sup_n E(|X_n|) < ∞. Then, almost surely, X_∞ := lim X_n exists and is finite.

Proof.

Λ := {ω : X_n(ω) does not converge to a limit in [−∞, ∞]} (34)
  = {ω : liminf X_n(ω) < limsup X_n(ω)} (35)
  = ∪_{a,b ∈ Q: a<b} {ω : liminf X_n(ω) < a < b < limsup X_n(ω)} (36)
  = ∪_{a,b} Λ_{a,b}. (37)

Based on the above corollary, we have P(Λ_{a,b}) = 0, and Λ is a countable union of the sets Λ_{a,b} (the pairs a < b run over the rationals), thus P(Λ) = 0, whence X_∞ = lim X_n exists a.s. in [−∞, ∞].

As for the finiteness part: from Fatou's lemma, we have

E(|X_∞|) = E(liminf |X_n|) ≤ liminf E(|X_n|) ≤ sup_n E(|X_n|) < ∞, (38)

so that P(X_∞ is finite) = 1. End of proof. For definiteness, we define X_∞(ω) = limsup X_n(ω) for all ω, so that X_∞ is F_∞-measurable and X_∞ = lim X_n a.s.

Doob's decomposition. Let (X_n : n ∈ Z_+) be an adapted process with X_n ∈ L¹ for all n. Then X has a Doob decomposition

X = X_0 + M + A, (42)

where M is a martingale null at 0, and A is a previsible process null at 0 (which means A_0 = 0 and A_n is F_{n−1}-measurable).
Moreover, this decomposition is unique modulo indistinguishability, in the sense that if X = X_0 + M̃ + Ã is another such decomposition, then P(M_n = M̃_n, A_n = Ã_n, ∀n) = 1 (which means almost surely). X is a submartingale if and only if the process A is an increasing process, in the sense that P(A_n ≤ A_{n+1}) = 1.

Proof.

E(X_n − X_{n−1} | F_{n−1}) = E(M_n − M_{n−1} | F_{n−1}) + E(A_n − A_{n−1} | F_{n−1}) (43)
                         = 0 + (A_n − A_{n−1}). (44)

We get A_n = Σ_{k=1}^n E(X_k − X_{k−1} | F_{k−1}) a.s., and thus the decomposition of X.
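Doob's upcrossing lemma above can be sanity-checked by exact enumeration (my sketch; the down-biased walk with p < 1/2 is my choice of supermartingale, and the path weights are computed exactly):

```python
from itertools import product

def upcrossings(xs, a, b):
    """Number of completed upcrossings of [a, b] by the path xs."""
    count, looking_for_low = 0, True
    for x in xs:
        if looking_for_low and x < a:
            looking_for_low = False
        elif not looking_for_low and x > b:
            count += 1
            looking_for_low = True
    return count

p, q = 0.4, 0.6          # P(step +1) = 0.4 < 0.6: a supermartingale walk
N = 12
a, b = -1, 1
lhs = rhs = 0.0
for steps in product((+1, -1), repeat=N):
    prob = 1.0
    xs, x = [0], 0
    for s in steps:
        prob *= p if s > 0 else q
        x += s
        xs.append(x)
    lhs += prob * upcrossings(xs, a, b)       # E U_N[a,b]
    rhs += prob * max(a - xs[-1], 0)          # E (X_N - a)^-
print((b - a) * lhs, rhs)  # Doob: (b-a) E U_N[a,b] <= E (X_N - a)^-
```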
0.6 Gamble

TODO, 10.12: the hitting time for the simple random walk.

Most games are super-martingales for the player: the expected total money at the next turn is less than the current money we have. Although the expected total money will surely decrease, we still have the opportunity to win. So what strategy of betting can we use to gain the most?

Special Case with Initial 5,000 and Target 20,000

Specific problem: given initial money 5,000, we want to reach the goal of 20,000. The payoff is 4:1, and the probability of winning each turn is p; we know that this p guarantees that the process is a super-martingale. You will survive if you reach 20,000, or you will die — which means you must reach 20,000 before the money in your hand reaches 0.

We need to prove that P(Y_n ≥ 5000) is decreasing as n increases. But this is not very easy; it depends on what the value of C_n is.

Timid first and then bold play. We play timidly for the first turn and boldly for the remaining turns. If we bet 1 in the first turn, the success probability is:

U_1 = p·V(5003) + q·V(4999) (45)
    = p·V(5003) + q·p·V(19996) (46)
    = p·(p + q·V(4)) + q·p·V(19996) (47)
    = p·(p + q·(V(4) + V(19996))). (48)

Similarly, betting n in the first turn, we can get:

U_n = p·(p + q·(V(4n) + V(20000 − 4n))). (49)

The next thing we need to do is check the relation between V(4n) + V(20000 − 4n) and 1: if the sum is at most 1, then U_n ≤ p·(p + q) = p = V(5000), so the timid first bet cannot improve on pure bold play. What we know now is V(5000) = p, V(0) = 0, V(20000) = 1. We can get a sense of the relation from a graphical view. TODO: the real function is NOT KNOWN yet; this graph is only used for illustration. TODO: replace Figure 1 with a simulated result.

Proof that V(4n) + V(20000 − 4n) ≤ 1. The key to this might still be convexity in a local range: the points 20000/4^k, below 5000, lie below the fair line; the control points above 5000 are a_{n+1} = (3a_n + 20000)/4, a_0 = 5000 (the inverse of the losing step in (56)).

Prove that V(a_n) is below the fair line given that V(a_{n−1}) is below the fair line. T is short for the target (total) money, 20000.
V(a_n) − a_n/T = q·V((4a_n − T)/3) + p − a_n/T (50)
             ≤ q·(4a_n − T)/(3T) + p − a_n/T (51)
             = (q·(4a_n − T) + 3pT − 3a_n) / (3T) (52)
             = (a_n − T)(4q − 3) / (3T) ≤ 0, (53)

where (51) uses the induction hypothesis, since (4a_n − T)/3 = a_{n−1} and V(a_{n−1}) ≤ a_{n−1}/T, and (53) is non-positive since a_n < T and 4q − 3 ≥ 0 for the unfair game (p ≤ 1/4).

Now we have proved that, through the point mutation, the unfair-game points lie below the fair line. TODO: the next thing is to prove that I can use the two-number mutation method to reach all numbers.
[Figure 1: Illustration of the success function under different probabilities p = 0.8, p = 0.25, p = 0.1.]

In Figure 1, the green line is the fair game. For the fair game, we have V(4n) + V(20000 − 4n) = 1. For an unfair game, the success function always lies on or below the fair line, which means V(4n) + V(20000 − 4n) < 1. We get the conclusion that, when x = 5000, a = 20000 and the payoff is 4:1:

p·V(x + 3y) + q·V(x − y) ≤ V(x), y ∈ [1, 5000). (54)

Proof of the convexity when p < 0.25. The first thought is to use a convexity property of the success function to prove the validity of inequality (54). But we later found that convexity does not hold for the continuous version of the success function. We have the following transformation formulas:

V(x) = p·V(4x), if x ≤ 5000 (55)
V(x) = q·V(x − (20000 − x)/3) + p, if x > 5000 (56)

Given the boundary values V(0) = 0, V(20000) = 1, we can get a brief overview of the function, as shown in Figure 2. We can see that the success function is not consistently convex in the range [0, 20000], which agrees with the result from Vesta¹.

Proof of the sufficient condition for an optimal strategy. This part is from uah [1].

Statement: a strategy S with success function V is optimal if

p·V(x + 3y) + q·V(x − y) ≤ V(x), x ∈ A, y ∈ B_x. (57)

Proof: Suppose there is a strategy S whose success function V satisfies inequality (57). For any other strategy S′ with success function U, we define the random variable V(Y_n), which can be interpreted as the probability of winning if the gambler's strategy is replaced by strategy S at time n.

E[V(Y_n) | Y_0 = x] = E[p·V(Y_{n−1} + 3C_n) + q·V(Y_{n−1} − C_n) | Y_0 = x]. (58)

Because V satisfies (57), we have

E[V(Y_n) | Y_0 = x] ≤ E[V(Y_{n−1}) | Y_0 = x], n ∈ N_+, x ∈ A. (59)

From (59), we have

E[V(Y_n) | Y_0 = x] ≤ E[V(Y_{n−1}) | Y_0 = x] ≤ ... ≤ E[V(Y_0) | Y_0 = x], n ∈ N_+, x ∈ A, (60)

¹ Labs/redblack/redblack5.html
[Figure 2: Results of the forward simulation.]

which means

E[V(Y_n) | Y_0 = x] ≤ V(x), n ∈ N_+, x ∈ A. (61)

We define the stopping time for strategy S′ as N = inf{n : Y_n = 0 or Y_n = 20,000}. Then we have

E[V(Y_N) | Y_0 = x] ≤ V(x), x ∈ A. (62)

By definition, E[V(Y_N) | Y_0 = x] = U(x), thus U(x) ≤ V(x) for x ∈ A, which means S is an optimal strategy.

Tricks

We know some tricks: in order to survive, we must upcross the line of 20,000. The upcross means that we should jump from below 20,000 to at or above 20,000 in one step, which happens with probability p. Meanwhile, the probability of being able to land exactly on 20,000 might be less than 1, because the payoff is 4:1.

Martingale strategy in the gambling field: the game is a heads-or-tails coin toss, with payoff 2:1. The martingale strategy is to double your bet when you lose. If you have an infinite amount of money, you can always win back what you have lost.

Markov-process view of the gambling process: what the teacher showed us has only the states from 0 to the target value. Actually, we can also add states with values larger than the target value. These extra states are similar to the target state, having only an edge from themselves to themselves. The problem with this method is the huge computational cost, which is about O(n³).

Reference

[1]

Martingale Problem Formulation

TODO: how to calculate the expectation with limited initial money? We can express it in the following way. We have an initial amount of money x, and our target is a. Let X_n − X_{n−1} be the net winnings per unit bet in the n-th turn. {C_n, n > 0} is the bet in the n-th turn, which is our strategy C. Y_n = Σ_{k=1}^n C_k(X_k − X_{k−1}) + x is the money at time n. Let F_n = σ{X_0, X_1, ..., X_n};
both {X_n} and {Y_n} are adapted to the filtration F_n. We know that X_n is a super-martingale, thus Y_n is also a super-martingale. We define the stopping time as N = inf{n : Y_n = 0 or a}. The objective function for our problem is to maximize

lim_n E(Y_{N∧n} | Y_0 = x). (63)

The most troublesome symbol is C_n, which has no restriction other than no bold play, meaning 1 ≤ C_n ≤ Y_{n−1}.

Property of the success function under the bold-play strategy. Suppose that our target is money a and the success function is V : A → [0,1]. If x ≥ a/4, we only need to bet (a − x)/3; even if we lose, we still have money x − (a − x)/3. We have the equation

V(x) = p + q·V(x − (a − x)/3), x ≥ a/4. (64)

The above process proceeds until x goes below a/4. If x < a/4, we need to bet all the money we have, as we are using the bold strategy. We have

V(x) = p·V(4x), x < a/4. (65)

This process proceeds until x goes above a/4.
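The bold-play recursion (64)–(65) can be evaluated numerically by depth-limited recursion (a sketch with names of my own; returning 0 at the depth cutoff only under-estimates V, and the truncation error shrinks geometrically since each level multiplies by p or q < 1):

```python
def bold_play_value(x, a=20000.0, p=0.2, depth=60):
    """Success probability V(x) of bold play toward target a, following the
    recursion V(x) = p*V(4x) for x < a/4 and
    V(x) = p + q*V(x - (a - x)/3) for x >= a/4 (depth-limited)."""
    q = 1.0 - p
    if x <= 0:
        return 0.0
    if x >= a:
        return 1.0
    if depth == 0:
        return 0.0                    # truncation: error at most max(p, q)**60
    if x < a / 4:
        return p * bold_play_value(4 * x, a, p, depth - 1)
    return p + q * bold_play_value(x - (a - x) / 3, a, p, depth - 1)

# Boundary checks from the notes: V(0) = 0, V(20000) = 1, V(5000) = p.
print(bold_play_value(5000.0), bold_play_value(10000.0))
```

Note that V(5000) = p + q·V(0) = p falls out of (64) in a single step, matching the value used in the timid-versus-bold comparison.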
0.7 Knowledge Graph

[Figure: knowledge graph linking the topics above — Monotone Class Theorem, Countable Additivity, ...]
More informationExercises Measure Theoretic Probability
Exercises Measure Theoretic Probability 2002-2003 Week 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary
More information. Find E(V ) and var(v ).
Math 6382/6383: Probability Models and Mathematical Statistics Sample Preliminary Exam Questions 1. A person tosses a fair coin until she obtains 2 heads in a row. She then tosses a fair die the same number
More informationJUSTIN HARTMANN. F n Σ.
BROWNIAN MOTION JUSTIN HARTMANN Abstract. This paper begins to explore a rigorous introduction to probability theory using ideas from algebra, measure theory, and other areas. We start with a basic explanation
More informationProbability Theory Review
Cogsci 118A: Natural Computation I Lecture 2 (01/07/10) Lecturer: Angela Yu Probability Theory Review Scribe: Joseph Schilz Lecture Summary 1. Set theory: terms and operators In this section, we provide
More information4 Expectation & the Lebesgue Theorems
STA 205: Probability & Measure Theory Robert L. Wolpert 4 Expectation & the Lebesgue Theorems Let X and {X n : n N} be random variables on a probability space (Ω,F,P). If X n (ω) X(ω) for each ω Ω, does
More informationLecture 22: Variance and Covariance
EE5110 : Probability Foundations for Electrical Engineers July-November 2015 Lecture 22: Variance and Covariance Lecturer: Dr. Krishna Jagannathan Scribes: R.Ravi Kiran In this lecture we will introduce
More informationMATH 418: Lectures on Conditional Expectation
MATH 418: Lectures on Conditional Expectation Instructor: r. Ed Perkins, Notes taken by Adrian She Conditional expectation is one of the most useful tools of probability. The Radon-Nikodym theorem enables
More informationNotes 1 : Measure-theoretic foundations I
Notes 1 : Measure-theoretic foundations I Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Wil91, Section 1.0-1.8, 2.1-2.3, 3.1-3.11], [Fel68, Sections 7.2, 8.1, 9.6], [Dur10,
More informationP (A G) dp G P (A G)
First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume
More informationPart III Advanced Probability
Part III Advanced Probability Based on lectures by M. Lis Notes taken by Dexter Chua Michaelmas 2017 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after
More informationSTAT 430/510 Probability Lecture 7: Random Variable and Expectation
STAT 430/510 Probability Lecture 7: Random Variable and Expectation Pengyuan (Penelope) Wang June 2, 2011 Review Properties of Probability Conditional Probability The Law of Total Probability Bayes Formula
More informationExercise Exercise Homework #6 Solutions Thursday 6 April 2006
Unless otherwise stated, for the remainder of the solutions, define F m = σy 0,..., Y m We will show EY m = EY 0 using induction. m = 0 is obviously true. For base case m = : EY = EEY Y 0 = EY 0. Now assume
More informationDynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor)
Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor) Matija Vidmar February 7, 2018 1 Dynkin and π-systems Some basic
More informationRandom Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R
In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample
More informationProbability Theory II. Spring 2016 Peter Orbanz
Probability Theory II Spring 2016 Peter Orbanz Contents Chapter 1. Martingales 1 1.1. Martingales indexed by partially ordered sets 1 1.2. Martingales from adapted processes 4 1.3. Stopping times and
More informationMATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation)
MATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation) Last modified: March 7, 2009 Reference: PRP, Sections 3.6 and 3.7. 1. Tail-Sum Theorem
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More information1 Sequences of events and their limits
O.H. Probability II (MATH 2647 M15 1 Sequences of events and their limits 1.1 Monotone sequences of events Sequences of events arise naturally when a probabilistic experiment is repeated many times. For
More informationNotes 15 : UI Martingales
Notes 15 : UI Martingales Math 733 - Fall 2013 Lecturer: Sebastien Roch References: [Wil91, Chapter 13, 14], [Dur10, Section 5.5, 5.6, 5.7]. 1 Uniform Integrability We give a characterization of L 1 convergence.
More informationProbability and Measure
Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability
More informationNotes on Measure, Probability and Stochastic Processes. João Lopes Dias
Notes on Measure, Probability and Stochastic Processes João Lopes Dias Departamento de Matemática, ISEG, Universidade de Lisboa, Rua do Quelhas 6, 1200-781 Lisboa, Portugal E-mail address: jldias@iseg.ulisboa.pt
More informationLecture 4 An Introduction to Stochastic Processes
Lecture 4 An Introduction to Stochastic Processes Prof. Massimo Guidolin Prep Course in Quantitative Methods for Finance August-September 2017 Plan of the lecture Motivation and definitions Filtrations
More informationPart II Probability and Measure
Part II Probability and Measure Theorems Based on lectures by J. Miller Notes taken by Dexter Chua Michaelmas 2016 These notes are not endorsed by the lecturers, and I have modified them (often significantly)
More informationAn essay on the general theory of stochastic processes
Probability Surveys Vol. 3 (26) 345 412 ISSN: 1549-5787 DOI: 1.1214/1549578614 An essay on the general theory of stochastic processes Ashkan Nikeghbali ETHZ Departement Mathematik, Rämistrasse 11, HG G16
More informationStochastic Processes II/ Wahrscheinlichkeitstheorie III. Lecture Notes
BMS Basic Course Stochastic Processes II/ Wahrscheinlichkeitstheorie III Michael Scheutzow Lecture Notes Technische Universität Berlin Sommersemester 218 preliminary version October 12th 218 Contents
More informationPROBABILITY THEORY II
Ruprecht-Karls-Universität Heidelberg Institut für Angewandte Mathematik Prof. Dr. Jan JOHANNES Outline of the lecture course PROBABILITY THEORY II Summer semester 2016 Preliminary version: April 21, 2016
More informationStochastic integration. P.J.C. Spreij
Stochastic integration P.J.C. Spreij this version: April 22, 29 Contents 1 Stochastic processes 1 1.1 General theory............................... 1 1.2 Stopping times...............................
More informationStochastic Processes. Winter Term Paolo Di Tella Technische Universität Dresden Institut für Stochastik
Stochastic Processes Winter Term 2016-2017 Paolo Di Tella Technische Universität Dresden Institut für Stochastik Contents 1 Preliminaries 5 1.1 Uniform integrability.............................. 5 1.2
More informationIf Y and Y 0 satisfy (1-2), then Y = Y 0 a.s.
20 6. CONDITIONAL EXPECTATION Having discussed at length the limit theory for sums of independent random variables we will now move on to deal with dependent random variables. An important tool in this
More informationLecture 21 Representations of Martingales
Lecture 21: Representations of Martingales 1 of 11 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 21 Representations of Martingales Right-continuous inverses Let
More informationNotes on Measure Theory. Let A 2 M. A function µ : A [0, ] is finitely additive if, A j ) =
Notes on Measure Theory Definitions and Facts from Topic 1500 For any set M, 2 M := {subsets of M} is called the power set of M. The power set is the set of all sets. Let A 2 M. A function µ : A [0, ]
More information3. DISCRETE RANDOM VARIABLES
IA Probability Lent Term 3 DISCRETE RANDOM VARIABLES 31 Introduction When an experiment is conducted there may be a number of quantities associated with the outcome ω Ω that may be of interest Suppose
More informationADVANCED PROBABILITY: SOLUTIONS TO SHEET 1
ADVANCED PROBABILITY: SOLUTIONS TO SHEET 1 Last compiled: November 6, 213 1. Conditional expectation Exercise 1.1. To start with, note that P(X Y = P( c R : X > c, Y c or X c, Y > c = P( c Q : X > c, Y
More informationSelected Exercises on Expectations and Some Probability Inequalities
Selected Exercises on Expectations and Some Probability Inequalities # If E(X 2 ) = and E X a > 0, then P( X λa) ( λ) 2 a 2 for 0 < λ
More informationSTOR 635 Notes (S13)
STOR 635 Notes (S13) Jimmy Jin UNC-Chapel Hill Last updated: 1/14/14 Contents 1 Measure theory and probability basics 2 1.1 Algebras and measure.......................... 2 1.2 Integration................................
More informationReview of Basic Probability Theory
Review of Basic Probability Theory James H. Steiger Department of Psychology and Human Development Vanderbilt University James H. Steiger (Vanderbilt University) 1 / 35 Review of Basic Probability Theory
More informationn! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2
Order statistics Ex. 4. (*. Let independent variables X,..., X n have U(0, distribution. Show that for every x (0,, we have P ( X ( < x and P ( X (n > x as n. Ex. 4.2 (**. By using induction or otherwise,
More informationSurvival Analysis: Counting Process and Martingale. Lu Tian and Richard Olshen Stanford University
Survival Analysis: Counting Process and Martingale Lu Tian and Richard Olshen Stanford University 1 Lebesgue-Stieltjes Integrals G( ) is a right-continuous step function having jumps at x 1, x 2,.. b f(x)dg(x)
More information(A n + B n + 1) A n + B n
344 Problem Hints and Solutions Solution for Problem 2.10. To calculate E(M n+1 F n ), first note that M n+1 is equal to (A n +1)/(A n +B n +1) with probability M n = A n /(A n +B n ) and M n+1 equals
More information4 Expectation & the Lebesgue Theorems
STA 7: Probability & Measure Theory Robert L. Wolpert 4 Expectation & the Lebesgue Theorems Let X and {X n : n N} be random variables on the same probability space (Ω,F,P). If X n (ω) X(ω) for each ω Ω,
More informationSolution for Problem 7.1. We argue by contradiction. If the limit were not infinite, then since τ M (ω) is nondecreasing we would have
362 Problem Hints and Solutions sup g n (ω, t) g(ω, t) sup g(ω, s) g(ω, t) µ n (ω). t T s,t: s t 1/n By the uniform continuity of t g(ω, t) on [, T], one has for each ω that µ n (ω) as n. Two applications
More informationFundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales
Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales Prakash Balachandran Department of Mathematics Duke University April 2, 2008 1 Review of Discrete-Time
More informationPlan Martingales cont d. 0. Questions for Exam 2. More Examples 3. Overview of Results. Reading: study Next Time: first exam
Plan Martingales cont d 0. Questions for Exam 2. More Examples 3. Overview of Results Reading: study Next Time: first exam Midterm Exam: Tuesday 28 March in class Sample exam problems ( Homework 5 and
More informationMarkov Chains CK eqns Classes Hitting times Rec./trans. Strong Markov Stat. distr. Reversibility * Markov Chains
Markov Chains A random process X is a family {X t : t T } of random variables indexed by some set T. When T = {0, 1, 2,... } one speaks about a discrete-time process, for T = R or T = [0, ) one has a continuous-time
More informationA PECULIAR COIN-TOSSING MODEL
A PECULIAR COIN-TOSSING MODEL EDWARD J. GREEN 1. Coin tossing according to de Finetti A coin is drawn at random from a finite set of coins. Each coin generates an i.i.d. sequence of outcomes (heads or
More informationNotes on Stochastic Calculus
Notes on Stochastic Calculus David Nualart Kansas University nualart@math.ku.edu 1 Stochastic Processes 1.1 Probability Spaces and Random Variables In this section we recall the basic vocabulary and results
More informationthe time it takes until a radioactive substance undergoes a decay
1 Probabilities 1.1 Experiments with randomness Wewillusethetermexperimentinaverygeneralwaytorefertosomeprocess that produces a random outcome. Examples: (Ask class for some first) Here are some discrete
More information1 Gambler s Ruin Problem
1 Gambler s Ruin Problem Consider a gambler who starts with an initial fortune of $1 and then on each successive gamble either wins $1 or loses $1 independent of the past with probabilities p and q = 1
More informationWeek 12-13: Discrete Probability
Week 12-13: Discrete Probability November 21, 2018 1 Probability Space There are many problems about chances or possibilities, called probability in mathematics. When we roll two dice there are possible
More informationNotes 13 : Conditioning
Notes 13 : Conditioning Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Wil91, Sections 0, 4.8, 9, 10], [Dur10, Section 5.1, 5.2], [KT75, Section 6.1]. 1 Conditioning 1.1 Review
More informationRandom variables (discrete)
Random variables (discrete) Saad Mneimneh 1 Introducing random variables A random variable is a mapping from the sample space to the real line. We usually denote the random variable by X, and a value that
More informationMeasure and integration
Chapter 5 Measure and integration In calculus you have learned how to calculate the size of different kinds of sets: the length of a curve, the area of a region or a surface, the volume or mass of a solid.
More informationStochastic Calculus (Lecture #3)
Stochastic Calculus (Lecture #3) Siegfried Hörmann Université libre de Bruxelles (ULB) Spring 2014 Outline of the course 1. Stochastic processes in continuous time. 2. Brownian motion. 3. Itô integral:
More informationLecture 11. Probability Theory: an Overveiw
Math 408 - Mathematical Statistics Lecture 11. Probability Theory: an Overveiw February 11, 2013 Konstantin Zuev (USC) Math 408, Lecture 11 February 11, 2013 1 / 24 The starting point in developing the
More information1 Analysis of Markov Chains
1 Analysis of Markov Chains 1.1 Martingales Martingales are certain sequences of dependent random variables which have found many applications in probability theory. In order to introduce them it is useful
More informationSTAT331 Lebesgue-Stieltjes Integrals, Martingales, Counting Processes
STAT331 Lebesgue-Stieltjes Integrals, Martingales, Counting Processes This section introduces Lebesgue-Stieltjes integrals, and defines two important stochastic processes: a martingale process and a counting
More informationCS 361: Probability & Statistics
February 19, 2018 CS 361: Probability & Statistics Random variables Markov s inequality This theorem says that for any random variable X and any value a, we have A random variable is unlikely to have an
More informationBrownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539
Brownian motion Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Brownian motion Probability Theory
More informationRS Chapter 1 Random Variables 6/5/2017. Chapter 1. Probability Theory: Introduction
Chapter 1 Probability Theory: Introduction Basic Probability General In a probability space (Ω, Σ, P), the set Ω is the set of all possible outcomes of a probability experiment. Mathematically, Ω is just
More informationConstruction of a general measure structure
Chapter 4 Construction of a general measure structure We turn to the development of general measure theory. The ingredients are a set describing the universe of points, a class of measurable subsets along
More informationECE534, Spring 2018: Solutions for Problem Set #4 Due Friday April 6, 2018
ECE534, Spring 2018: s for Problem Set #4 Due Friday April 6, 2018 1. MMSE Estimation, Data Processing and Innovations The random variables X, Y, Z on a common probability space (Ω, F, P ) are said to
More information180B Lecture Notes, W2011
Bruce K. Driver 180B Lecture Notes, W2011 January 11, 2011 File:180Lec.tex Contents Part 180B Notes 0 Course Notation List......................................................................................................................
More informationTheorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension. n=1
Chapter 2 Probability measures 1. Existence Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension to the generated σ-field Proof of Theorem 2.1. Let F 0 be
More informationRandom Variables. Statistics 110. Summer Copyright c 2006 by Mark E. Irwin
Random Variables Statistics 110 Summer 2006 Copyright c 2006 by Mark E. Irwin Random Variables A Random Variable (RV) is a response of a random phenomenon which is numeric. Examples: 1. Roll a die twice
More information17. Convergence of Random Variables
7. Convergence of Random Variables In elementary mathematics courses (such as Calculus) one speaks of the convergence of functions: f n : R R, then lim f n = f if lim f n (x) = f(x) for all x in R. This
More informationHomework 4 Solution, due July 23
Homework 4 Solution, due July 23 Random Variables Problem 1. Let X be the random number on a die: from 1 to. (i) What is the distribution of X? (ii) Calculate EX. (iii) Calculate EX 2. (iv) Calculate Var
More informationLecture 11: Random Variables
EE5110: Probability Foundations for Electrical Engineers July-November 2015 Lecture 11: Random Variables Lecturer: Dr. Krishna Jagannathan Scribe: Sudharsan, Gopal, Arjun B, Debayani The study of random
More informationEECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015.
EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015 Midterm Exam Last name First name SID Rules. You have 80 mins (5:10pm - 6:30pm)
More informationDeep Learning for Computer Vision
Deep Learning for Computer Vision Lecture 3: Probability, Bayes Theorem, and Bayes Classification Peter Belhumeur Computer Science Columbia University Probability Should you play this game? Game: A fair
More informationABSTRACT EXPECTATION
ABSTRACT EXPECTATION Abstract. In undergraduate courses, expectation is sometimes defined twice, once for discrete random variables and again for continuous random variables. Here, we will give a definition
More informationPart 2 Continuous functions and their properties
Part 2 Continuous functions and their properties 2.1 Definition Definition A function f is continuous at a R if, and only if, that is lim f (x) = f (a), x a ε > 0, δ > 0, x, x a < δ f (x) f (a) < ε. Notice
More informationRandom Walks and Quantum Walks
Random Walks and Quantum Walks Stephen Bartlett, Department of Physics and Centre for Advanced Computing Algorithms and Cryptography, Macquarie University Random Walks and Quantum Walks Classical random
More informationBrownian Motion and Conditional Probability
Math 561: Theory of Probability (Spring 2018) Week 10 Brownian Motion and Conditional Probability 10.1 Standard Brownian Motion (SBM) Brownian motion is a stochastic process with both practical and theoretical
More informationExercises Measure Theoretic Probability
Exercises Measure Theoretic Probability Chapter 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary family
More informationErgodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R.
Ergodic Theorems Samy Tindel Purdue University Probability Theory 2 - MA 539 Taken from Probability: Theory and examples by R. Durrett Samy T. Ergodic theorems Probability Theory 1 / 92 Outline 1 Definitions
More information