
MA PROBLEM LIST
DISCRETE TIME PROCESSES AND BROWNIAN MOTION

1. Gaussian vectors and CLT

Problem 1. Let $\gamma_{a,b}$ be the function
$$\gamma_{a,b}(x) = \frac{1}{\Gamma(a)\, b^a}\, x^{a-1} e^{-x/b}\, 1_{\{x>0\}},$$
where $a, b > 0$ and $\Gamma(x) = \int_0^\infty e^{-t} t^{x-1}\, dt$.
1.1. Show that $\gamma_{a,b}$ is a density.
1.2. Let $X$ be a random variable with density $\gamma_{a,b}$. Check, for $\lambda > 0$:
$$E[e^{-\lambda X}] = (1+\lambda b)^{-a}, \qquad E[X] = ab, \qquad \operatorname{Var} X = a b^2.$$
1.3. Let $X$ (resp. $X'$) be a random variable with density $\gamma_{a,b}$ (resp. $\gamma_{a',b}$). We assume $X$ and $X'$ independent. Show that $X + X'$ admits the density $\gamma_{a+a',b}$.
1.4. Application: let $X_1, X_2, \ldots, X_n$ be $n$ i.i.d. random variables with law $N(0,1)$. Show that $X_1^2 + X_2^2 + \cdots + X_n^2$ is Gamma distributed.

Problem 2. Let $X$ be a random variable distributed as $N_1(m, \sigma^2)$.
2.1. Assume $m = 0$. We set $Y = e^{-\alpha X^2}$ with $\alpha \ge 0$. Compute $E[Y^n]$.
2.2. Find the density of $X^{-1}$ when $m = 2$ and $\sigma = 1$.

Problem 3. Let $\epsilon$ be a Rademacher random variable, that is: $P(\epsilon = 1) = P(\epsilon = -1) = 1/2$. Assume that $\epsilon$ is independent of $X$, where $X \sim N_1(0,1)$. Show that the law of $\epsilon X$ is still Gaussian.

Problem 4. Let $X, Y$ be two independent standard Gaussian random variables.
4.1. Show that $X/Y$ is well-defined, and is distributed according to a Cauchy law.
4.2. If $t \ge 0$, compute $P(|X| \le t\,|Y|)$.

Problem 5. If $(X, Y)$ is a centered Gaussian vector in $\mathbb{R}^2$ with $E[X^2] = E[Y^2] = 1$ and if $E[XY] = r$ with $r \in (-1,1)$, compute $P(XY < 0)$. Hint: one can prove and use the following claim: $(X, Y) = (X,\, sX + \sqrt{1-s^2}\, Z)$ with $X, Z \sim N(0,1)$ independent and $s \in (0,1)$ to be determined. Then we invoke the result shown in Problem 4.
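
A quick numerical companion to Problems 1.4 and 4 above (not part of the original problem list): a minimal NumPy sketch comparing empirical quantiles of a sum of squared standard normals with those of a Gamma law, and of a ratio of independent standard normals with those of a standard Cauchy.

```python
# Monte Carlo sanity check for Problems 1.4 and 4 (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)
n, N = 5, 200_000
qs = [0.1, 0.5, 0.9]

# Problem 1.4: a sum of n squared standard normals should be Gamma(n/2, scale 2).
chi = (rng.standard_normal((N, n)) ** 2).sum(axis=1)
gam = rng.gamma(shape=n / 2, scale=2.0, size=N)
print(np.quantile(chi, qs), np.quantile(gam, qs))      # quantiles should agree

# Problem 4: the ratio of two independent standard normals should be Cauchy.
x, y = rng.standard_normal(N), rng.standard_normal(N)
print(np.quantile(x / y, qs), np.quantile(rng.standard_cauchy(N), qs))
```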

Problem 6. Let $X, Y \sim N(0,1)$ be two independent random variables. For all $a \in (-1,1)$, show that:
$$E[\exp(aXY)] = E\Big[\exp\Big(\frac{a}{2} X^2\Big)\Big]\; E\Big[\exp\Big(-\frac{a}{2} Y^2\Big)\Big].$$

Problem 7. Let $X$ and $Y$ be two independent standard Gaussian random variables $N(0,1)$. We set $U = \sqrt{X^2 + Y^2}$ and $V = X/U$. Show that $U$ and $V$ are independent, and compute their laws.

Problem 8. Let $A$ be the matrix defined in the original statement (its entries are not reproduced in this transcription).
8.1. Show that there exists a centered Gaussian vector $G$ with covariance matrix $A$. The coordinates of $G$ are denoted by $X$, $Y$ and $Z$.
8.2. Is $G$ a random variable with density? Compute the characteristic function of $G$.
8.3. Characterize the law of $U = X + Y + Z$.
8.4. Show that $(X - Y,\, X + Z)$ is a Gaussian vector.
8.5. Determine the set of random variables $\xi = aX + bY + cZ$ independent of $U$.

Problem 9. Let $Q$ be a positive definite quadratic form defined on $\mathbb{R}^n$. We introduce a function $f$ given as
$$f(x) = \lambda \exp(-Q(x)), \qquad x \in \mathbb{R}^n.$$
9.1. According to $Q$, compute the unique value $\lambda$ such that $f$ is a density. Hint: show that $f$ can be seen as the density of a Gaussian vector.
9.2. Application: $n = 2$, $Q(x,y) = 3x^2 + y^2 + 2xy$.

Problem 10. Let $X = (X_1, \ldots, X_n)$ be a centered Gaussian vector with covariance matrix $\mathrm{Id}_n$.
10.1. Show that the random vector $(X_1 - \bar X, \ldots, X_n - \bar X)$ is independent of $\bar X$, where $\bar X = \frac{1}{n} \sum_{i=1}^n X_i$.
10.2. Deduce that the random variables $\bar X$ and $W = \max_{1\le i\le n} X_i - \min_{1\le i\le n} X_i$ are independent. Why is this result (somewhat) surprising?

Problem 11. A restaurant can serve 75 meals. In practice, it has been established that 20% of customers with a reservation do not show up.
11.1. The restaurant owner has accepted 90 reservations. What is the probability that more than 65 persons will come?
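
A numerical companion to Problem 11 (not part of the original list). It relies on the figures as reconstructed above (capacity 75, a 20% no-show rate, 90 accepted reservations) and compares the exact binomial answer with the CLT approximation the problem has in mind; SciPy is assumed to be available.

```python
# Numerical companion to Problem 11, using the reconstructed figures.
import numpy as np
from scipy.stats import binom, norm

p, n = 0.8, 90          # probability of showing up, number of reservations

# 11.1: P(more than 65 persons come), exactly and via the CLT approximation.
exact = binom.sf(65, n, p)
clt = norm.sf((65.5 - n * p) / np.sqrt(n * p * (1 - p)))   # continuity correction
print(exact, clt)

# 11.2: largest m such that P(at most 75 persons come out of m reservations) >= 0.9.
print(max(m for m in range(75, 120) if binom.cdf(75, m, p) >= 0.9))
```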

11.2. What is the maximal number of reservations which can be accepted if we wish to serve all customers with probability 0.9?

Problem 12. Let $(X_n;\, n \ge 1)$ be a sequence of i.i.d. $\mathbb{R}^k$-valued random variables, which are assumed to be square integrable. In the sequel $(\xi_n;\, n \ge 1)$ designates a sequence of i.i.d. bounded real-valued random variables. We assume that $(X_n;\, n \ge 1)$ is independent of $(\xi_n;\, n \ge 1)$ and also that either $X_1$ or $\xi_1$ is centered. We set
$$Y_n = \frac{1}{n^{1/2}} \sum_{i=1}^n \xi_i X_i.$$
Show that $Y_n$ converges in distribution as $n$ goes to infinity. Characterize the limiting law.

Problem 13. The aim of this problem is to show that the Laplace transform characterizes probability laws on $\mathbb{R}_+$. To this aim, for all $t > 0$ and $x > 0$ we set
$$a_n(x,t) = \int_0^{n/t} \frac{y^{n-1} x^n}{(n-1)!}\, e^{-yx}\, dy.$$
13.1. Invoking the law of large numbers (resp. the central limit theorem), show that
$$\lim_{n\to\infty} a_n(x,t) = \begin{cases} 1_{\{x > t\}} & \text{if } x \ne t, \\ \tfrac{1}{2} & \text{if } x = t. \end{cases}$$
13.2. Let $X$ be a random variable with values in $\mathbb{R}_+$. We set $G(\theta) = E[e^{-\theta X}]$.
(1) Using Question 13.1, show that:
$$\lim_{n\to\infty} (-1)^n \int_0^{n/t} \frac{y^{n-1}}{(n-1)!}\, \frac{d^n G}{dy^n}(y)\, dy = \frac{1}{2}\, P(X = t) + P(X > t).$$
(2) Deduce that $G$ characterizes the distribution of $X$.

Problem 14. Let $X$ and $Y$ be two real valued i.i.d. random variables. We assume that $\frac{X+Y}{\sqrt 2}$ has the same law as $X$ and $Y$. We also suppose that this common law admits a variance, denoted by $\sigma^2$.
14.1. Show that $X$ is a centered random variable.
14.2. Show that if $X_1, X_2, Y_1$ and $Y_2$ are independent random variables having the same law as $X$, then $\frac{1}{2}(X_1 + X_2 + Y_1 + Y_2)$ has the same law as $X$.
14.3. Applying the central limit theorem, show that $X$ is distributed as $N(0, \sigma^2)$.

Problem 15. The aim of this problem is to give an example of application for the multidimensional central limit theorem. Let $(Y_i;\, i \ge 1)$ be a sequence of i.i.d. real valued random variables. We denote by $F$ their common cumulative distribution function and by $\hat F_n$ the empirical cumulative distribution function of the $n$-sample $(Y_1, \ldots, Y_n)$:
$$\hat F_n(x) = \frac{1}{n} \sum_{i=1}^n 1_{\{Y_i \le x\}}, \qquad x \in \mathbb{R}.$$
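
A short numerical illustration of Problem 13 (a sketch, not part of the original list). It uses the fact, which is also the key to the exercise, that $a_n(x,t)$ is the value at $n/t$ of the cumulative distribution function of a Gamma law with shape $n$ and rate $x$, i.e. of a sum of $n$ i.i.d. exponential variables of parameter $x$.

```python
# Numerical illustration for Problem 13: a_n(x, t) is the cdf at n/t of a
# Gamma(n, rate x) law, i.e. of a sum of n i.i.d. Exp(x) random variables.
from scipy.stats import gamma

def a_n(x, t, n):
    return gamma.cdf(n / t, a=n, scale=1.0 / x)

t = 1.0
for n in (10, 100, 1_000, 10_000):
    print(n, a_n(2.0, t, n), a_n(0.5, t, n), a_n(1.0, t, n))
# a_n(2, 1) -> 1 (x > t), a_n(0.5, 1) -> 0 (x < t), a_n(1, 1) -> 1/2 (x = t).
```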

15.1. Let $x$ be a fixed real number. Show that:
(i) $\hat F_n(x)$ converges a.s. to $F(x)$ as $n \to \infty$;
(ii) $\sqrt n\, (\hat F_n(x) - F(x))$ converges in law, as $n \to \infty$, to a centered Gaussian random variable with variance $F(x)(1 - F(x))$.
15.2. We will generalize this result to a multidimensional setting. Let $x_1, x_2, \ldots, x_d$ be a sequence of real numbers such that $x_1 < x_2 < \cdots < x_d$, and let $X_n$ be the random vector in $\mathbb{R}^d$ with coordinates $X_n(1), X_n(2), \ldots, X_n(d)$, where $X_n(i) = 1_{\{Y_n \le x_i\}}$, $1 \le i \le d$, for all $n \ge 1$. Show that
$$\big(\sqrt n\, (\hat F_n(x_1) - F(x_1)), \ldots, \sqrt n\, (\hat F_n(x_d) - F(x_d))\big)$$
converges in law, as $n \to \infty$, to a centered Gaussian vector whose covariance matrix we will compute.

2. Conditional expectation

Problem 16. Let $X_1$ and $X_2$ be two independent random variables, both following a Poisson law with parameter $\lambda$. Let $Y = X_1 + X_2$. Compute $P(X_1 = i \mid Y)$.

Problem 17. Let $(X, Y)$ be a vector of $\mathbb{R}^2$, distributed uniformly over the unit disc. Compute the conditional density of $X$ given $Y$.

Problem 18. Let $(X, Y)$ be a couple of random variables with joint density
$$f(x,y) = 4y(x-y)\, \exp(-(x+y))\, 1_{\{0 \le y \le x\}}.$$
18.1. Compute $E[X \mid Y]$.
18.2. Compute $P(X < 1 \mid Y)$.

Problem 19. Let $X_1$ and $X_2$ be two random variables such that $X_1 = X_2$ on $B \in \mathcal F$. Show that $E[X_1 \mid \mathcal F] = E[X_2 \mid \mathcal F]$ a.s. on $B$.

Problem 20. We consider a head or tail type game, where the probability of getting head (resp. tail) is $p$ (resp. $1-p$), with $0 < p < 1$. Player A tosses the coin. He wins as soon as the number of heads exceeds the number of tails by a quantity of 2. He loses if the number of tails exceeds the number of heads by a quantity of 2. The game is stopped whenever A has won or lost.
20.1. Let $E_n$ be the event: "the game is not over after $2n$ throws", $n \ge 1$. Show that $P(E_n) = r^n$, where $r$ is a real number to be determined.
20.2. Compute the probability that player A wins and show that the game will stop a.s.
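
A simulation sketch for Problem 20 (not part of the original list). Grouping the throws in pairs, one expects $P(E_n) = (2pq)^n$ and $P(\text{A wins}) = p^2/(p^2+q^2)$; the code below checks the winning probability by direct simulation.

```python
# Simulation of the game of Problem 20: A wins at lead +2, loses at lead -2.
import numpy as np

rng = np.random.default_rng(1)
p, N = 0.6, 20_000
wins = 0
for _ in range(N):
    lead = 0
    while abs(lead) < 2:
        lead += 1 if rng.random() < p else -1
    wins += (lead == 2)
print(wins / N, p**2 / (p**2 + (1 - p) ** 2))   # simulated vs. expected value
```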

Problem 21. Let $N$ be a random variable with values in $\{0, 1, \ldots, n\}$; we denote $\alpha_k = P(N = k)$. We consider a sequence $(\epsilon_n;\, n \ge 1)$ of independent random variables, whose common law is given by $P(\epsilon_1 = 1) = p$, $P(\epsilon_1 = 0) = q$, with $p + q = 1$, $p > 0$, $q > 0$. We assume that $N$ is independent of the family $(\epsilon_n;\, n \ge 1)$. We define a random variable $X$ by the relation $X = \sum_{k=1}^N \epsilon_k$.
21.1. Compute the law of $X$. Express $E[X]$ in terms of $E[N]$, and $E[X^2]$ in terms of $E[N]$ and $E[N^2]$.
21.2. Let $p' \in\, ]0,1[$. Determine the law of $N$ if we wish the conditional law of $N$ given $X = 0$ to be a binomial law $B(n, p')$.

Problem 22. We consider the relations:
$$P(X = 0) = \tfrac{1}{3}; \qquad P(X = 2^n) = P(X = -2^n) = \frac{1}{3 \cdot 2^n}, \quad n \ge 1. \tag{1}$$
22.1. Show that the relations (1) define a probability law for a random variable $X$.
22.2. Consider the following transition probability:
$$Q(0, \cdot) = \tfrac{1}{2}(\delta_2 + \delta_{-2}), \qquad Q(x, \cdot) = \tfrac{1}{2}(\delta_0 + \delta_{2x}), \quad x \ne 0,$$
where $\delta_a$ designates the Dirac measure at $a$. Let $Y$ be a second real valued random variable such that the conditional law of $Y$ given $X$ is given by the transition probability $Q$. Show that $E(Y \mid X) = X$ and that $Y$ and $X$ share the same law.

Problem 23. Let $X$ and $Y$ be two real valued and independent random variables, with uniform law on $[0,1]$. We set $U = \inf\{X, Y\}$ and $V = \sup\{X, Y\}$. Compute $E[U \mid V]$ and the best prediction of $U$ by an affine function of $V$.

Problem 24. Let $G \in \mathcal G$. Show that
$$P(G \mid A) = \frac{\int_G P(A \mid \mathcal G)\, dP}{\int_\Omega P(A \mid \mathcal G)\, dP}.$$
This can be seen as a general version of Bayes' formula.

Problem 25. Let $X$ and $Y$ be two random variables such that $X \le Y$, and let $a > 0$.
25.1. Show that $E[X \mid \mathcal F] \le E[Y \mid \mathcal F]$.
25.2. Show that $P(|X| \ge a \mid \mathcal F) \le \frac{E[X^2 \mid \mathcal F]}{a^2}$.

Problem 26. We denote by $\mathcal B_n$ the set of Borel sets of $\mathbb{R}^n$, and let $\mathcal S_n$ be the set of symmetric Borel sets $A$ of $\mathbb{R}^n$, i.e. $A = -A$.
26.1. Show that $\mathcal S_n$ is a sub-$\sigma$-algebra of $\mathcal B_n$ and that a random variable $Y$ is $\mathcal S_n$-measurable if and only if $Y(-x) = Y(x)$.

26.2. We say that a probability measure $P$ over $(\mathbb{R}^n, \mathcal B_n)$ is symmetric if $P(A) = P(-A)$ for all $A$ lying in $\mathcal B_n$. Show that if $\varphi$ is a real valued integrable random variable defined on $(\mathbb{R}^n, \mathcal B_n, P)$, we have:
$$E[\varphi \mid \mathcal S_n](x) = \tfrac{1}{2}\big(\varphi(x) + \varphi(-x)\big).$$
26.3. We assume $n = 1$ and we denote by $X$ the identity application of $\mathbb{R}$ onto $\mathbb{R}$. Determine $E[\varphi \mid |X|]$ and $E[\varphi \mid X^2]$.

Problem 27. Give an example on $\Omega = \{a, b, c\}$ for which
$$E\big[E[X \mid \mathcal F_1] \mid \mathcal F_2\big] \ne E\big[E[X \mid \mathcal F_2] \mid \mathcal F_1\big].$$

Problem 28. Let $X$ and $Y$ be two random variables.
28.1. Show that if $X$ and $Y$ are independent, then $E[X \mid Y] = E[X]$.
28.2. Give an example of random variables with values in $\{-1, 0, 1\}$ such that $X$ and $Y$ are not independent, in spite of the fact that $E[X \mid Y] = E[X]$.

Problem 29. Let $n \ge 1$ be a fixed integer and $p_1, p_2, p_3$ three positive real numbers satisfying $p_1 + p_2 + p_3 = 1$. We set
$$p_{i,j} = \frac{n!}{i!\, j!\, (n-i-j)!}\, p_1^i\, p_2^j\, p_3^{n-i-j}$$
whenever $i + j \le n$, and $p_{i,j} = 0$ if $i + j > n$.
29.1. Show that there exists a couple of random variables $(X, Y)$ such that $P(X = i, Y = j) = p_{i,j}$.
29.2. Determine the law of $X$, the law of $Y$ and the law of $Y$ given $X$, expressed as a conditional regular law.
29.3. Compute $E[XY]$ thanks to the conditional regular law introduced in the previous question.

Problem 30. Let $X_1, X_2, \ldots, X_n$, $n \ge 1$, be some real valued integrable random variables, independent and equally distributed. We set $m = E[X_1]$ and $S_n = \sum_{i=1}^n X_i$.
30.1. Compute $E[S_n \mid X_i]$ for all $i$, $1 \le i \le n$.
30.2. Compute $E[X_i \mid S_n]$ for all $i$, $1 \le i \le n$.
30.3. We assume now that $n = 2$ and that the random variables $X_i$ have a common density $\varphi$. Compute the conditional density of $X_i$ given $S_2$. Give a specific expression whenever the law of each $X_i$ is an exponential law.

Problem 31. The random vector $(X, Y)$ has a density
$$f_{X,Y}(x,y) = \frac{x}{\sqrt{2\pi}}\, \exp\Big(-\tfrac{1}{2}\, x(y+x)\Big)\, 1_{\{x>0\}}\, 1_{\{y>0\}}.$$
Determine the conditional distribution of $Y$ given $X$, expressed as a conditional regular law.

Problem 32. Let $X$ and $Y$ be two real valued random variables, such that $Y$ follows an exponential law with parameter 1. We assume that, given $Y$, $X$ is distributed according to a Poisson law with parameter $Y$ (given as a conditional regular law).
32.1. Compute the law of the couple $(X, Y)$, the law of $X$, and the law of $Y$ given $X$ as a conditional regular law.
32.2. Show that $E[(Y - X)^2] = 1$, conditioning first with respect to $Y$, then integrating with respect to $Y$.

Problem 33. Let $X_1, X_2, \ldots, X_n$ be $n$ real valued independent random variables, admitting a common density $p$.
33.1. Show that for all $i \ne j$, $P(X_i = X_j) = 0$. In the following we set $X_{(1)}, X_{(2)}, \ldots, X_{(n)}$ for the sequence $\{X_1, X_2, \ldots, X_n\}$ arranged in increasing order: $X_{(1)} < X_{(2)} < \cdots < X_{(n)}$.
33.2. Show that $(X_{(1)}, X_{(2)}, \ldots, X_{(n)})$ admits a density of the form
$$f(x_1, x_2, \ldots, x_n) = n!\; p(x_1) p(x_2) \cdots p(x_n)\, 1_{\{x_1 < x_2 < \cdots < x_n\}}.$$
33.3. We assume now that the common law of the random variables $X_i$ is the uniform law on $[a, b]$.
(1) Determine the density of $(X_{(1)}, X_{(n)})$.
(2) We write $\mu_n(a, b; x_1, x_2, \ldots, x_n)$ for the density of $(X_{(1)}, X_{(2)}, \ldots, X_{(n)})$. Show that, conditionally on $X_{(1)}$ and $X_{(n)}$, the vector $(X_{(2)}, X_{(3)}, \ldots, X_{(n-1)})$ admits the conditional regular law given by $\mu_{n-2}(X_{(1)}, X_{(n)}; x_2, x_3, \ldots, x_{n-1})$. Deduce that
$$\Big( \frac{X_{(2)} - X_{(1)}}{X_{(n)} - X_{(1)}}, \ldots, \frac{X_{(n-1)} - X_{(1)}}{X_{(n)} - X_{(1)}} \Big)$$
is a random vector independent of $(X_{(1)}, X_{(n)})$ which possesses the density $\mu_{n-2}(0, 1; \cdot)$.

Problem 34. We consider a measurable space $(\Omega, \mathcal F)$ and some families $\{\mathcal A_i;\, i \le n\}$ of subsets of $\Omega$ such that $\mathcal A_i \subset \mathcal F$.
34.1. Show that if the $\mathcal A_i$ are independent and each one is a $\pi$-system, then the $\sigma$-algebras $\sigma(\mathcal A_i)$ are independent.
34.2. Let $\{X_i;\, i \le n\}$ be a collection of real valued random variables. Show that these random variables are independent if and only if
$$P(X_1 \le x_1, \ldots, X_n \le x_n) = \prod_{i \le n} P(X_i \le x_i)$$
for any vector $(x_1, \ldots, x_n) \in \mathbb{R}^n$.
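
A simulation sketch for Problem 32 (not part of the original list; it takes the exponential parameter equal to 1, as in the reconstruction above). With $Y \sim \mathcal E(1)$ and $X \mid Y \sim \mathcal P(Y)$, one can check that the marginal of $X$ is $P(X = k) = 2^{-(k+1)}$ and that $E[(Y-X)^2] = E[Y] = 1$.

```python
# Simulation sketch for Problem 32 with Y ~ Exp(1) and X | Y ~ Poisson(Y).
import numpy as np

rng = np.random.default_rng(2)
N = 200_000
Y = rng.exponential(scale=1.0, size=N)
X = rng.poisson(Y)
for k in range(4):
    print(k, (X == k).mean(), 2.0 ** -(k + 1))   # marginal of X vs. 2^{-(k+1)}
print(((Y - X) ** 2).mean())                      # should be close to E[Y] = 1
```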

3. Martingales

Problem 35. Let $\{X_n;\, n \ge 1\}$ be a martingale with respect to a filtration $\mathcal G_n$, and let $\mathcal F_n = \sigma\{X_1, \ldots, X_n\}$. Show that $\mathcal F_n \subset \mathcal G_n$ and that $X_n$ is an $\mathcal F_n$-martingale.

Problem 36. We say that $f : \mathbb{R}^d \to \mathbb{R}$ is a super-harmonic function whenever $\Delta f \le 0$. These functions satisfy
$$f(x) \ge \int_{B(x,r)} f(y)\, d\pi(y),$$
where $B(x,r) = \{y;\, |x - y| = r\}$ is the boundary of the ball centered at $x$ with radius $r$, and $\pi$ is the (normalized) surface measure of this boundary. Let $f$ be a super-harmonic function on $\mathbb{R}^d$, and let $\{\xi_n;\, n \ge 1\}$ be a sequence of i.i.d. random variables, with common uniform law on $B(0,1)$. We set $S_n = \xi_n + S_{n-1}$ and $S_0 = x$. Show that $X_n = f(S_n)$ is a supermartingale.

Problem 37. Let $\{\xi_n;\, n \ge 1\}$ be a sequence of independent random variables such that $\xi_j \in L^1(\Omega)$ and $E[\xi_j] = 0$. We set
$$X_n = \sum_{1 \le i_1 < \cdots < i_k \le n} \xi_{i_1} \cdots \xi_{i_k}.$$
Show that $X$ is a martingale.

Problem 38. Let $\{X_n;\, n \ge 1\}$ and $\{Y_n;\, n \ge 1\}$ be two sub-martingales with respect to $\mathcal F_n$. Show that $X_n \vee Y_n$ is a sub-martingale.

Problem 39. Let $\{Y_n;\, n \ge 1\}$ be an i.i.d. sequence of positive random variables such that $E[Y_j] = 1$, $P(Y_j = 1) < 1$ and $P(Y_j = 0) = 0$. We set $X_n = \prod_{j \le n} Y_j$.
39.1. Show that $X$ is a martingale.
39.2. Show that $\lim_n X_n = 0$ a.s.

Problem 40. We wish to study a branching process defined in the following way: let $\{\xi_i^n;\, i, n \ge 1\}$ be a sequence of i.i.d. integer valued random variables. We set $Z_0 = 1$ and, for $n \ge 0$,
$$Z_{n+1} = \Big( \sum_{i=1}^{Z_n} \xi_i^{n+1} \Big)\, 1_{(Z_n > 0)}.$$
This process is called the Galton–Watson process, and represents the number of living individuals at each generation in various biological models. We set
$$\mathcal F_n = \sigma\{\xi_i^m;\, 1 \le m \le n,\, i \ge 1\}, \qquad \mu = E[\xi_i^n].$$
40.1. Show that $\frac{Z_n}{\mu^n}$ is an $\mathcal F_n$-martingale.

40.2. Show that $\frac{Z_n}{\mu^n}$ converges a.s. to a random variable $Z_\infty$.
40.3. We assume now that $\mu < 1$.
(1) Show that $P(Z_n > 0) \le E[Z_n]$.
(2) Show that $Z_n$ converges in probability to $0$.
(3) Show that $Z_n = 0$ for $n$ large enough.
40.4. We assume now that $\mu = 1$ and $P(\xi_i^n = 1) < 1$.
(1) Show that $Z_n$ converges a.s. to a random variable $Z_\infty$.
(2) Suppose that $P(Z_\infty = k) > 0$ for some $k > 0$. Show that there exists $N > 0$ such that $P(Z_n = k \text{ for all } n \ge N) > 0$.
(3) Show that $P(Z_n = k \text{ for all } n \ge N) = 0$ for all $k > 0$.
(4) Deduce that $Z_\infty = 0$.
40.5. Eventually we assume that $\mu > 1$, and we will show that $P(Z_n > 0 \text{ for all } n \ge 0) > 0$. To this aim, we set $p_k = P(\xi_i^n = k)$, and set $\varphi(s) = \sum_{k \ge 0} p_k s^k$, $s \in [0,1]$. Namely, $\varphi$ is the generating function of $\xi_i^n$.
(1) Show that $\varphi$ is increasing, convex, and that $\lim_{s \to 1} \varphi'(s) = \mu$.
(2) Let $\theta_m = P(Z_m = 0)$. Show that $\theta_m = \varphi(\theta_{m-1})$.
(3) Invoking the fact that $\varphi'(1) > 1$, show that there exists at least one root for the equation $\varphi(x) = x$ in $[0, 1)$. Let $\rho$ be the smallest of those roots.
(4) Show that $\varphi$ is strictly convex, and deduce that $\rho$ is the unique root of $\varphi(x) = x$ in $[0, 1)$.
(5) Show that the extinction probability is such that $P(Z_n = 0 \text{ for some } n \ge 0) = \rho < 1$.
40.6. Galton and Watson were interested in family name survivals. Suppose that each family has exactly three children, and that the gender distribution is uniform. In 19th century England, only males could keep their family names. Compute the survival probability in this context.

Problem 41. Let $\{Y_n;\, n \ge 1\}$ be a sequence of independent random variables, with common Gaussian law $N(0, \sigma^2)$, where $\sigma > 0$. We set $\mathcal F_n = \sigma(Y_1, \ldots, Y_n)$ and $X_n = \sum_{i=1}^n Y_i$. Recall that:
$$E[\exp(u Y_1)] = \exp\Big( \frac{u^2 \sigma^2}{2} \Big).$$
We also set, for all $u \in \mathbb{R}$,
$$Z_n^u = \exp\Big( u X_n - \frac{1}{2}\, n u^2 \sigma^2 \Big).$$
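
A numerical companion to Questions 40.5 and 40.6 (not part of the original list). For the family-name model the offspring law is the number of sons among three children, i.e. Binomial(3, 1/2), with generating function $\varphi(s) = ((1+s)/2)^3$; iterating $\theta_m = \varphi(\theta_{m-1})$ as in 40.5 gives the extinction probability $\rho$.

```python
# Extinction probability for the family-name model of Question 40.6.
# Offspring = number of sons among 3 children: Binomial(3, 1/2),
# generating function phi(s) = ((1 + s) / 2) ** 3.
def phi(s):
    return ((1.0 + s) / 2.0) ** 3

theta = 0.0
for _ in range(200):          # theta_m = phi(theta_{m-1}) increases to rho
    theta = phi(theta)

rho = theta
print(rho, 5 ** 0.5 - 2)      # iteration vs. the exact root sqrt(5) - 2 of phi(x) = x
print("survival probability:", 1 - rho)
```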

41.1. Show that $\{Z_n^u;\, n \ge 1\}$ is an $\mathcal F_n$-martingale for all $u \in \mathbb{R}$.
41.2. We wish to study the almost sure convergence of $Z_n^u$ for $u \in \mathbb{R}$.
(1) Show that, for all $u \in \mathbb{R}$, $Z_n^u$ converges almost surely.
(2) Show that $K_n \equiv \frac{1}{n}\big( u X_n - \frac{1}{2}\, n u^2 \sigma^2 \big)$ converges almost surely, and determine its limit.
(3) Find the almost sure limit of $Z_n^u$ for $u \in \mathbb{R}$.
41.3. We now study the $L^1$-convergence of $Z_n^u$, for $u \in \mathbb{R}$.
(1) Find $\lim_n E[Z_n^u]$.
(2) Is the martingale $Z_n^u$ converging in $L^1$?

Problem 42. At time 1, an urn contains 1 green ball and 1 blue ball. A ball is drawn, and replaced by 2 balls of the same color as the one which has been drawn. This gives a new composition at time 2. This procedure is then repeated successively. We write $Y_n$ for the number of green balls at time $n$, and $X_n = \frac{Y_n}{n+1}$ for the proportion of green balls at time $n$. We also set $\mathcal F_n = \sigma(Y_1, \ldots, Y_n)$.
42.1. Show that $E[Y_{n+1} \mid \mathcal F_n] = (Y_n + 1) X_n + Y_n (1 - X_n)$.
42.2. Show that $\{X_n;\, n \ge 1\}$ is an $\mathcal F_n$-martingale, which converges almost surely to a random variable $U$.
42.3. By means of the dominated convergence theorem, show that for all $k \ge 1$ we have $\lim_n E[X_n^k] = E[U^k]$.
42.4. Fix $k \ge 1$. We set, for $n \ge 1$,
$$Z_n = \frac{Y_n (Y_n + 1) \cdots (Y_n + k - 1)}{(n+1)(n+2) \cdots (n+k)}.$$
(1) Consider the random variables $1_{\{Y_{n+1} = Y_n\}}$ and $1_{\{Y_{n+1} = Y_n + 1\}}$. Relying on these quantities, show that $\{Z_n;\, n \ge 1\}$ is an $\mathcal F_n$-martingale.
(2) Express the almost sure limit of $Z_n$ as a function of the random variable $U$.
(3) Compute the value of $E[U^k]$.
(4) Show that these moments are those of the law $\mathcal U([0,1])$.

Problem 43. Let $(X_n)_{n \in \mathbb{N}}$ be a martingale with respect to a filtration $\mathcal F_n$. We assume that there exists a constant $M > 0$ such that, for all $n \ge 1$,
$$E\big[\, |X_n - X_{n-1}| \,\big|\, \mathcal F_{n-1} \big] \le M \quad \text{a.s.}$$
43.1. Show that if $(V_n)_{n \ge 1}$ is a predictable (i.e. $V_n$ is $\mathcal F_{n-1}$-measurable) process taking positive values, then we have
$$\sum_{n=1}^{\infty} V_n\, E\big[\, |X_n - X_{n-1}| \,\big|\, \mathcal F_{n-1} \big] \le M \sum_{n=1}^{\infty} V_n \quad \text{a.s.}$$
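
A simulation sketch for Problem 42 (not part of the original list): the proportion of green balls is simulated over many runs, and the empirical quantiles of its (approximate) limit are compared with those of the uniform law on $[0,1]$.

```python
# Polya urn of Problem 42: the limiting proportion of green balls should be U(0,1).
import numpy as np

rng = np.random.default_rng(3)
n_runs, n_steps = 2_000, 1_000
limits = np.empty(n_runs)
for r in range(n_runs):
    green, total = 1, 2                    # one green and one blue ball at time 1
    for _ in range(n_steps):
        if rng.random() < green / total:   # draw a ball, add one of the same colour
            green += 1
        total += 1
    limits[r] = green / total
print(np.quantile(limits, [0.1, 0.25, 0.5, 0.75, 0.9]))   # close to U(0,1) quantiles
```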

43.2. Let $\nu$ be an integrable stopping time. Show that $X_\nu$ is integrable, and that $X_{\nu \wedge p}$ converges to $X_\nu$ in $L^1$. Deduce that $E(X_\nu) = E(X_0)$. Hint: write
$$X_\nu - X_{\nu \wedge p} = \sum_{n=1}^{\infty} 1_{\{\nu \wedge p < n \le \nu\}}\, (X_n - X_{n-1}).$$
43.3. Show that if $\nu_1 \le \nu_2$ are two stopping times with $\nu_2$ integrable, then $E[X_{\nu_2}] = E[X_{\nu_1}]$.

Problem 44. Let $(Y_n)_{n \ge 1}$ be a sequence of independent random variables with a common law given by $P(Y_n = 1) = p = 1 - P(Y_n = -1) = 1 - q$. We define $(S_n)_{n \in \mathbb{N}}$ by $S_0 = 0$ and $S_n = \sum_{k=1}^n Y_k$.
44.1. We assume that $p = q = \frac{1}{2}$. We set $T_a = \inf\{n \ge 0,\, S_n = a\}$ ($a \in \mathbb{Z}^*$). Show that $E(T_a) = \infty$.
44.2. Let $T = T_{a,b} = \inf\{n \ge 0,\, S_n = -a \text{ or } S_n = b\}$ ($a, b \in \mathbb{N}^*$). Using the value of $E(S_T)$, compute the probability of the event $(S_T = -a)$.
44.3. Show that $Z_n = S_n^2 - n$ is a martingale, and from the value of $E(Z_T)$ compute $E(T)$.
44.4. We assume that $p > q$ and we set $\mu = E(Y_k)$. Show that
$$X_n = S_n - n\mu \qquad \text{and} \qquad U_n = \Big(\frac{q}{p}\Big)^{S_n}$$
are martingales. Deduce the value of $P(S_T = -a)$ and $E(T)$.

4. Discrete models in finance

Problem 45. This problem is concerned with the Cox, Ross and Rubinstein model: we consider a unique risky asset whose price at time $n$ is called $R_n$, as well as a non risky asset with price $S_n = (1+r)^n$. We assume the following about $R_n$: between time $n$ and $n+1$ the relative variation of the price is either $a$ or $b$, with $-1 < a < b$. Otherwise stated:
$$R_{n+1} = (1+a) R_n \quad \text{or} \quad R_{n+1} = (1+b) R_n, \qquad n = 0, \ldots, N-1.$$
The natural space for all possible results is thus $\Omega = \{1+a, 1+b\}^N$, and we also consider $\mathcal F_0 = \{\emptyset, \Omega\}$, $\mathcal F = \mathcal P(\Omega)$, and $\mathcal F_n = \sigma(R_1, \ldots, R_n)$. The set $\Omega$ is equipped with a probability $P$ such that all singletons of $\Omega$ have a non zero probability. Set $T_n = \frac{R_n}{R_{n-1}}$, and note that $\mathcal F_n = \sigma(T_1, \ldots, T_n)$.
45.1. Show that the discounted price $\tilde R_n = (1+r)^{-n} R_n$ is a $P$-martingale if and only if $E[T_{n+1} \mid \mathcal F_n] = 1 + r$.
45.2. Deduce that, in order to have a viable market, the rate $r$ should satisfy the condition $r \in (a, b)$.
45.3. Give an example of possible arbitrage whenever $r \notin (a, b)$.

45.4. We assume in the remainder of the problem that $r \in (a,b)$, and we set $p^* = \frac{b-r}{b-a}$. Show that $\tilde R_n$ is a martingale under a probability $P^*$ if and only if, under $P^*$, the random variables $T_j$ are independent, equally distributed, with a common law given by
$$P^*(T_j = 1+a) = p^* = 1 - P^*(T_j = 1+b).$$
Then prove that the market is complete.
45.5. Let $C_n$ (resp. $P_n$) be the value at time $n$ of a European call (resp. put) with strike $K$ and maturity $N$.
(1) Show that $C_n - P_n = R_n - K(1+r)^{-(N-n)}$. This general relation is known as call-put parity.
(2) Show that $C_n$ can be written as $C_n = c(n, R_n)$, where $c$ is a function which will be expressed thanks to the constants $K, a, b, p^*$.
45.6. Show that a perfect hedging strategy for a call is defined by a quantity $H_n = \Delta(n, R_{n-1})$ of risky asset which should be held at time $n$, where $\Delta$ is a function which can be expressed in terms of $c$.

Problem 46. In this problem we consider a multinomial Cox, Ross and Rubinstein model: the unique risky asset has a price $R_n$ at time $n$, and the non risky asset price is given by $S_n = (1+r)^n$. We assume the following for the risky asset price: between time $n$ and $n+1$ the relative variation of the price belongs to the set $\{a_1, a_2, \ldots, a_k\}$, with $k \ge 3$ and $-1 < a_1 < a_2 < \cdots < a_k$. Otherwise stated:
$$R_{n+1} = (1+a_j) R_n \quad \text{with } j \in \{1, 2, \ldots, k\}, \qquad n = 0, \ldots, N-1.$$
The natural space for all possible results is thus $\Omega = \{1+a_1, \ldots, 1+a_k\}^N$, and we also consider $\mathcal F_0 = \{\emptyset, \Omega\}$, $\mathcal F = \mathcal P(\Omega)$, and $\mathcal F_n = \sigma(R_1, \ldots, R_n)$. The set $\Omega$ is equipped with a probability $P$ such that all singletons of $\Omega$ have a non zero probability. Set $T_n = \frac{R_n}{R_{n-1}}$, and note that $\mathcal F_n = \sigma(T_1, \ldots, T_n)$. We set $p_{n,j} = P(T_n = 1+a_j)$, $j \in \{1, 2, \ldots, k\}$, $n = 0, \ldots, N$.
46.1. Show that the discounted price $\tilde R_n = (1+r)^{-n} R_n$ is a $P$-martingale if and only if $E[T_{n+1} \mid \mathcal F_n] = 1 + r$.
46.2. Deduce that, in order to have a viable market, the rate $r$ should satisfy the condition $r \in [a_1, a_k]$.
46.3. Give an example of possible arbitrage whenever $r < a_1$.
46.4. We assume in the remainder of the problem that $r = \frac{1}{k} \sum_{j=1}^k a_j$. Let $\mathcal Q$ be the set of probability measures $Q$ on $\Omega$ satisfying:
(i) under $Q$, the family $\{T_n;\, n \le N-1\}$ is a family of i.i.d. random variables;
(ii) $\tilde R_n$ is a $Q$-martingale.
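
A pricing sketch for Problem 45 (not part of the original list; the numerical values of $a$, $b$, $r$, $K$, $N$, $R_0$ are made up for illustration). The call value $c(0, R_0)$ is computed as the discounted expectation of the payoff under the weights $p^* = (b-r)/(b-a)$ on the move $1+a$, and the put value is recovered from the call–put parity of Question 45.5.

```python
# Cox-Ross-Rubinstein pricing sketch for Problem 45 (made-up numerical values).
from math import comb

a, b, r = -0.1, 0.2, 0.05
R0, K, N = 100.0, 100.0, 10
p_star = (b - r) / (b - a)       # weight of the move 1+a under the martingale measure

def call_price(n, R_n):
    """c(n, R_n): discounted expectation of (R_N - K)^+ under the p_star weights."""
    m = N - n
    payoff = sum(comb(m, j) * p_star**j * (1 - p_star)**(m - j)
                 * max(R_n * (1 + a)**j * (1 + b)**(m - j) - K, 0.0)
                 for j in range(m + 1))
    return payoff / (1 + r)**m

C0 = call_price(0, R0)
P0 = C0 - R0 + K / (1 + r)**N    # put value from the call-put parity of 45.5
print(C0, P0)
```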

(1) Let $Q^{(1)}$ be the probability on $\Omega$ defined by: the family of random variables $\{T_n;\, n \le N-1\}$ is a family of i.i.d. random variables of common law given by
$$Q^{(1)}(T_n = 1 + a_j) = \frac{1}{k}, \qquad j \in \{1, 2, \ldots, k\}.$$
Show that $Q^{(1)} \in \mathcal Q$.
(2) Show that $\mathcal Q$ is an infinite set.
(3) Show that the market is incomplete.
46.5. We now work under the probability $Q^{(1)}$. Let $C_n$ be the value of a European call with strike $K$ and maturity $N$. Show that $C_n$ can be written under the form $C_n = c(n, R_n)$, where $c$ is a function which will be expressed thanks to $K, a_1, \ldots, a_k$. Note that a multinomial law can be used here. This law can be defined as follows: we consider an urn with a proportion $p_j$ of balls of type $j$, for $j \in \{1, \ldots, k\}$, with $\sum_{j=1}^k p_j = 1$. We draw $n$ times from this urn and we call $X_j$ the number of balls of type $j$ obtained in this way. Then for any tuple of integers $(n_1, \ldots, n_k)$ such that $\sum_{j=1}^k n_j = n$, we have
$$P(X_1 = n_1, \ldots, X_k = n_k) = \frac{n!}{\prod_{j=1}^k n_j!}\, \prod_{j=1}^k p_j^{n_j}.$$
The law of the vector $(X_1, \ldots, X_k)$ is called the multinomial law with parameters $(n, k, p_1, \ldots, p_k)$.

5. Markov chains

Problem 47. Let $(X_n;\, n \ge 0)$ be a Markov chain on $(E, \mathcal E)$, with transition probability $\pi$. We set $\mathcal F_n = \sigma(X_0, X_1, \ldots, X_n)$. A function $f : (E, \mathcal E) \to (\mathbb{R}, \mathcal R)$ is said to be invariant or harmonic for $\pi$ if
$$\int_E |f(y)|\, \pi(x, dy) < \infty \qquad \text{and} \qquad \int_E f(y)\, \pi(x, dy) = f(x), \quad \text{for all } x \in E.$$
47.1. Let $f$ be a harmonic function such that $\int_E |f(y)|\, \nu(dy) < \infty$, where $\nu$ is the law of $X_0$. Show that $(f(X_n);\, n \ge 0)$ is an $\mathcal F_n$-martingale.
47.2. Suppose that $E$ is a finite set which contains two absorbing states $a$ and $b$: $\pi(a,a) = \pi(b,b) = 1$. We set $\sigma(x) = \inf\{n \ge 0;\, X_n = x\}$ and $T = \sigma(a) \wedge \sigma(b)$.
(1) Show: $P(\{X_n = c;\, n \ge 0\} \mid X_0 = c) = 1$ whenever $c = a$ or $b$.
(2) Suppose that $\pi(x, \{a,b\}) > 0$ for all $x \in E$. We set
$$\epsilon = \inf\{\pi(x, \{a,b\});\, x \in E,\, x \ne a,\, x \ne b\}, \qquad A_n = \{X_i \ne a,\, X_i \ne b, \text{ for all } i \le n\}.$$
Show that $P(A_n \mid X_0 = x) \le (1-\epsilon)^n$. Deduce that $P_x(T < \infty) = 1$ for all $x$ in $E$.
47.3. We suppose that $P_x(T < \infty) = 1$ for all $x$ in $E$. Let $f$ be a $\pi$-harmonic function, bounded and such that $f(a) \ne f(b)$. Compute the absorption probabilities in $a$ and $b$:
$$\rho_{x,a} = P(X_T = a \mid X_0 = x), \qquad \rho_{x,b} = P(X_T = b \mid X_0 = x).$$

47.4. Study the particular case where $E = \{1, 2, 3, \ldots, n\}$, $\pi(1,1) = 1$, $\pi(n,n) = 1$, $\pi(i, i+1) = \pi(i, i-1) = 1/2$ for $2 \le i \le n-1$, with $n \ge 3$.

Problem 48. Draw the graph and classify the states for the Markov chains whose transition matrices are given below (the matrices, whose entries involve the values 1/4, 1/2 and 1, are not reproduced in this transcription).

Problem 49. We consider a game according to the following rules: assume that the player possesses an initial amount equal to $z$ dollars, with $z \in \mathbb{N}^*$. At a given time $n$, he draws randomly (independently of the past) an integer $l \in \{1, 2, 3\}$. Then: if $l = 1$, the player does not win or lose anything; if $l = 2$, the player's fortune is doubled; if $l = 3$, the entire fortune is lost. We denote by $X_n$ the player's fortune at time $n$.
49.1. Show that for all $n \in \mathbb{N}$, $X_{n+1}$ can be written as $X_{n+1} = 2 X_n 1_{(\xi_{n+1} = 2)} + X_n 1_{(\xi_{n+1} = 1)}$, where the random variables $\{\xi_k;\, k \ge 1\}$ are independent, with uniform law on $\{1, 2, 3\}$.
49.2. Show that $\{X_n;\, n \ge 0\}$ is a Markov chain.
49.3. Compute the transition matrix of $X_n$, denoted by $\pi = (\pi_{i,j})$.
49.4. We designate by $\pi_{i,j}^{(n)}$ the elements of the matrix $\pi^n$. By means of a recursion procedure, show that for all $i \in \mathbb{N}^*$ and $n \ge 1$ we have:
$$\pi^{(n)}(i, 2^j i) = \binom{n}{j}\, 3^{-n} \quad \text{for } j \in \{0, \ldots, n\}, \qquad \text{and} \qquad \pi^{(n)}(i, 0) = 1 - \Big(\frac{2}{3}\Big)^n.$$
49.5. Determine the nature of each state.
49.6. Let $\sigma$ be the first non zero instant for which the player's fortune is $0$. Compute, for every initial state $z$ and for all $n \in \mathbb{N}^*$, the quantity $P_z(\sigma = n)$. Show that $\sigma$ follows a geometric law with parameter $\frac{1}{3}$.
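
A numerical companion to Problem 51 on the next page (a sketch, not part of the original list; it assumes the states 1, 2, 3 are labelled counterclockwise, so that the trigonometric move is $1 \to 2 \to 3 \to 1$). The transition matrix is doubly stochastic, so its stationary law is uniform and its powers converge to it.

```python
# Triangle chain of Problem 51: counterclockwise (1 -> 2 -> 3 -> 1) with prob alpha.
import numpy as np

alpha = 0.7
P = np.array([[0.0, alpha, 1 - alpha],
              [1 - alpha, 0.0, alpha],
              [alpha, 1 - alpha, 0.0]])

print(np.linalg.matrix_power(P, 50))   # every row is close to (1/3, 1/3, 1/3)
nu = np.full(3, 1 / 3)                 # stationary law (the matrix is doubly stochastic)
# Question 51.4: lim P_mu(X_n = 1, X_{n+1} = 2) and lim P_mu(X_n = 2, X_{n+1} = 1).
print(nu[0] * P[0, 1], nu[1] * P[1, 0])
```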

Problem 50. In this problem, $X_n$ designates the number of particles at instant $n$ in a given volume $V$. We assume that in the interval $[n, n+1)$, each of the $X_n$ particles has a probability $p$ (with $0 < p < 1$) of escaping from the volume $V$. We also assume that a random number of particles, following a Poisson law with parameter $\lambda$, gets into $V$. We suppose independence of all the random phenomena considered previously.
50.1. Show that $(X_n)$ is a Markov chain with transition probability given by:
$$Q(x, y) = \sum_{k = (x-y)^+}^{x} \binom{x}{k}\, p^k (1-p)^{x-k}\, \frac{\lambda^{y-x+k}}{(y-x+k)!}\, e^{-\lambda}.$$
50.2. Compute $E[e^{itX_1} \mid X_0 = x]$ in a direct way.
50.3. Deduce the expression of the characteristic function of $X_1$ when $X_0$ follows a Poisson law $\mathcal P(\theta)$. Then show that $\mathcal P(\lambda/p)$ is an invariant measure.
50.4. Show that $(X_n)$ is an irreducible and positive recurrent Markov chain.
50.5. What is the limit of $X_n$, as $n$ goes to $\infty$?

Problem 51. Consider a Markov chain on the vertices of a triangle (we denote by $E = \{1, 2, 3\}$ this state space), defined as follows: at each instant one moves in the trigonometric sense with probability $\alpha$ and in the other direction with probability $1 - \alpha$, where $0 < \alpha < 1$.
51.1. Write the transition matrix $p$ of this chain.
51.2. Compute the stationary distribution $\nu$.
51.3. Show that for all $x, y \in E$ we have $\lim_n p^n(x,y) = \nu(y)$. We shall use the following criterion: let $X$ be an irreducible recurrent Markov chain with transition $\hat p$ on a finite state space $E$. If there exists an $m \ge 1$ satisfying $\hat p^m(x,y) > 0$ for all $x, y \in E$, then $\lim_n \hat p^n(x,y) = \mu(y)$, where $\mu$ is the invariant probability of $X$.
51.4. Compute, for any initial distribution $\mu$,
$$\lim_n P_\mu(X_n = 1, X_{n+1} = 2) \qquad \text{and} \qquad \lim_n P_\mu(X_n = 2, X_{n+1} = 1).$$
51.5. For which values of $\alpha$ do we get a reversible stationary law?

6. Ergodic theorems

Problem 52. Let $(\Omega, \mathcal F, \mathbb P)$ be a probability space, $\varphi$ a map preserving $\mathbb P$, and $X$ a random variable. We consider the class $\mathcal I$ of invariant events.
52.1. Show that $\mathcal I$ is a $\sigma$-field.
52.2. Show that $X$ is $\mathcal I$-measurable if and only if $X \circ \varphi = X$ almost surely.

Problem 53. Consider a centered stationary sequence $X$ with autocovariance function $c(m) = E[X_k X_{k+m}]$.

53.1. Show that
$$\mathrm{Var}\Big( \frac{1}{n} \sum_{i=1}^n X_i \Big) = \frac{2}{n^2} \sum_{j=1}^{n} \sum_{i=0}^{j-1} c(i) \; - \; \frac{c(0)}{n}.$$
53.2. We now assume that $\lim_{j\to\infty} \frac{1}{j} \sum_{i=0}^{j-1} c(i) = \sigma^2$. Prove that
$$\lim_{n\to\infty} \mathrm{Var}\Big( \frac{1}{n} \sum_{i=1}^n X_i \Big) = \sigma^2.$$

Problem 54. We consider the rotation of the circle context: take $(\Omega, \mathcal F, \mathbb P) = ([0,1], \mathcal B, \lambda)$ where $\mathcal B$ is the Borel $\sigma$-algebra and $\lambda$ stands for the Lebesgue measure. Let $\theta \in (0,1)$. Then we define $X = \{X_n;\, n \ge 0\}$ with
$$X_n(\omega) = (\omega + n\theta) \bmod 1 = \omega + n\theta - [\omega + n\theta].$$
The natural shift related to this sequence is denoted by $\varphi$.
54.1. If $\theta = \frac{m}{n}$ for two integers such that $m < n$, show that $\varphi$ is not ergodic by considering the set
$$A = \bigcup_{k=0}^{n-1} \Big( B + \frac{k}{n} \Big), \qquad \text{where } B \subset \Big[0, \frac{1}{n}\Big).$$
54.2. We now consider $\theta \in \mathbb{Q}^c$ and we wish to prove that $\varphi$ is ergodic.
(1) Let $f \in L^2([0,1))$ with Fourier series decomposition $f = \sum_{k \in \mathbb{Z}} c_k e^{2\pi i k x}$. Show that $f \circ \varphi = f$ if and only if $f$ is a.e. constant.
(2) Deduce that $\varphi$ is ergodic.

Problem 55. Let $n \ge 1$ and let $X$ be a stationary sequence of random variables. We consider a sequence $Y$ with $(Y_1, \ldots, Y_n) = (X_1, \ldots, X_n)$ and such that the blocks $(Y_{nk+1}, \ldots, Y_{n(k+1)})$, $k \ge 0$, are i.i.d. Let $\nu$ be a random variable independent of $X, Y$ and uniformly distributed over $\{1, \ldots, n\}$. We consider the sequence $Z$ defined by $Z_m = Y_{\nu+m}$ for $m \ge 0$. Show that $Z$ is stationary and ergodic.

Problem 56. Let $X$ be a sequence of independent identically distributed random variables with zero mean and unit variance. For $n \ge 1$ let
$$Y_n = \sum_{i=0}^{\infty} \alpha_i X_{n+i},$$
where the $\alpha_i$ are constants satisfying $\sum_{i=0}^\infty \alpha_i^2 < \infty$.
56.1. Use the martingale convergence theorem to show that the above summation converges almost surely and in mean square.
56.2. Prove that $\lim_n \bar Y_n = 0$, almost surely and in $L^1(\Omega)$, where $\bar Y_n = \frac{1}{n} \sum_{m=1}^n Y_m$.
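
A numerical illustration of Problem 54 (not part of the original list). For an irrational angle the Birkhoff averages of $f(x) = \cos(6\pi x)$ converge to $\int_0^1 f\, d\lambda = 0$, while for the rational angle $\theta = 1/3$ the orbit is periodic and the average stays at $f(\omega)$.

```python
# Birkhoff averages for the circle rotation of Problem 54, with f(x) = cos(6*pi*x).
import numpy as np

def birkhoff_average(theta, omega=0.1234, n=200_000):
    orbit = (omega + np.arange(n) * theta) % 1.0
    return np.cos(6 * np.pi * orbit).mean()

print(birkhoff_average(np.sqrt(2) - 1))                      # irrational: close to 0
print(birkhoff_average(1 / 3), np.cos(6 * np.pi * 0.1234))   # rational 1/3: stays at f(omega)
```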

7. Brownian motion

Problem 57. Let $B$ be a standard Brownian motion.
57.1. Compute, for all couples $(s,t)$, the quantities $E[B_t \mid \mathcal F_s]$ and $E[B_s B_t^2]$ (we do not assume $s \le t$ here).
57.2. Compute $E[B_t^2 B_s^2]$.
57.3. What is the law of $B_t + B_s$?
57.4. Compute $E[1_{(B_t \le 0)}]$ and $E[B_t^2\, 1_{(B_t \le 0)}]$.
57.5. Compute $E[\int_0^t e^{B_s}\, ds]$ and $E[e^{\alpha B_t} \int_0^t e^{\gamma B_s}\, ds]$ for $\alpha, \gamma > 0$.

Problem 58. For any continuous bounded function $f : \mathbb{R} \to \mathbb{R}$ and all $u \le t$, show that $E[f(B_t)] = E[f(\sqrt u\, G + B_{t-u})]$, with a random variable $G \sim N(0,1)$ independent of $B_{t-u}$.

Problem 59. Let $f : \mathbb{R} \to \mathbb{R}$ be a $C^2$ function whose second derivative has at most exponential growth. Show that
$$E[f(x + B_t)] = f(x) + \frac{1}{2} \int_0^t E[f''(x + B_s)]\, ds.$$
Hint: one can use the following Gaussian integration by parts formula: let $N \sim N(0,1)$ and $\psi \in C^1$ with exponential growth. Then $E[N \psi(N)] = E[\psi'(N)]$. The Gaussian integration by parts should be proved first.

Problem 60. Consider a standard Brownian motion $B$. For all $\lambda, \mu \in \mathbb{R}$, compute
$$E\Big[ \Big( \mu B_1 + \lambda \int_0^1 B_u\, du \Big)^2 \Big].$$

Problem 61. Show that the integral $\int_0^1 \frac{|B_s|}{s^{1+\alpha}}\, ds$ is finite almost surely if $\alpha < \frac{1}{2}$.

8. Gaussian processes

Problem 62. Let $(X_n,\, n \ge 1)$ be a sequence of centered Gaussian random variables, converging in law to a random variable $X$. Show that $X$ is also a centered Gaussian random variable. Deduce that the process $Y = \{Y_t,\, t \ge 0\}$ given by $Y_t = \int_0^t B_u\, du$ is Gaussian. Compute its expected value and its covariance function.

Problem 63. We define the Brownian bridge by $Z_t = B_t - t B_1$ for $t \in [0,1]$.
63.1. Show that $Z$ is a Gaussian process independent of $B_1$. Give its law, that is its mean and its covariance function.
63.2. Show that the process $\hat Z$, with $\hat Z_t = Z_{1-t}$, has the same law as $Z$.
63.3. Show that the process $Y$, with $Y_t = (1-t)\, B_{\frac{t}{1-t}}$, $0 < t < 1$, has the same law as $Z$.

9. Martingales

Problem 64. Among the following processes, which ones enjoy the martingale property? Hint: use the Fubini type relation $E[\int_0^t B_u\, du \mid \mathcal F_s] = \int_0^t E[B_u \mid \mathcal F_s]\, du$.
(1) $M_t = B_t^3 - 3 \int_0^t B_s\, ds$?
(2) $Z_t = B_t^3 - 3 t B_t$?
(3) $X_t = t B_t - \int_0^t B_s\, ds$?
(4) $Y_t = t^2 B_t - 2 \int_0^t B_s\, ds$?

Problem 65. Let $\mathcal G_t = \mathcal F_t \vee \sigma(B_1)$. Check that $B$ is not a $\mathcal G_t$-martingale. Hint: get a contradiction, showing that if $B$ were a $\mathcal G_t$-martingale, then $E[B_t B_1] = E[B_s B_1]$ for $s, t \le 1$.

Problem 66. Let $Z = \{Z_t,\, t \ge 0\}$ be the process defined by
$$Z_t = B_t - \int_0^t \frac{B_s}{s}\, ds.$$
66.1. Show that $Z$ is a Gaussian process.
66.2. Compute the expected value and the covariance function of $Z$. Deduce that $Z$ is a Brownian motion.
66.3. Show that $Z$ is not an $\mathcal F_t^B$-martingale, where $(\mathcal F_t^B)$ is the natural filtration of $B$. Hint: compute $E[Z_t - Z_s \mid \mathcal F_s^B]$ for $s < t$.
66.4. Deduce that $\mathcal F^Z \subset \mathcal F^B$, but $\mathcal F^Z \ne \mathcal F^B$.

Problem 67. Let $\phi$ be a bounded adapted process on $(\Omega, \mathcal F, (\mathcal F_t)_{t \ge 0}, P)$ and $M$ an $(\mathcal F_t)$-martingale. We set
$$Y_t = M_t - \int_0^t \phi_s\, ds, \qquad t \in [0, T].$$
Prove that
$$Y_t = E\Big[ \int_t^T \phi_s\, ds + Y_T \,\Big|\, \mathcal F_t \Big], \qquad t \in [0, T]. \tag{2}$$
In the other direction, if $Y$ satisfies (2) with a bounded adapted process $\phi$, show that $M$ defined by
$$M_t = Y_t + \int_0^t \phi_s\, ds, \qquad t \in [0, T],$$
is a martingale.

Problem 68. Let $(M_t)_{t \ge 0}$ be a square integrable $\mathcal F_t$-martingale (that is, such that $M_t \in L^2$ for all $t \ge 0$).
68.1. Show that $E[(M_t - M_s)^2 \mid \mathcal F_s] = E[M_t^2 \mid \mathcal F_s] - M_s^2$ for $t > s$.
68.2. Deduce that $E[(M_t - M_s)^2] = E[M_t^2] - E[M_s^2]$ for $t > s$.
68.3. Consider the function $\Phi$ defined by $\Phi(t) = E[M_t^2]$. Check that $\Phi$ is increasing.

Problem 69. Show that if $M$ is an $\mathcal F_t$-martingale, it is also a martingale with respect to its natural filtration $\mathcal G_t = \sigma(M_s,\, s \le t)$.

Problem 70. Let $\tau$ be a positive random variable defined on $(\Omega, \mathcal F, (\mathcal F_t)_{t \ge 0}, P)$. Show that $Z_t = P(\tau \le t \mid \mathcal F_t)$ is a sub-martingale.

Problem 71. Let $X$ be a centered process with independent increments, such that for all $n \in \mathbb{N}$ and any $0 < t_1 < t_2 < \cdots < t_n$, the random variables $X_{t_1}, X_{t_2} - X_{t_1}, \ldots, X_{t_n} - X_{t_{n-1}}$ are independent. In addition, we assume that $X$ is integrable, and that $(\mathcal F_t)$ is the natural filtration of $X$. Show that $X$ is a martingale. If we further suppose that $X$ is square integrable, show that $X_t^2 - E[X_t^2]$ is also an $(\mathcal F_t)$-martingale.

10. Hitting times

In this section, $a$ designates a real number and $T_a$ is the random time defined by $T_a = \inf\{t \ge 0 : B_t = a\}$.

Problem 72. Show that $T_a$ is a stopping time. Compute $E[e^{-\lambda T_a}]$ for all $\lambda > 0$. Show that $P(T_a < \infty) = 1$ and that $E[T_a] = \infty$.

Problem 73. Prove (avoiding computations) that for $b > a > 0$, the random variable $T_b - T_a$ is independent of $T_a$. Deduce that the process $(T_a)_{a \ge 0}$ has independent and stationary increments.

Problem 74. Let $a < 0 < b$ and $T = T_a \wedge T_b$. Compute $P(T_a < T_b)$ and $E[T]$. Hint: apply the optional sampling theorem to $B_t$ and $B_t^2 - t$.

Problem 75. Find an expression for $Z_t = P(T_a > 1 \mid \mathcal F_t)$ for $t \le 1$ and $a > 0$. Recall that $\sup_{u \le t} B_u \overset{(d)}{=} |B_t|$.

Problem 76. Let $I = -\inf_{s \le T_1} B_s$. Show that $I$ has a density given by $f_I(x) = \frac{1}{(1+x)^2}\, 1_{[0,+\infty)}(x)$. Hint: use $\{I < x\} = \{T_1 < T_{-x}\}$.

Problem 77. Let $T_1 = \inf\{t \ge 0 : B_t = 1\}$. Use a Brownian scaling in order to show the following identities in law:
(1) $T_1 \overset{(d)}{=} \frac{1}{S_1^2}$, with $S_1 = \sup(B_u,\, u \le 1)$;
(2) $T_a \overset{(d)}{=} a^2 T_1$.
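
A simulation sketch for Problem 74 (not part of the original list), replacing the Brownian motion by a symmetric random walk with space step $h$ and time step $h^2$. Optional sampling predicts $P(T_a < T_b) = \frac{b}{b-a}$ and $E[T] = |a|\, b$ for $a < 0 < b$.

```python
# Random-walk approximation of Problem 74: exit of (a, b) with a < 0 < b.
import numpy as np

rng = np.random.default_rng(4)
a, b, h, N = -1.0, 2.0, 0.1, 5_000
hits_a, total_time = 0, 0.0
for _ in range(N):
    x, t = 0.0, 0.0
    while a < x < b:
        x += h if rng.random() < 0.5 else -h   # space step h, time step h^2
        t += h * h
    hits_a += (x <= a)
    total_time += t
print(hits_a / N, b / (b - a))       # P(T_a < T_b), simulated vs. b/(b-a)
print(total_time / N, abs(a) * b)    # E[T], simulated vs. |a| b
```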

11. Wiener integral

Problem 78. In this problem we consider the process $X$ defined by $X_t = \int_0^t (\sin s)\, dB_s$.
78.1. Show that, for each $t \ge 0$, the random variable $X_t$ is well defined.
78.2. Show that $X = (X_t)_{t \ge 0}$ is a Gaussian process. Compute its expected value and its covariance function.
78.3. Compute $E[X_t \mid \mathcal F_s]$ for $s \le t$.
78.4. Show that $X_t = (\sin t) B_t - \int_0^t (\cos s)\, B_s\, ds$ for all $t \ge 0$.

Problem 79. Let $X$ be the process defined on $(0,1)$ by:
$$X_t = (1-t) \int_0^t \frac{dB_s}{1-s}.$$
79.1. Show that $X$ satisfies: $X_0 = 0$ and $dX_t = \frac{X_t}{t-1}\, dt + dB_t$, $t \in (0,1)$.
79.2. Show that $X$ is a Gaussian process. Compute its expected value and its covariance function.
79.3. Show that $\lim_{t \to 1} X_t = 0$ in $L^2(\Omega)$.

12. Itô's formula

Problem 80. Write the following processes as Itô processes, specifying their drift and their diffusion coefficient.
(1) $X_t = B_t^2$;
(2) $X_t = t + \exp(B_t)$;
(3) $X_t = B_t^3 - 3 t B_t$;
(4) $X_t = (B_t + t) \exp(-B_t - t/2)$;
(5) $X_t = \exp(t/2) \sin(B_t)$.

Problem 81. Let $X$ and $Y$ be defined by:
$$X_t = \exp\Big( \int_0^t a(s)\, ds \Big), \qquad Y_t = Y_0 + \int_0^t b(s) \exp\Big( -\int_0^s a(u)\, du \Big)\, dB_s,$$
where $a, b : \mathbb{R}_+ \to \mathbb{R}$ are bounded functions. We set $Z_t = X_t Y_t$. Show that $dZ_t = a(t) Z_t\, dt + b(t)\, dB_t$.

Problem 82. Let $Z$ be the process given by $Z_t = t X_t Y_t$, where $X$ and $Y$ are defined by:
$$dX_t = f(t)\, dt + \sigma(t)\, dB_t, \qquad dY_t = \eta(t)\, dB_t.$$
Find an expression for $dZ_t$.

Problem 83. Show that $Y = (Y_t)_{t \ge 0}$ defined by $Y_t = \sin(B_t) + \frac{1}{2} \int_0^t \sin(B_s)\, ds$ is a martingale. Find its expected value and its variance.

Problem 84. Let us assume that the following system admits a solution $(X, Y)$:
$$\begin{cases} X_t = x + \int_0^t Y_s\, dB_s, \\ Y_t = y - \int_0^t X_s\, dB_s, \end{cases} \qquad t \ge 0.$$
Show that $X_t^2 + Y_t^2 = (x^2 + y^2)\, e^t$ for all $t \ge 0$.

Problem 85. We define $Y$ and $Z$ in the following way for $t \ge 0$:
$$Y_t = \int_0^t e^s\, dB_s, \qquad Z_t = \int_0^t Y_s\, dB_s.$$
Give an expression for $E[Z_t]$, $E[Z_t^2]$ and $E[Z_t Z_s]$ for $s, t \ge 0$.

Problem 86. Let $\sigma$ be an adapted continuous process in $L^2(\Omega \times \mathbb{R}_+)$, and let
$$X_t = \int_0^t \sigma_s\, dB_s - \frac{1}{2} \int_0^t \sigma_s^2\, ds.$$
We set $Y_t = \exp(X_t)$ and $Z_t = Y_t^{-1}$.
86.1. Give an explicit expression for the dynamics of $Y$, that is $dY_t$.
86.2. Show that $Y$ is a local martingale on $[0, T]$ for all $T > 0$. If $\sigma \equiv 1$, show that $Y$ is a martingale on $[0, T]$ for all $T > 0$. Compute $E[Y_t]$ in this case.
86.3. Compute $dZ_t$.

Problem 87. Let $a, b, c, z$ be real valued constants, and let $Z$ be the process defined by:
$$Z_t = e^{(a - c^2/2)t + c B_t} \Big( z + b \int_0^t e^{-(a - c^2/2)s - c B_s}\, ds \Big), \qquad t \ge 0.$$
Give a simple expression for $dZ_t$.

Problem 88. Let $(X_t)_{t \ge 0}$ be a process satisfying $X_t = x + \int_0^t a_s\, ds + \int_0^t \sigma_s\, dB_s$ for $t \ge 0$. In the previous formula, $x$ is a real number, $a$ is a continuous process satisfying $\int_0^t |a_s|\, ds < \infty$ for all $t \ge 0$, and $\sigma$ is an adapted continuous process verifying $\int_0^t E[\sigma_s^2]\, ds < \infty$ for all $t \ge 0$. We wish to show that if $X \equiv 0$, then $x = 0$, $a \equiv 0$ and $\sigma \equiv 0$.
88.1. Apply Itô's formula to $Y_t = \exp(-X_t^2)$.
88.2. Prove the claim.

Problem 89. Let $X$ be an Itô process. A function $s$ is called a scale function for $X$ if $s(X)$ is a local martingale. Determine the scale functions of the following processes:
(1) $B_t + \nu t$;
(2) $X_t = \exp(B_t + \nu t)$;
(3) $X_t = x + \int_0^t b(X_s)\, ds + \int_0^t \sigma(X_s)\, dB_s$.

Problem 90. Let $f : \mathbb{R} \to \mathbb{R}$ be a $C_b^1$ function.
90.1. Construct a function $\psi : [0,1] \times \mathbb{R} \to \mathbb{R}$ (expressed as an expected value) such that, for $t \in [0,1]$, we have $E[f(B_1) \mid \mathcal F_t] = \psi(t, B_t)$.
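
An Euler–Maruyama sketch for Problem 84 (not part of the original list): the system $dX = Y\, dB$, $dY = -X\, dB$ is simulated on a fine grid and $X_t^2 + Y_t^2$ is compared with $(x^2+y^2)e^t$.

```python
# Euler-Maruyama check for Problem 84: dX = Y dB, dY = -X dB, X_0 = x0, Y_0 = y0.
import numpy as np

rng = np.random.default_rng(5)
x0, y0, T, n_steps, n_paths = 1.0, 0.5, 1.0, 2_000, 2_000
dt = T / n_steps
X = np.full(n_paths, x0)
Y = np.full(n_paths, y0)
for _ in range(n_steps):
    dB = rng.standard_normal(n_paths) * np.sqrt(dt)
    X, Y = X + Y * dB, Y - X * dB        # simultaneous update with the old X, Y
print((X**2 + Y**2).mean(), (x0**2 + y0**2) * np.exp(T))
```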

90.2. Write Itô's formula for $\psi$ and simplify as much as possible.
90.3. Show that, for all $t \in [0,1]$, we have:
$$E[f(B_1) \mid \mathcal F_t] = E[f(B_1)] + \int_0^t E[f'(B_1) \mid \mathcal F_s]\, dB_s.$$

Problem 91. Let $S$ be the solution of:
$$dS_t = r S_t\, dt + S_t\, \sigma(t, S_t)\, dB_t, \qquad t \in [0, T],$$
where $r$ is a constant and where $\sigma : \mathbb{R}_+ \times \mathbb{R} \to \mathbb{R}$ is a $C^{1,1}$ function with bounded derivatives.
91.1. Show that $E[\Phi(S_T) \mid \mathcal F_t]$ is a martingale (as a function of $t$) for any bounded measurable function $\Phi$. In the sequel, we admit that $E[\Phi(S_T) \mid \mathcal F_t] = E[\Phi(S_T) \mid S_t]$ for all $t \in [0, T]$ (Markov property for $S$).
91.2. Let $\varphi(t, x)$ be the function defined by $\varphi(t, S_t) = E[\Phi(S_T) \mid S_t]$ (the existence of $\varphi$ is admitted). Write $dZ_t$ with $Z_t = \varphi(t, S_t)$.
91.3. Invoking the fact that $\varphi(t, S_t)$ is a martingale, and admitting that $\varphi$ is $C^{1,2}$, show that for all $t > 0$ and all $x > 0$ we have:
$$\frac{\partial \varphi}{\partial t}(t, x) + r x\, \frac{\partial \varphi}{\partial x}(t, x) + \frac{1}{2}\, \sigma^2(t, x)\, x^2\, \frac{\partial^2 \varphi}{\partial x^2}(t, x) = 0.$$
What is the value of $\varphi(T, x)$?

Problem 92. Let $B$ be a $d$-dimensional Brownian motion. We consider an open bounded subset $G$ of $\mathbb{R}^d$, and $\tau = \inf\{t \ge 0;\, B_t \notin G\}$. We denote by $K$ the diameter of $G$, i.e. $K = \sup\{|x - y|;\, x, y \in G\}$.
92.1. Show that there exists $\epsilon = \epsilon_K \in (0,1)$ such that $P_x(\tau \le 1) \ge \epsilon$ for all $x \in G$.
92.2. Deduce that there exists $\rho = \rho_K \in (0,1)$ such that $P_x(\tau > k) \le \rho^k$ for all $k \ge 1$ and $x \in G$.
92.3. Deduce that $E_x[\tau^p] < \infty$ for all $p \ge 1$. In particular, show that $\tau < \infty$ $P_x$-almost surely for all $x \in G$.
92.4. Let $\varphi \in C^2(\bar G)$ be a harmonic function, i.e. such that $\Delta \varphi = 0$ on $G$. Prove that $E_x[\varphi(B_\tau)] = \varphi(x)$.
92.5. We now seek some harmonic functions $\varphi : \mathbb{R}^d \to \mathbb{R}$ having the form $\varphi(x) = f(|x|^2)$ with $f : \mathbb{R}_+ \to \mathbb{R}$.
(1) Prove that $f$ is a solution of the differential equation $f''(y) = -\frac{d}{2y}\, f'(y)$ for $y > 0$.
(2) Deduce the following form for radial harmonic functions:
$$\varphi(x) = \begin{cases} |x| & \text{if } d = 1, \\ \ln(|x|) & \text{if } d = 2, \\ |x|^{2-d} & \text{if } d \ge 3. \end{cases}$$

Problem 93. We now consider a particular case of Problem 92, namely we fix the dimension $d = 1$.
93.1. Let $a < x < b$ and $\tau = \inf\{t \ge 0;\, B_t \notin (a,b)\}$. Show that
$$P_x(B_\tau = a) = \frac{b-x}{b-a}, \qquad P_x(B_\tau = b) = \frac{x-a}{b-a}.$$
93.2. For $x \in \mathbb{R}$ we set $T_x = \inf\{t \ge 0;\, B_t = x\}$. Prove that $P_x(T_y < \infty) = 1$ for all $x, y \in \mathbb{R}$.
93.3. Let now $s > 0$ and $x, y \in \mathbb{R}$. Show that $P_x(B_t = y \text{ for some } t \ge s) = 1$.
93.4. Let $\mathcal T_y$ be the random set given by the points $t$ such that $B_t = y$. Applying Markov's property, show that $P_x(\mathcal T_y \text{ unbounded}) = 1$.

Problem 94. The situation of Problem 92 is now particularized to the dimension $d = 2$. For $r > 0$ we set $S_r = \inf\{t \ge 0;\, |B_t| = r\}$.
94.1. Let $x \in \mathbb{R}^2$ be such that $0 < r < |x| < R$. Prove that
$$P_x(S_r < S_R) = \frac{\ln(R) - \ln(|x|)}{\ln(R) - \ln(r)}.$$
94.2. Invoking the same kind of arguments as in Problem 93, show that $B$ is recurrent, that is, for any couple $x, y \in \mathbb{R}^2$ and $r > 0$ we have $P_x(T_{B(y,r)} < \infty) = 1$.
94.3. Whenever $x \ne 0$, show that $P_x(T_0 < \infty) = 0$, i.e. the 2-dimensional Brownian motion does not hit points. Hint: for a fixed $R > 0$ we have $(T_0 < S_R) \subset \bigcap_{n \ge 1} (S_{1/n} < S_R)$.

Problem 95. Consider now the case of a Brownian motion in dimension $d \ge 3$. For $r > 0$ we set $S_r = \inf\{t \ge 0;\, |B_t| = r\}$.
95.1. Let $x \in \mathbb{R}^d$ be such that $|x| > r > 0$. Prove that $P_x(S_r < \infty) = (r/|x|)^{d-2}$.
95.2. Let $A_n = \{|B_t| > n^{1/2} \text{ for all } t \ge S_n\}$. Show that $P_x(\limsup_n A_n) = 1$ for all $x \in \mathbb{R}^d$.
95.3. Show that $P_x$-almost surely we have $\lim_{t \to \infty} |B_t| = \infty$, for all $x \in \mathbb{R}^d$.

13. Geometrical Brownian motion

Problem 96. Let $S$ satisfy the following stochastic differential equation:
$$dS_t = S_t (b\, dt + \sigma\, dB_t), \qquad S_0 = 1, \tag{3}$$
where $b$ and $\sigma$ are constants. Let $\tilde S_t = e^{-bt} S_t$.
96.1. Show that $(\tilde S_t)_{t \ge 0}$ is a martingale. Deduce the value of $E[S_t]$ and $E[S_t \mid \mathcal F_s]$ for any couple $(t, s)$.
96.2. Give an expression for the drift term and the diffusion coefficient of $\frac{1}{S}$.
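
A simulation sketch for Problem 96 (not part of the original list): sampling the explicit solution $S_t = \exp((b - \sigma^2/2)t + \sigma B_t)$ of (3) and checking that $E[S_t] = e^{bt}$, as predicted by the martingale property of $\tilde S$.

```python
# Problem 96: sample S_t = exp((b - sigma^2/2) t + sigma B_t) and check E[S_t] = e^{bt}.
import numpy as np

rng = np.random.default_rng(6)
b, sigma, N = 0.3, 0.5, 500_000
for t in (0.5, 1.0, 2.0):
    B_t = rng.standard_normal(N) * np.sqrt(t)
    S_t = np.exp((b - 0.5 * sigma**2) * t + sigma * B_t)
    print(t, S_t.mean(), np.exp(b * t))
```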

96.3. Show that $S_t = \exp\big[ (b - \frac{1}{2}\sigma^2) t + \sigma B_t \big]$ satisfies (3), and that
$$S_T = S_t \exp\big[ (b - \tfrac{1}{2}\sigma^2)(T-t) + \sigma (B_T - B_t) \big]$$
for all $T \ge t$.
96.4. Let $L$ be a process verifying $dL_t = L_t \theta_t\, dB_t$, where $\theta_t$ is an adapted continuous process in $L^2(\Omega \times \mathbb{R}_+)$. We set $Y_t = S_t L_t$. Compute $dY_t$.
96.5. Let $\zeta_t$ be defined by $d\zeta_t = \zeta_t (r\, dt + \theta_t\, dB_t)$. Show that $\zeta_t = L_t e^{rt}$. Compute $d(\zeta^{-1})_t$ and then $d(S \zeta^{-1})_t$. How can we choose $\theta$ in such a way that $S \zeta^{-1}$ is a martingale?

Problem 97. Let $f$ be a bounded measurable function and let $S = (S_t)_{t \ge 0}$ be a process verifying the equation
$$dS_t = S_t (r - f_t)\, dt + \sigma\, dB_t, \qquad S_0 = x \in \mathbb{R}.$$
The following questions are independent.
97.1. Show that $e^{-rt} S_t + \int_0^t f_s\, e^{-rs} S_s\, ds$ is a local martingale.
97.2. Show that
$$S_t = x\, e^{rt - \int_0^t f_u\, du} + \sigma \int_0^t e^{r(t-s) - \int_s^t f_u\, du}\, dB_s$$
is a possible expression for $S$. In the sequel, we work with this formula for $S$.
97.3. Compute the expected value and the variance of $S_t$ for $t \ge 0$.
97.4. Let $T, K > 0$. Compute $E[(S_T - K)^+]$ whenever $f$ is constant.


More information

Spring 2012 Math 541B Exam 1

Spring 2012 Math 541B Exam 1 Spring 2012 Math 541B Exam 1 1. A sample of size n is drawn without replacement from an urn containing N balls, m of which are red and N m are black; the balls are otherwise indistinguishable. Let X denote

More information

Solutions to the Exercises in Stochastic Analysis

Solutions to the Exercises in Stochastic Analysis Solutions to the Exercises in Stochastic Analysis Lecturer: Xue-Mei Li 1 Problem Sheet 1 In these solution I avoid using conditional expectations. But do try to give alternative proofs once we learnt conditional

More information

UPPER DEVIATIONS FOR SPLIT TIMES OF BRANCHING PROCESSES

UPPER DEVIATIONS FOR SPLIT TIMES OF BRANCHING PROCESSES Applied Probability Trust 7 May 22 UPPER DEVIATIONS FOR SPLIT TIMES OF BRANCHING PROCESSES HAMED AMINI, AND MARC LELARGE, ENS-INRIA Abstract Upper deviation results are obtained for the split time of a

More information

Applications of Ito s Formula

Applications of Ito s Formula CHAPTER 4 Applications of Ito s Formula In this chapter, we discuss several basic theorems in stochastic analysis. Their proofs are good examples of applications of Itô s formula. 1. Lévy s martingale

More information

2. (a) What is gaussian random variable? Develop an equation for guassian distribution

2. (a) What is gaussian random variable? Develop an equation for guassian distribution Code No: R059210401 Set No. 1 II B.Tech I Semester Supplementary Examinations, February 2007 PROBABILITY THEORY AND STOCHASTIC PROCESS ( Common to Electronics & Communication Engineering, Electronics &

More information

) ) = γ. and P ( X. B(a, b) = Γ(a)Γ(b) Γ(a + b) ; (x + y, ) I J}. Then, (rx) a 1 (ry) b 1 e (x+y)r r 2 dxdy Γ(a)Γ(b) D

) ) = γ. and P ( X. B(a, b) = Γ(a)Γ(b) Γ(a + b) ; (x + y, ) I J}. Then, (rx) a 1 (ry) b 1 e (x+y)r r 2 dxdy Γ(a)Γ(b) D 3 Independent Random Variables II: Examples 3.1 Some functions of independent r.v. s. Let X 1, X 2,... be independent r.v. s with the known distributions. Then, one can compute the distribution of a r.v.

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

A Short Introduction to Diffusion Processes and Ito Calculus

A Short Introduction to Diffusion Processes and Ito Calculus A Short Introduction to Diffusion Processes and Ito Calculus Cédric Archambeau University College, London Center for Computational Statistics and Machine Learning c.archambeau@cs.ucl.ac.uk January 24,

More information

Convergence at first and second order of some approximations of stochastic integrals

Convergence at first and second order of some approximations of stochastic integrals Convergence at first and second order of some approximations of stochastic integrals Bérard Bergery Blandine, Vallois Pierre IECN, Nancy-Université, CNRS, INRIA, Boulevard des Aiguillettes B.P. 239 F-5456

More information

Exercises Measure Theoretic Probability

Exercises Measure Theoretic Probability Exercises Measure Theoretic Probability 2002-2003 Week 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary

More information

n E(X t T n = lim X s Tn = X s

n E(X t T n = lim X s Tn = X s Stochastic Calculus Example sheet - Lent 15 Michael Tehranchi Problem 1. Let X be a local martingale. Prove that X is a uniformly integrable martingale if and only X is of class D. Solution 1. If If direction:

More information

Properties of an infinite dimensional EDS system : the Muller s ratchet

Properties of an infinite dimensional EDS system : the Muller s ratchet Properties of an infinite dimensional EDS system : the Muller s ratchet LATP June 5, 2011 A ratchet source : wikipedia Plan 1 Introduction : The model of Haigh 2 3 Hypothesis (Biological) : The population

More information

Gaussian vectors and central limit theorem

Gaussian vectors and central limit theorem Gaussian vectors and central limit theorem Samy Tindel Purdue University Probability Theory 2 - MA 539 Samy T. Gaussian vectors & CLT Probability Theory 1 / 86 Outline 1 Real Gaussian random variables

More information

Lectures on Markov Chains

Lectures on Markov Chains Lectures on Markov Chains David M. McClendon Department of Mathematics Ferris State University 2016 edition 1 Contents Contents 2 1 Markov chains 4 1.1 The definition of a Markov chain.....................

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

Name of the Student: Problems on Discrete & Continuous R.Vs

Name of the Student: Problems on Discrete & Continuous R.Vs Engineering Mathematics 05 SUBJECT NAME : Probability & Random Process SUBJECT CODE : MA6 MATERIAL NAME : University Questions MATERIAL CODE : JM08AM004 REGULATION : R008 UPDATED ON : Nov-Dec 04 (Scan

More information

Verona Course April Lecture 1. Review of probability

Verona Course April Lecture 1. Review of probability Verona Course April 215. Lecture 1. Review of probability Viorel Barbu Al.I. Cuza University of Iaşi and the Romanian Academy A probability space is a triple (Ω, F, P) where Ω is an abstract set, F is

More information

1. Aufgabenblatt zur Vorlesung Probability Theory

1. Aufgabenblatt zur Vorlesung Probability Theory 24.10.17 1. Aufgabenblatt zur Vorlesung By (Ω, A, P ) we always enote the unerlying probability space, unless state otherwise. 1. Let r > 0, an efine f(x) = 1 [0, [ (x) exp( r x), x R. a) Show that p f

More information

Lecture 2: Review of Basic Probability Theory

Lecture 2: Review of Basic Probability Theory ECE 830 Fall 2010 Statistical Signal Processing instructor: R. Nowak, scribe: R. Nowak Lecture 2: Review of Basic Probability Theory Probabilistic models will be used throughout the course to represent

More information

ADVANCED PROBABILITY: SOLUTIONS TO SHEET 1

ADVANCED PROBABILITY: SOLUTIONS TO SHEET 1 ADVANCED PROBABILITY: SOLUTIONS TO SHEET 1 Last compiled: November 6, 213 1. Conditional expectation Exercise 1.1. To start with, note that P(X Y = P( c R : X > c, Y c or X c, Y > c = P( c Q : X > c, Y

More information

where r n = dn+1 x(t)

where r n = dn+1 x(t) Random Variables Overview Probability Random variables Transforms of pdfs Moments and cumulants Useful distributions Random vectors Linear transformations of random vectors The multivariate normal distribution

More information

Optimal Stopping and Maximal Inequalities for Poisson Processes

Optimal Stopping and Maximal Inequalities for Poisson Processes Optimal Stopping and Maximal Inequalities for Poisson Processes D.O. Kramkov 1 E. Mordecki 2 September 10, 2002 1 Steklov Mathematical Institute, Moscow, Russia 2 Universidad de la República, Montevideo,

More information

Uniformly Uniformly-ergodic Markov chains and BSDEs

Uniformly Uniformly-ergodic Markov chains and BSDEs Uniformly Uniformly-ergodic Markov chains and BSDEs Samuel N. Cohen Mathematical Institute, University of Oxford (Based on joint work with Ying Hu, Robert Elliott, Lukas Szpruch) Centre Henri Lebesgue,

More information

Stochastic Calculus and Black-Scholes Theory MTH772P Exercises Sheet 1

Stochastic Calculus and Black-Scholes Theory MTH772P Exercises Sheet 1 Stochastic Calculus and Black-Scholes Theory MTH772P Exercises Sheet. For ξ, ξ 2, i.i.d. with P(ξ i = ± = /2 define the discrete-time random walk W =, W n = ξ +... + ξ n. (i Formulate and prove the property

More information

MS 3011 Exercises. December 11, 2013

MS 3011 Exercises. December 11, 2013 MS 3011 Exercises December 11, 2013 The exercises are divided into (A) easy (B) medium and (C) hard. If you are particularly interested I also have some projects at the end which will deepen your understanding

More information

The concentration of a drug in blood. Exponential decay. Different realizations. Exponential decay with noise. dc(t) dt.

The concentration of a drug in blood. Exponential decay. Different realizations. Exponential decay with noise. dc(t) dt. The concentration of a drug in blood Exponential decay C12 concentration 2 4 6 8 1 C12 concentration 2 4 6 8 1 dc(t) dt = µc(t) C(t) = C()e µt 2 4 6 8 1 12 time in minutes 2 4 6 8 1 12 time in minutes

More information

MAT 135B Midterm 1 Solutions

MAT 135B Midterm 1 Solutions MAT 35B Midterm Solutions Last Name (PRINT): First Name (PRINT): Student ID #: Section: Instructions:. Do not open your test until you are told to begin. 2. Use a pen to print your name in the spaces above.

More information

4 Sums of Independent Random Variables

4 Sums of Independent Random Variables 4 Sums of Independent Random Variables Standing Assumptions: Assume throughout this section that (,F,P) is a fixed probability space and that X 1, X 2, X 3,... are independent real-valued random variables

More information

Exercises Measure Theoretic Probability

Exercises Measure Theoretic Probability Exercises Measure Theoretic Probability Chapter 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary family

More information

Stability of Stochastic Differential Equations

Stability of Stochastic Differential Equations Lyapunov stability theory for ODEs s Stability of Stochastic Differential Equations Part 1: Introduction Department of Mathematics and Statistics University of Strathclyde Glasgow, G1 1XH December 2010

More information

BRANCHING PROCESSES 1. GALTON-WATSON PROCESSES

BRANCHING PROCESSES 1. GALTON-WATSON PROCESSES BRANCHING PROCESSES 1. GALTON-WATSON PROCESSES Galton-Watson processes were introduced by Francis Galton in 1889 as a simple mathematical model for the propagation of family names. They were reinvented

More information

Multiple Random Variables

Multiple Random Variables Multiple Random Variables Joint Probability Density Let X and Y be two random variables. Their joint distribution function is F ( XY x, y) P X x Y y. F XY ( ) 1, < x

More information

Exercises: sheet 1. k=1 Y k is called compound Poisson process (X t := 0 if N t = 0).

Exercises: sheet 1. k=1 Y k is called compound Poisson process (X t := 0 if N t = 0). Exercises: sheet 1 1. Prove: Let X be Poisson(s) and Y be Poisson(t) distributed. If X and Y are independent, then X + Y is Poisson(t + s) distributed (t, s > 0). This means that the property of a convolution

More information

Mean-field dual of cooperative reproduction

Mean-field dual of cooperative reproduction The mean-field dual of systems with cooperative reproduction joint with Tibor Mach (Prague) A. Sturm (Göttingen) Friday, July 6th, 2018 Poisson construction of Markov processes Let (X t ) t 0 be a continuous-time

More information

Exercises: sheet 1. k=1 Y k is called compound Poisson process (X t := 0 if N t = 0).

Exercises: sheet 1. k=1 Y k is called compound Poisson process (X t := 0 if N t = 0). Exercises: sheet 1 1. Prove: Let X be Poisson(s) and Y be Poisson(t) distributed. If X and Y are independent, then X + Y is Poisson(t + s) distributed (t, s > 0). This means that the property of a convolution

More information

Markov Chains CK eqns Classes Hitting times Rec./trans. Strong Markov Stat. distr. Reversibility * Markov Chains

Markov Chains CK eqns Classes Hitting times Rec./trans. Strong Markov Stat. distr. Reversibility * Markov Chains Markov Chains A random process X is a family {X t : t T } of random variables indexed by some set T. When T = {0, 1, 2,... } one speaks about a discrete-time process, for T = R or T = [0, ) one has a continuous-time

More information

Stat 516, Homework 1

Stat 516, Homework 1 Stat 516, Homework 1 Due date: October 7 1. Consider an urn with n distinct balls numbered 1,..., n. We sample balls from the urn with replacement. Let N be the number of draws until we encounter a ball

More information

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION We will define local time for one-dimensional Brownian motion, and deduce some of its properties. We will then use the generalized Ray-Knight theorem proved in

More information

STA205 Probability: Week 8 R. Wolpert

STA205 Probability: Week 8 R. Wolpert INFINITE COIN-TOSS AND THE LAWS OF LARGE NUMBERS The traditional interpretation of the probability of an event E is its asymptotic frequency: the limit as n of the fraction of n repeated, similar, and

More information

Branching Processes II: Convergence of critical branching to Feller s CSB

Branching Processes II: Convergence of critical branching to Feller s CSB Chapter 4 Branching Processes II: Convergence of critical branching to Feller s CSB Figure 4.1: Feller 4.1 Birth and Death Processes 4.1.1 Linear birth and death processes Branching processes can be studied

More information

Preliminary Exam: Probability 9:00am 2:00pm, Friday, January 6, 2012

Preliminary Exam: Probability 9:00am 2:00pm, Friday, January 6, 2012 Preliminary Exam: Probability 9:00am 2:00pm, Friday, January 6, 202 The exam lasts from 9:00am until 2:00pm, with a walking break every hour. Your goal on this exam should be to demonstrate mastery of

More information

P i [B k ] = lim. n=1 p(n) ii <. n=1. V i :=

P i [B k ] = lim. n=1 p(n) ii <. n=1. V i := 2.7. Recurrence and transience Consider a Markov chain {X n : n N 0 } on state space E with transition matrix P. Definition 2.7.1. A state i E is called recurrent if P i [X n = i for infinitely many n]

More information

Math 6810 (Probability) Fall Lecture notes

Math 6810 (Probability) Fall Lecture notes Math 6810 (Probability) Fall 2012 Lecture notes Pieter Allaart University of North Texas September 23, 2012 2 Text: Introduction to Stochastic Calculus with Applications, by Fima C. Klebaner (3rd edition),

More information

Characteristic Functions and the Central Limit Theorem

Characteristic Functions and the Central Limit Theorem Chapter 6 Characteristic Functions and the Central Limit Theorem 6.1 Characteristic Functions 6.1.1 Transforms and Characteristic Functions. There are several transforms or generating functions used in

More information

Exercises and Answers to Chapter 1

Exercises and Answers to Chapter 1 Exercises and Answers to Chapter The continuous type of random variable X has the following density function: a x, if < x < a, f (x), otherwise. Answer the following questions. () Find a. () Obtain mean

More information