Problem Sheets for the Lecture Probability Theory
Problem Sheet 1

By (Ω, A, P) we always denote the underlying probability space, unless stated otherwise.

1. Let r > 0, and define f(x) = 1_{[0,∞[}(x) exp(−r x), x ∈ R.
a) Show that p·f is integrable w.r.t. the Lebesgue measure λ_1 for every polynomial p.
b) Compute the integrals ∫ x^k f(x) dλ_1(x) for k = 0, 1, 2.

2. Let f(x, y) = 1/(2π) exp(−(x² + y²)/2) denote the density of the two-dimensional standard normal distribution w.r.t. λ_2.
a) Consider a random vector (U, V) taking values in ]0, 1[² with P_{(U,V)} = 1_{[0,1]²} λ_2. Define T : ]0, 1[² → R² by T(u, v) = √(−2 ln u) · (cos(2π v), sin(2π v)), and put (X, Y) = T(U, V). Show that P_{(X,Y)} = f λ_2.
b) Consider a two-dimensional random vector (X, Y) such that P_X = P_Y = N(0, 1). Prove or disprove P_{(X,Y)} = f λ_2.

3. Let F and F_n with n ∈ N denote non-decreasing functions on R with limits 0 and 1 for x → −∞ and x → ∞, respectively. Assume that F is continuous. Show that pointwise convergence of (F_n)_n towards F implies uniform convergence.

4. Consider the space L_p = L_p([0, 1], B([0, 1]), λ), where 1 ≤ p < ∞ and where λ denotes the Lebesgue measure.
a) Show that C([0, 1]) is dense in L_p. Hint: Exercise 3.3 from Measure and Integration Theory (SS 2016) and algebraic induction.
b) For f ∈ L_p and n ∈ N we define I_{n,k} = [(k−1)/n, k/n[ as well as
f_n = n · Σ_{k=1}^n (∫_{I_{n,k}} f dλ) · 1_{I_{n,k}}.
Show that (f_n) converges to f in L_p.

Due: , 10:00
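The transform T in Exercise 2 a) is the Box-Muller method. As a quick numerical sketch (not part of the sheet; the sample size and seed are arbitrary choices), one can draw (U, V) uniformly, apply T, and compare the first two empirical moments with those of N(0, 1):

```python
import math
import random

def T(u, v):
    # T(u, v) = sqrt(-2 ln u) * (cos(2 pi v), sin(2 pi v))
    r = math.sqrt(-2.0 * math.log(u))
    return r * math.cos(2.0 * math.pi * v), r * math.sin(2.0 * math.pi * v)

random.seed(0)
xs = []
for _ in range(100_000):
    # 1 - random() lies in ]0, 1], which avoids log(0)
    x, y = T(1.0 - random.random(), random.random())
    xs.extend((x, y))

mean = sum(xs) / len(xs)
var = sum((z - mean) ** 2 for z in xs) / len(xs)
# mean should be close to 0 and var close to 1
```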
Problem Sheet 2

1. Suppose that X_n → X P-a.s., and let ε ∈ ]0, 1[. Show that there exists a set A ∈ A with P(A) ≥ 1 − ε and
lim_n sup_{ω ∈ A} |X_n(ω) − X(ω)| = 0.

2. Suppose that X_n → X in probability and X_n ≤ X_{n+1} P-a.s. for every n ∈ N. Show that X_n → X P-a.s.

3. Consider a probability space (Ω, A, P) with a countable set Ω. Prove or disprove that almost sure convergence is equivalent to convergence in probability in this case.

4. Let Ω = ]0, 1[, A = B(Ω), and consider the uniform distribution P on Ω. Define
X(ω) = inf{z ∈ R : ω ≤ F(z)}, ω ∈ ]0, 1[,
for any distribution function F.
a) Show that F is the distribution function of P_X.
b) Let U be uniformly distributed on ]0, 1[. Determine a measurable mapping T : ]0, 1[ → [0, ∞[ such that T ∘ U is exponentially distributed with parameter λ > 0.

Due: , 10:00
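Exercise 4 b) is inverse-transform sampling: for F(z) = 1 − exp(−λz), the generalized inverse from part a) is T(u) = −ln(1 − u)/λ. A brief simulation sketch (the value λ = 2, the sample size, and the seed are arbitrary choices) compares the empirical mean with 1/λ:

```python
import math
import random

def T(u, lam):
    # generalized inverse of F(z) = 1 - exp(-lam * z) for u in [0, 1[
    return -math.log(1.0 - u) / lam

random.seed(1)
lam = 2.0
sample = [T(random.random(), lam) for _ in range(200_000)]
mean = sum(sample) / len(sample)
# the exponential distribution with parameter lam has mean 1/lam
```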
Problem Sheet 3

1. Let Q_n denote the binomial distribution with parameters n ∈ N and p_n ∈ ]0, 1[ such that λ = lim_n n·p_n ∈ ]0, ∞[. Show that (Q_n)_{n∈N} converges weakly to the Poisson distribution with parameter λ.

2. Let f_n and f denote probability densities w.r.t. the Lebesgue measure λ_1.
a) Suppose that f_n converges to f λ_1-almost everywhere. Show that f_n λ_1 converges weakly to f λ_1.
b) Provide an example where f_n λ_1 converges weakly to f λ_1 but f_n converges to f only on a set of Lebesgue measure zero.

3. Consider the set
P = { Σ_{k=1}^n λ_k ε_{x_k} : n ∈ N, λ_k > 0, Σ_{k=1}^n λ_k = 1, x_k ∈ R }
of probability measures on (R, B). Prove that P is dense in the set of all probability measures on (R, B) w.r.t. weak convergence, i.e., for every probability measure µ on (R, B) a suitable sequence in P converges weakly to µ.

4. Give a proof of the implication (v) ⇒ (i) in the Portemanteau Theorem. Hint: Employ a suitable partition of the range of f ∈ C_b(M).

Due: , 10:00
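A numerical illustration of Exercise 1 (a sketch only; the choice λ = 3 and the values of n are arbitrary): the L¹ distance between the weights of B(n, λ/n) and of the Poisson distribution π(λ) shrinks as n grows.

```python
import math

def binom_pmf(n, p, k):
    return math.comb(n, k) * p ** k * (1.0 - p) ** (n - k)

def poisson_pmf(lam, k):
    # computed via logarithms to avoid float overflow for large k
    return math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))

lam = 3.0
dists = {}
for n in (10, 100, 1000):
    p = lam / n  # so that n * p_n = lam for every n
    dists[n] = sum(abs(binom_pmf(n, p, k) - poisson_pmf(lam, k))
                   for k in range(n + 1))
# dists[n] decreases towards 0 as n grows
```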
Problem Sheet 4

1. Let Q_n = N(µ_n, σ_n²), where µ_n ∈ R and σ_n ≥ 0. By definition, N(µ, 0) = ε_µ.
a) Show that (Q_n)_{n∈N} converges weakly iff (µ_n)_{n∈N} and (σ_n)_{n∈N} converge. Determine the weak limit in case of convergence.
b) Characterize the tightness of the set {Q_n : n ∈ N}.

2. Consider random variables X_n, X, Y_n, Y on a common probability space. Prove or disprove:
a) X_n → X in distribution and Y_n → Y in distribution imply X_n + Y_n → X + Y in distribution.
b) X, X_n ∈ L_1, X_n → X in probability imply that (E(X_n))_{n∈N} converges with lim_n E(X_n) = E(X).
c) (X_n)_{n∈N} u.i., X ∈ L_1, X_n → X in probability imply X_n → X in L_1.

3. Let (Ω, A) denote a measurable space, and let M(Ω) denote the set of all probability measures on A.
a) Show that
d(P, Q) = sup_{A ∈ A} |P(A) − Q(A)|, P, Q ∈ M(Ω),
defines a metric on M(Ω).
b) Prove or disprove the continuity of the mapping ]0, ∞[ → M(R), λ ↦ π(λ), where π(λ) denotes the Poisson distribution with parameter λ.
c) Assume that Ω is a complete separable metric space and that A is the corresponding Borel σ-algebra. Characterize those spaces Ω where convergence w.r.t. d is equivalent to weak convergence.

4. Let Ω_i = {0, 1}, A_i = P(Ω_i), and P_i = p ε_1 + (1 − p) ε_0 for p ∈ ]0, 1[ and i ∈ N. Consider the corresponding product space (Ω, A, P). Determine the distribution of the random variable
X_n : Ω → {0, ..., n}, ω ↦ |{i ∈ {1, ..., n} : ω_i = 1}|.
Construct a random variable on (Ω, A, P) that is geometrically distributed with parameter p. Construct random variables X and Y on (Ω, A, P) that do not coincide almost surely but have the same distribution.

Due: , 10:00
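For the constructions in Exercise 4, a simulation sketch (the helper name `first_success`, the value p = 0.3, the sample size, and the seed are arbitrary): the waiting time for the first coordinate equal to 1 depends on only finitely many coin tosses almost surely, and its empirical mean should approach 1/p, the mean of the geometric distribution on {1, 2, ...}.

```python
import random

random.seed(4)
p = 0.3

def first_success():
    # G(omega) = min{i >= 1 : omega_i = 1} for an i.i.d. coin-toss
    # sequence omega with P({omega_i = 1}) = p
    i = 1
    while random.random() >= p:
        i += 1
    return i

sample = [first_success() for _ in range(100_000)]
mean = sum(sample) / len(sample)
# the geometric distribution on {1, 2, ...} with parameter p has mean 1/p
```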
Problem Sheet 5

1. Consider the probability space from Exercise 4.4 and define a sequence of random variables, which take values in Z = {0, ..., m − 1}, given by X_0(ω) = 0 and
X_n(ω) = X_{n−1}(ω) + 1 mod m, if ω_n = 1,
X_n(ω) = X_{n−1}(ω) − 1 mod m, if ω_n = 0.
Put p_{n,k} = P({X_n = k}) for n ∈ N_0 and k ∈ Z. Derive a recursive formula for (p_n)_{n∈N_0}, and compute initial segments of this sequence numerically for different values of m. Formulate conjectures concerning the convergence of (X_n)_{n∈N_0} and (E(X_n))_{n∈N_0}.

2. Put I = ]0, 1[, and let B ∈ B(I^k) with λ_k(B) > 0. The probability measure ν on B(I^k) that is defined by ν(A) = λ_k(A ∩ B)/λ_k(B) is called the uniform distribution on B. Consider the simplex D = {(x, y) ∈ I² : x + y < 1}, and let K(x, ·) denote the uniform distribution on the section D(x) for x ∈ I.
a) Show that K is a Markov kernel from I to B(I). Use a random number generator to simulate the distribution µ K with different choices of probability measures µ on B(I). In particular, take the uniform distribution µ on I.
b) Determine a probability measure µ on B(I) such that µ K is the uniform distribution on D. Compute the expectation ∫_{I²} y d(µ K)(x, y).

3. Consider a queue where, per time step, n new customers arrive with probability b_n for n ∈ N_0, and, in case of a non-empty queue, the customer at the head of the queue is served and leaves.
a) Choose an appropriate measurable space to model the lengths of the queue at all times i ∈ N_0. Define the corresponding transition kernel.
b) Suppose that initially the queue is empty. Derive a recursive formula for the probability of length k ∈ N_0 of the queue at time i ∈ N_0. Derive a formula for the probability of lengths (k_1, ..., k_i) of the queue at times 1, ..., i.

4. Let X_1 and X_2 be independent with P_{X_i} = 1/2 (ε_{−1} + ε_1) for i = 1, 2. Verify that (X_1, X_2, X_1 X_2) is pairwise independent but not independent.

Due: , 10:00
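For Exercise 1, conditioning on the last step suggests the recursion p_{n,k} = p·p_{n−1,(k−1) mod m} + (1 − p)·p_{n−1,(k+1) mod m}, with p = P({ω_n = 1}). A numerical sketch (not the official solution; m = 5, p = 1/2, and 200 steps are arbitrary choices):

```python
def step(prev, p, m):
    # p_{n,k} = p * p_{n-1,(k-1) mod m} + (1 - p) * p_{n-1,(k+1) mod m}
    return [p * prev[(k - 1) % m] + (1 - p) * prev[(k + 1) % m]
            for k in range(m)]

m, p = 5, 0.5
dist = [1.0] + [0.0] * (m - 1)  # X_0 = 0
for _ in range(200):
    dist = step(dist, p, m)
print([round(q, 3) for q in dist])
```

For odd m the sequence (p_n) tends to the uniform weights (1/m, ..., 1/m), which suggests conjectures about (E(X_n)); for even m the walk has period 2 and the weights oscillate.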
Problem Sheet 6

1. Consider random variables X_n, X, Y_n, Y on a common probability space, and assume independence of (X_n, Y_n) for every n as well as independence of (X, Y). Prove:
X_n → X in distribution and Y_n → Y in distribution imply X_n + Y_n → X + Y in distribution.
Cf. Exercise 4.2.a).

2. a) Show that X and Y are independent iff
E(f∘X − g∘Y)² ≥ E(f∘X − E(f∘X))²
for all Borel-measurable mappings f, g : R → R with f∘X, g∘Y ∈ L_2. Interpretation in terms of prediction problems?
b) Construct a probability space together with square-integrable random variables X and Y on this space such that X and Y are uncorrelated but not independent, and
E(X − g∘Y)² > E(X − E(X))²
for every measurable mapping g that is different from the constant mapping E(X).

3. Verify the facts that are stated in Remark III.

4. Let (X_n)_{n∈N} be i.i.d. with X_1 ∈ L_2. Put S_n = Σ_{i=1}^n X_i. Show that
1/(n−1) Σ_{i=1}^n (X_i − S_n/n)² → Var(X_1) P-a.s.,
i.e., the sample variance converges almost surely to the population variance. (Use the Strong Law of Large Numbers.)

Due: , 10:00
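A quick check of Exercise 4 by simulation (a sketch; Uniform(0, 1) variables with Var(X_1) = 1/12, the sample size, and the seed are arbitrary choices):

```python
import random

random.seed(6)
xs = [random.random() for _ in range(200_000)]  # i.i.d. Uniform(0, 1)
n = len(xs)
s_n = sum(xs)
# sample variance 1/(n-1) * sum (X_i - S_n/n)^2
sample_var = sum((x - s_n / n) ** 2 for x in xs) / (n - 1)
# should be close to Var(X_1) = 1/12
```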
Problem Sheet 7

1. Consider a set D ∈ B_d with 0 < λ_d(D) < ∞, a square-integrable function f : D → R, and a sequence (U_n)_{n∈N} that is i.i.d. with U_1 being uniformly distributed on D. Put
a = ∫_D f(x) dx, M_n = λ_d(D)/n · Σ_{k=1}^n f∘U_k, Δ_n = a − M_n.
a) Show that E(Δ_n²) ≤ λ_d(D)/n · ∫_D f²(x) dx.
b) Suppose you only know a constant c > 0 such that |f(x)| ≤ c for every x ∈ D. Determine an integer n_0 such that P({|Δ_n| ≥ 10^−2}) ≤ 10^−4 for every n ≥ n_0.
c) Let α ∈ ]0, 1/2[. Show that lim_n n^α Δ_n = 0 with probability one.

2. Consider the situation of Remark III. Prove the Glivenko-Cantelli Theorem. Hint: For ε > 0, choose N ∈ N with N ≥ 1/ε and consider the distribution function F at the knots x_k = inf{x ∈ R : F(x) ≥ k/N} for k = 1, ..., N − 1. Cf. Exercise .

3. Consider an independent sequence (X_n)_{n∈N} of random variables with
P({X_n = 0}) = 1 − 1/(n log(n + 1)), P({X_n = ±n}) = 1/(2n log(n + 1)).
Show that (X_1 + ... + X_n)/n does not converge almost surely. Hint: Consider the event lim sup_n {|X_n| = n}.

4. Use characteristic functions to verify the following facts.
a) The sum of independent, normally distributed random variables is normally distributed, too.
b) For random variables X_n, X, Y_n, Y such that (X_n, Y_n) is independent for every n and (X, Y) is independent we have
X_n → X in distribution and Y_n → Y in distribution imply X_n + Y_n → X + Y in distribution.
Cf. Exercise 6.1.

Due: , 10:00
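The estimator M_n of Exercise 1 is the standard Monte Carlo method. A sketch with the arbitrary choices D = ]0, 1[ (so λ(D) = 1), f(x) = x², hence a = 1/3:

```python
import random

random.seed(7)

def f(x):
    return x * x

n = 100_000
# M_n = lambda(D)/n * sum of f(U_k), with lambda(D) = 1 here
m_n = sum(f(random.random()) for _ in range(n)) / n
delta = abs(1.0 / 3.0 - m_n)  # |a - M_n| with a = 1/3
```

Part a) bounds E(Δ_n²) here by (1/n)·∫_0^1 x⁴ dx = 1/(5n), so the error is typically of order n^{−1/2}.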
Problem Sheet 8

1. a) Let µ_n = B(n, p_n) denote a binomial distribution with parameters n ∈ N and p_n ∈ ]0, 1[. Assume that lim_n n·p_n = λ > 0. Use Fourier transforms to show that µ_n → π(λ) weakly, where π(λ) denotes a Poisson distribution with parameter λ.
b) Consider probability measures µ and µ_n on B_k as well as probability measures ν and ν_n on B_l such that µ_n → µ weakly as well as ν_n → ν weakly. Conclude that µ_n ⊗ ν_n → µ ⊗ ν weakly.

2. Implement a program to visualize the Central Limit Theorem. Which of the limit theorems are in action?

3. Construct an independent sequence of random variables X_n such that
(i) X_n ∈ L_2 and E(X_n) = 0 for every n ∈ N,
(ii) the distributions P_{S_n} converge weakly to N(0, 1), where S_n = Σ_{i=1}^n X_i / (Σ_{i=1}^n Var(X_i))^{1/2}, and
(iii) the random variables X_{n,k} = X_k / (Σ_{i=1}^n Var(X_i))^{1/2} are not asymptotically negligible.

4. Let X and Y be independent and identically distributed random variables with 0 < Var(X) < ∞. Moreover, let Z = (X + Y)/a where a > 0. Show that P_Z = P_X implies
(i) a = √2,
(ii) (ϕ_X(x/2^n))^{2^{2n}} = ϕ_X(x) for all x ∈ R and n ∈ N,
(iii) X ~ N(0, Var(X)).

Due: , 10:00
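A minimal version of the program asked for in Exercise 2 (a sketch; the Uniform(0, 1) summands, n = 30, the repetition count, and the seed are arbitrary choices): standardize S_n and compare the empirical CDF with the standard normal CDF Φ at a few points.

```python
import math
import random

random.seed(8)

def standardized_sum(n):
    # (S_n - n/2) / sqrt(n/12) for i.i.d. Uniform(0, 1) summands
    s = sum(random.random() for _ in range(n))
    return (s - 0.5 * n) / math.sqrt(n / 12.0)

def Phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

n, reps = 30, 20_000
sample = [standardized_sum(n) for _ in range(reps)]
errs = {x: abs(sum(1 for s in sample if s <= x) / reps - Phi(x))
        for x in (-1.0, 0.0, 1.0)}
# each entry of errs should be small
```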
Problem Sheet 9

1. Consider a k-dimensional random vector whose distribution has a density with respect to the Lebesgue measure. State and prove a generalization of Theorem III.7.1 in this context.

2. Consider a normally distributed k-dimensional random vector Y on (Ω, A, P) with zero mean, which is not constant almost surely. Show that there exists an integer l ≤ k, a matrix L ∈ R^{k×l} of full rank, and an l-dimensional standard normally distributed random vector X on (Ω, A, P) such that Y = LX. Note the difference between this fact and the result that is established in the proof of Theorem III.

3. Let X = (X_1, X_2) be a two-dimensional random vector whose components X_1 and X_2 are standard normally distributed. Prove or disprove that X is normally distributed.

4. Let X^{(j)} = (X_t^{(j)})_{t∈[0,∞[} with j ∈ J denote stochastic processes on (Ω, A, P) with state space (Ω′, A′). Show that (X^{(j)})_{j∈J} is independent iff
((X_{t_1}^{(j)}, ..., X_{t_m}^{(j)}))_{j∈J_0}
is independent for every finite set J_0 ⊆ J and all m ∈ N and 0 ≤ t_1 < ... < t_m.

Due: , 10:00
Problem Sheet 10

1. Let M = C([0, ∞[) be equipped with the metric ρ that is introduced in Section IV.1, and let µ be the Wiener measure on B(M). Put
Z(f) = ∫_0^1 f(t) dt, f ∈ M.
a) Show that Z : M → R is B(M)-B-measurable.
b) By
d((t, f), (s, g)) = max(|t − s|, ρ(f, g))
we define a metric on [0, ∞[ × M. Show that (t, f) ↦ f(t) defines a B([0, ∞[ × M)-B-measurable mapping [0, ∞[ × M → R and that B([0, ∞[ × M) = B([0, ∞[) ⊗ B(M).
c) Show that E(Z) = 0 and Var(Z) = 1/3. Hint: Fubini's Theorem.
d) Determine the distribution of Z. Hint: Consider Riemann sums.

2. a) Let X = (X_1, ..., X_k) denote a k-dimensional normally distributed random vector such that Cov(X_i, X_j) = 0 for 1 ≤ i ≤ n < j ≤ k. Show that (X_1, ..., X_n) and (X_{n+1}, ..., X_k) are independent.
b) Construct a two-dimensional random vector X = (X_1, X_2) such that X_1 and X_2 are normally distributed and uncorrelated but not independent.

In the sequel, X = (X_t)_{t∈[0,∞[} denotes a one-dimensional Brownian motion on (Ω, A, P).

3. Let T > 0 as well as Y_t = X_{T+t} − X_T. Show that Y = (Y_t)_{t∈[0,∞[} is a Brownian motion. Moreover, show that (X_t)_{t∈[0,T]} and Y are independent.

4. a) Let Z = sup_{0≤t≤T} |X_t|. Show that Z is A-B-measurable and P({Z ≥ u}) ≤ T/u² for every u > 0. Hint: Kolmogorov's inequality.
b) Show that lim_{t→∞} t^{−α} X_t = 0 P-a.s. for every α > 1/2. Hint: Use a), Exercise 3, and the strong law of large numbers.
c) Conclude that Y = (Y_t)_{t∈[0,T]} with Y_t = t X_{1/t} for t > 0 and Y_0 = 0 is a Brownian motion.

Due: , 10:00
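For Exercise 1 c)-d), a simulation sketch (a random-walk discretization of Brownian motion; the grid size, the number of repetitions, and the seed are arbitrary choices) that estimates E(Z) and Var(Z) via Riemann sums:

```python
import math
import random

random.seed(10)

def integral_of_bm(n):
    # right-endpoint Riemann sum of t -> W_t over a grid of n steps on [0, 1]
    dt = 1.0 / n
    w = total = 0.0
    for _ in range(n):
        w += random.gauss(0.0, math.sqrt(dt))
        total += w * dt
    return total

reps = 20_000
zs = [integral_of_bm(200) for _ in range(reps)]
mean = sum(zs) / reps
var = sum((z - mean) ** 2 for z in zs) / reps
# by c): E(Z) = 0 and Var(Z) = 1/3
```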
Problem Sheet 11

1. Let X = (X_t)_{t∈[0,T]} with T > 0 be a one-dimensional Brownian motion. Moreover, define
Y_t = t/T · X_T, Z_t = X_t − Y_t for t ∈ [0, T].
a) Show that (Y_t)_{t∈[0,T]} and (Z_t)_{t∈[0,T]} are independent.
b) Show that every finite-dimensional marginal distribution of Z is a normal distribution, as well as E(Z_t) = 0 and Cov(Z_s, Z_t) = s (T − t)/T for 0 ≤ s ≤ t ≤ T.
c) Implement a program to simulate the distribution of (Y_0, Y_{1/n}, ..., Y_1, Z_0, Z_{1/n}, ..., Z_1) for any n ∈ N. For visualization use piecewise linear interpolation of U = (Y_0, Y_{1/n}, ..., Y_1), V = (Z_0, Z_{1/n}, ..., Z_1), as well as U + V.

2. Provide a proof of the Radon-Nikodym Theorem for two finite measures ν and µ such that ν ≪ µ by reducing this case to the statement that was established in the lecture.

3. Let (Ω, A, P) = ([0, 1], B([0, 1]), λ) with λ denoting the Lebesgue measure. Put
X(ω) = 2ω², if ω < 1/2, X(ω) = 2 − 2ω, otherwise, and Y(ω) = |2ω − 1|,
for ω ∈ [0, 1]. Determine E(X | Y) and E(X | Y = y) for y ∈ [0, 1].

4. a) Let X and Y be i.i.d. with E(|X|) < ∞. Determine E(X | X + Y).
b) Consider an i.i.d. sequence X_1, X_2, .... Determine
E(X_1 | σ({S_n, S_{n+1}, ...})),
where S_k = Σ_{i=1}^k X_i.

Due: , 10:00
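A simulation sketch for Exercise 1 b) (T = 1; the grid, the repetition count, the points s = 0.3 and t = 0.7, and the seed are arbitrary choices): estimate Cov(Z_s, Z_t), which should approach s (T − t)/T = 0.09, and Cov(Y_t, Z_s), which should approach 0.

```python
import math
import random

random.seed(11)
T, n = 1.0, 10
dt = T / n

def bm_grid():
    # values of X at the grid points 0, T/n, ..., T (exact in distribution)
    w, path = 0.0, [0.0]
    for _ in range(n):
        w += random.gauss(0.0, math.sqrt(dt))
        path.append(w)
    return path

i, j = 3, 7  # s = 0.3, t = 0.7
reps = 40_000
zz = yz = 0.0
for _ in range(reps):
    x = bm_grid()
    z_s = x[i] - (i * dt / T) * x[-1]  # Z_s = X_s - (s/T) X_T
    z_t = x[j] - (j * dt / T) * x[-1]
    y_t = (j * dt / T) * x[-1]         # Y_t = (t/T) X_T
    zz += z_s * z_t
    yz += y_t * z_s
cov_zz = zz / reps  # target: s (T - t) / T = 0.09
cov_yz = yz / reps  # target: 0
```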
Problem Sheet 12

1. In the sequel (X_0, ..., X_k) denotes a (k + 1)-dimensional normally distributed random vector.
a) Assume that (X_1, ..., X_k) is independent, and put I = {i ∈ {1, ..., k} : σ²(X_i) > 0}. Show that
E(X_0 | (X_1, ..., X_k)) = Σ_{i∈I} Cov(X_0, X_i)/σ²(X_i) · (X_i − E(X_i)) + E(X_0).
Hint: Exercise .
b) Determine E(X_0 | (X_1, ..., X_k)) without any further assumption on (X_1, ..., X_k).
c) Consider a one-dimensional Brownian motion (Y_t)_{t∈[0,∞[}. Determine E(Y_t | Z) for t ≥ 0, where Z = Y_1 or Z = ∫_0^1 Y_s ds.

2. Consider the setting from Exercise . At first, determine the regular conditional distribution P_{X|Y}, and thereafter determine the regular conditional distribution P_{id|Y}.

3. A point is picked uniformly at random from the surface of the unit sphere. With θ and φ denoting the longitude and latitude, respectively, determine the regular conditional distribution of θ given φ and of φ given θ, respectively.

4. a) Consider a probability space (Ω, A, P) with independent sub-σ-algebras G_1, G_2 ⊆ A. Let Y : Ω → R be G_1-measurable and f : R × Ω → R be B ⊗ G_2-measurable and bounded. Show that the mapping g : R → R defined by
g(y) = ∫_Ω f(y, ·) dP for all y ∈ R
is well-defined and that
E(f(Y(·), ·) | G_1) = g ∘ Y.
Hint: Algebraic induction; at first consider suitable mappings of the form f(y, ω) = 1_A(y) 1_B(ω).
b) Let (W_t)_{t∈[0,∞[} be a Brownian motion, and let 0 ≤ s < t as well as Γ ∈ B. Use part a) to determine P({W_t ∈ Γ} | F_s^W).

Due: , 10:00
Problem Sheet 13

1. Construct random variables X, Y ∈ L_2 on a common probability space and a measurable mapping ϕ : R → R such that
E(|X − ϕ∘Y|) < E(|X − E(X | Y)|).

2. Consider the situation of Example V.3.8. Moreover, we assume that (Y_n)_{n∈N} is identically distributed with P_{Y_1} = 1/2 (ε_{−1} + ε_1). For α, β ∈ Z with α < 0 < β we set τ_α = min{t ∈ N_0 : X_t = α} and τ_β = min{t ∈ N_0 : X_t = β}. Show that P(τ_α < ∞) = P(τ_β < ∞) = 1, and
P(τ_α < τ_β) = β/(β − α) and P(τ_β < τ_α) = −α/(β − α).

3. A martingale X is called square-integrable if X_t ∈ L_2 for all t ∈ I. Show that every square-integrable martingale has uncorrelated increments over disjoint intervals.

4. a) Let (Y_s)_{s∈N} be an i.i.d. sequence of random variables such that E(|Y_1|) < ∞. Moreover, let F_0 = {∅, Ω} and F_t = σ({Y_1, ..., Y_t}) for t ∈ N. For any stopping time τ w.r.t. (F_t)_{t∈N} we put
Z = Σ_{s=1}^τ Y_s.
Show that Z ∈ L_1 and
E(Z) = E(Y_1) E(τ),
if E(τ) < ∞.
b) Consider the particular case
P_{Y_1} = 1/2 (ε_{−1} + ε_1) and τ = inf{t ∈ N : Σ_{s=1}^t Y_s = 1}.
Determine E(τ).

Due: , 10:00
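The hitting probabilities stated in Exercise 2 can be checked by simulation (a sketch; α = −3, β = 2, the repetition count, and the seed are arbitrary choices):

```python
import random

random.seed(13)
alpha, beta = -3, 2

def reaches_beta_first():
    # run X_t = Y_1 + ... + Y_t until it hits alpha or beta
    x = 0
    while alpha < x < beta:
        x += 1 if random.random() < 0.5 else -1
    return x == beta

reps = 100_000
p_beta = sum(reaches_beta_first() for _ in range(reps)) / reps
# target: P(tau_beta < tau_alpha) = -alpha / (beta - alpha) = 0.6
```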
More informationP. Billingsley, Probability and Measure, Wiley, New York, P. Gänssler, W. Stute, Wahrscheinlichkeitstheorie, Springer, Berlin, 1977.
Probability Theory Klaus Ritter TU Kaiserslautern, WS 2017/18 Literature In particular, H. Bauer, Probability Theory, de Gruyter, Berlin, 1996. P. Billingsley, Probability and Measure, Wiley, New York,
More information1 Math 285 Homework Problem List for S2016
1 Math 85 Homework Problem List for S016 Note: solutions to Lawler Problems will appear after all of the Lecture Note Solutions. 1.1 Homework 1. Due Friay, April 8, 016 Look at from lecture note exercises:
More informationTopic 7: Convergence of Random Variables
Topic 7: Convergence of Ranom Variables Course 003, 2016 Page 0 The Inference Problem So far, our starting point has been a given probability space (S, F, P). We now look at how to generate information
More informationTransforms. Convergence of probability generating functions. Convergence of characteristic functions functions
Transforms For non-negative integer value ranom variables, let the probability generating function g X : [0, 1] [0, 1] be efine by g X (t) = E(t X ). The moment generating function ψ X (t) = E(e tx ) is
More informationLecture 1: Review of Basic Asymptotic Theory
Lecture 1: Instructor: Department of Economics Stanfor University Prepare by Wenbo Zhou, Renmin University Basic Probability Theory Takeshi Amemiya, Avance Econometrics, 1985, Harvar University Press.
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationSelf-normalized Martingale Tail Inequality
Online-to-Confience-Set Conversions an Application to Sparse Stochastic Banits A Self-normalize Martingale Tail Inequality The self-normalize martingale tail inequality that we present here is the scalar-value
More information(A n + B n + 1) A n + B n
344 Problem Hints and Solutions Solution for Problem 2.10. To calculate E(M n+1 F n ), first note that M n+1 is equal to (A n +1)/(A n +B n +1) with probability M n = A n /(A n +B n ) and M n+1 equals
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationRandom Process Lecture 1. Fundamentals of Probability
Random Process Lecture 1. Fundamentals of Probability Husheng Li Min Kao Department of Electrical Engineering and Computer Science University of Tennessee, Knoxville Spring, 2016 1/43 Outline 2/43 1 Syllabus
More informationLogarithmic spurious regressions
Logarithmic spurious regressions Robert M. e Jong Michigan State University February 5, 22 Abstract Spurious regressions, i.e. regressions in which an integrate process is regresse on another integrate
More informationLecture 5. Symmetric Shearer s Lemma
Stanfor University Spring 208 Math 233: Non-constructive methos in combinatorics Instructor: Jan Vonrák Lecture ate: January 23, 208 Original scribe: Erik Bates Lecture 5 Symmetric Shearer s Lemma Here
More informationPart II Probability and Measure
Part II Probability and Measure Theorems Based on lectures by J. Miller Notes taken by Dexter Chua Michaelmas 2016 These notes are not endorsed by the lecturers, and I have modified them (often significantly)
More informationExercises: sheet 1. k=1 Y k is called compound Poisson process (X t := 0 if N t = 0).
Exercises: sheet 1 1. Prove: Let X be Poisson(s) and Y be Poisson(t) distributed. If X and Y are independent, then X + Y is Poisson(t + s) distributed (t, s > 0). This means that the property of a convolution
More informationProbability: Handout
Probability: Handout Klaus Pötzelberger Vienna University of Economics and Business Institute for Statistics and Mathematics E-mail: Klaus.Poetzelberger@wu.ac.at Contents 1 Axioms of Probability 3 1.1
More informationSTAT 7032 Probability Spring Wlodek Bryc
STAT 7032 Probability Spring 2018 Wlodek Bryc Created: Friday, Jan 2, 2014 Revised for Spring 2018 Printed: January 9, 2018 File: Grad-Prob-2018.TEX Department of Mathematical Sciences, University of Cincinnati,
More informationLecture Introduction. 2 Examples of Measure Concentration. 3 The Johnson-Lindenstrauss Lemma. CS-621 Theory Gems November 28, 2012
CS-6 Theory Gems November 8, 0 Lecture Lecturer: Alesaner Mąry Scribes: Alhussein Fawzi, Dorina Thanou Introuction Toay, we will briefly iscuss an important technique in probability theory measure concentration
More information1 Stat 605. Homework I. Due Feb. 1, 2011
The first part is homework which you need to turn in. The second part is exercises that will not be graded, but you need to turn it in together with the take-home final exam. 1 Stat 605. Homework I. Due
More informationREAL ANALYSIS I HOMEWORK 5
REAL ANALYSIS I HOMEWORK 5 CİHAN BAHRAN The questions are from Stein an Shakarchi s text, Chapter 3. 1. Suppose ϕ is an integrable function on R with R ϕ(x)x = 1. Let K δ(x) = δ ϕ(x/δ), δ > 0. (a) Prove
More informationExercises: sheet 1. k=1 Y k is called compound Poisson process (X t := 0 if N t = 0).
Exercises: sheet 1 1. Prove: Let X be Poisson(s) and Y be Poisson(t) distributed. If X and Y are independent, then X + Y is Poisson(t + s) distributed (t, s > 0). This means that the property of a convolution
More informationIntroduction to the Vlasov-Poisson system
Introuction to the Vlasov-Poisson system Simone Calogero 1 The Vlasov equation Consier a particle with mass m > 0. Let x(t) R 3 enote the position of the particle at time t R an v(t) = ẋ(t) = x(t)/t its
More informationunder the null hypothesis, the sign test (with continuity correction) rejects H 0 when α n + n 2 2.
Assignment 13 Exercise 8.4 For the hypotheses consiere in Examples 8.12 an 8.13, the sign test is base on the statistic N + = #{i : Z i > 0}. Since 2 n(n + /n 1) N(0, 1) 2 uner the null hypothesis, the
More informationProbability and Measure
Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability
More informationPDE Notes, Lecture #11
PDE Notes, Lecture # from Professor Jalal Shatah s Lectures Febuary 9th, 2009 Sobolev Spaces Recall that for u L loc we can efine the weak erivative Du by Du, φ := udφ φ C0 If v L loc such that Du, φ =
More informationBrownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539
Brownian motion Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Brownian motion Probability Theory
More informationNOTES ON EULER-BOOLE SUMMATION (1) f (l 1) (n) f (l 1) (m) + ( 1)k 1 k! B k (y) f (k) (y) dy,
NOTES ON EULER-BOOLE SUMMATION JONATHAN M BORWEIN, NEIL J CALKIN, AND DANTE MANNA Abstract We stuy a connection between Euler-MacLaurin Summation an Boole Summation suggeste in an AMM note from 196, which
More information6. Brownian Motion. Q(A) = P [ ω : x(, ω) A )
6. Brownian Motion. stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ, P) and a real valued stochastic process can be defined
More informationLevy Process and Infinitely Divisible Law
Stat205B: Probability Theory (Spring 2003) Lecture: 26 Levy Process an Infinitely Divisible Law Lecturer: James W. Pitman Scribe: Bo Li boli@stat.berkeley.eu Levy Processes an Infinitely Divisible Law
More informationExercises Measure Theoretic Probability
Exercises Measure Theoretic Probability 2002-2003 Week 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary
More informationX n D X lim n F n (x) = F (x) for all x C F. lim n F n(u) = F (u) for all u C F. (2)
14:17 11/16/2 TOPIC. Convergence in distribution and related notions. This section studies the notion of the so-called convergence in distribution of real random variables. This is the kind of convergence
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationA D VA N C E D P R O B A B I L - I T Y
A N D R E W T U L L O C H A D VA N C E D P R O B A B I L - I T Y T R I N I T Y C O L L E G E T H E U N I V E R S I T Y O F C A M B R I D G E Contents 1 Conditional Expectation 5 1.1 Discrete Case 6 1.2
More information{σ x >t}p x. (σ x >t)=e at.
3.11. EXERCISES 121 3.11 Exercises Exercise 3.1 Consider the Ornstein Uhlenbeck process in example 3.1.7(B). Show that the defined process is a Markov process which converges in distribution to an N(0,σ
More informationLecture 5: Expectation
Lecture 5: Expectation 1. Expectations for random variables 1.1 Expectations for simple random variables 1.2 Expectations for bounded random variables 1.3 Expectations for general random variables 1.4
More informationProbability Theory. Richard F. Bass
Probability Theory Richard F. Bass ii c Copyright 2014 Richard F. Bass Contents 1 Basic notions 1 1.1 A few definitions from measure theory............. 1 1.2 Definitions............................. 2
More informationP (A G) dp G P (A G)
First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume
More informationA note on asymptotic formulae for one-dimensional network flow problems Carlos F. Daganzo and Karen R. Smilowitz
A note on asymptotic formulae for one-imensional network flow problems Carlos F. Daganzo an Karen R. Smilowitz (to appear in Annals of Operations Research) Abstract This note evelops asymptotic formulae
More informationJointly continuous distributions and the multivariate Normal
Jointly continuous istributions an the multivariate Normal Márton alázs an álint Tóth October 3, 04 This little write-up is part of important founations of probability that were left out of the unit Probability
More information) ) = γ. and P ( X. B(a, b) = Γ(a)Γ(b) Γ(a + b) ; (x + y, ) I J}. Then, (rx) a 1 (ry) b 1 e (x+y)r r 2 dxdy Γ(a)Γ(b) D
3 Independent Random Variables II: Examples 3.1 Some functions of independent r.v. s. Let X 1, X 2,... be independent r.v. s with the known distributions. Then, one can compute the distribution of a r.v.
More informationSome Examples. Uniform motion. Poisson processes on the real line
Some Examples Our immeiate goal is to see some examples of Lévy processes, an/or infinitely-ivisible laws on. Uniform motion Choose an fix a nonranom an efine X := for all (1) Then, {X } is a [nonranom]
More informationVerona Course April Lecture 1. Review of probability
Verona Course April 215. Lecture 1. Review of probability Viorel Barbu Al.I. Cuza University of Iaşi and the Romanian Academy A probability space is a triple (Ω, F, P) where Ω is an abstract set, F is
More information1. Stochastic Processes and filtrations
1. Stochastic Processes and 1. Stoch. pr., A stochastic process (X t ) t T is a collection of random variables on (Ω, F) with values in a measurable space (S, S), i.e., for all t, In our case X t : Ω S
More informationExercises in stochastic analysis
Exercises in stochastic analysis Franco Flandoli, Mario Maurelli, Dario Trevisan The exercises with a P are those which have been done totally or partially) in the previous lectures; the exercises with
More informationTutorial on Maximum Likelyhood Estimation: Parametric Density Estimation
Tutorial on Maximum Likelyhoo Estimation: Parametric Density Estimation Suhir B Kylasa 03/13/2014 1 Motivation Suppose one wishes to etermine just how biase an unfair coin is. Call the probability of tossing
More informationBrownian Motion. 1 Definition Brownian Motion Wiener measure... 3
Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................
More informationLeast Distortion of Fixed-Rate Vector Quantizers. High-Resolution Analysis of. Best Inertial Profile. Zador's Formula Z-1 Z-2
High-Resolution Analysis of Least Distortion of Fixe-Rate Vector Quantizers Begin with Bennett's Integral D 1 M 2/k Fin best inertial profile Zaor's Formula m(x) λ 2/k (x) f X(x) x Fin best point ensity
More informationAnalysis IV, Assignment 4
Analysis IV, Assignment 4 Prof. John Toth Winter 23 Exercise Let f C () an perioic with f(x+2) f(x). Let a n f(t)e int t an (S N f)(x) N n N then f(x ) lim (S Nf)(x ). N a n e inx. If f is continuously
More informationReview of Probability Theory II
Review of Probability Theory II January 9-3, 008 Exectation If the samle sace Ω = {ω, ω,...} is countable and g is a real-valued function, then we define the exected value or the exectation of a function
More information7 Convergence in R d and in Metric Spaces
STA 711: Probability & Measure Theory Robert L. Wolpert 7 Convergence in R d and in Metric Spaces A sequence of elements a n of R d converges to a limit a if and only if, for each ǫ > 0, the sequence a
More informationI. ANALYSIS; PROBABILITY
ma414l1.tex Lecture 1. 12.1.2012 I. NLYSIS; PROBBILITY 1. Lebesgue Measure and Integral We recall Lebesgue measure (M411 Probability and Measure) λ: defined on intervals (a, b] by λ((a, b]) := b a (so
More information1 Exercises for lecture 1
1 Exercises for lecture 1 Exercise 1 a) Show that if F is symmetric with respect to µ, and E( X )
More information17. Convergence of Random Variables
7. Convergence of Random Variables In elementary mathematics courses (such as Calculus) one speaks of the convergence of functions: f n : R R, then lim f n = f if lim f n (x) = f(x) for all x in R. This
More informationTime-of-Arrival Estimation in Non-Line-Of-Sight Environments
2 Conference on Information Sciences an Systems, The Johns Hopkins University, March 2, 2 Time-of-Arrival Estimation in Non-Line-Of-Sight Environments Sinan Gezici, Hisashi Kobayashi an H. Vincent Poor
More informationStochastic Processes. Winter Term Paolo Di Tella Technische Universität Dresden Institut für Stochastik
Stochastic Processes Winter Term 2016-2017 Paolo Di Tella Technische Universität Dresden Institut für Stochastik Contents 1 Preliminaries 5 1.1 Uniform integrability.............................. 5 1.2
More informationJUST THE MATHS UNIT NUMBER DIFFERENTIATION 2 (Rates of change) A.J.Hobson
JUST THE MATHS UNIT NUMBER 10.2 DIFFERENTIATION 2 (Rates of change) by A.J.Hobson 10.2.1 Introuction 10.2.2 Average rates of change 10.2.3 Instantaneous rates of change 10.2.4 Derivatives 10.2.5 Exercises
More informationA PAC-Bayesian Approach to Spectrally-Normalized Margin Bounds for Neural Networks
A PAC-Bayesian Approach to Spectrally-Normalize Margin Bouns for Neural Networks Behnam Neyshabur, Srinah Bhojanapalli, Davi McAllester, Nathan Srebro Toyota Technological Institute at Chicago {bneyshabur,
More informationProbability and Measure
Probability and Measure Robert L. Wolpert Institute of Statistics and Decision Sciences Duke University, Durham, NC, USA Convergence of Random Variables 1. Convergence Concepts 1.1. Convergence of Real
More informationBrownian Motion and Stochastic Calculus
ETHZ, Spring 17 D-MATH Prof Dr Martin Larsson Coordinator A Sepúlveda Brownian Motion and Stochastic Calculus Exercise sheet 6 Please hand in your solutions during exercise class or in your assistant s
More informationTheorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension. n=1
Chapter 2 Probability measures 1. Existence Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension to the generated σ-field Proof of Theorem 2.1. Let F 0 be
More information3 (Due ). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?
MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due ). Show that the open disk x 2 + y 2 < 1 is a countable union of planar elementary sets. Show that the closed disk x 2 + y 2 1 is a countable
More informationLower Bounds for the Smoothed Number of Pareto optimal Solutions
Lower Bouns for the Smoothe Number of Pareto optimal Solutions Tobias Brunsch an Heiko Röglin Department of Computer Science, University of Bonn, Germany brunsch@cs.uni-bonn.e, heiko@roeglin.org Abstract.
More informationFundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales
Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales Prakash Balachandran Department of Mathematics Duke University April 2, 2008 1 Review of Discrete-Time
More information3 hours UNIVERSITY OF MANCHESTER. 22nd May and. Electronic calculators may be used, provided that they cannot store text.
3 hours MATH40512 UNIVERSITY OF MANCHESTER DYNAMICAL SYSTEMS AND ERGODIC THEORY 22nd May 2007 9.45 12.45 Answer ALL four questions in SECTION A (40 marks in total) and THREE of the four questions in SECTION
More informationFunction Spaces. 1 Hilbert Spaces
Function Spaces A function space is a set of functions F that has some structure. Often a nonparametric regression function or classifier is chosen to lie in some function space, where the assume structure
More informationAgmon Kolmogorov Inequalities on l 2 (Z d )
Journal of Mathematics Research; Vol. 6, No. ; 04 ISSN 96-9795 E-ISSN 96-9809 Publishe by Canaian Center of Science an Eucation Agmon Kolmogorov Inequalities on l (Z ) Arman Sahovic Mathematics Department,
More informationGood luck! (W (t(j + 1)) W (tj)), n 1.
Av Matematisk statistik TENTAMEN I SF940 SANNOLIKHETSTEORI/EXAM IN SF940 PROBABILITY THE- ORY, WEDNESDAY OCTOBER 5, 07, 0800-300 Examinator : Boualem Djehiche, tel 08-7907875, email: boualem@kthse Tillåtna
More informationThe main results about probability measures are the following two facts:
Chapter 2 Probability measures The main results about probability measures are the following two facts: Theorem 2.1 (extension). If P is a (continuous) probability measure on a field F 0 then it has a
More informationProbability and Measure
Chapter 4 Probability and Measure 4.1 Introduction In this chapter we will examine probability theory from the measure theoretic perspective. The realisation that measure theory is the foundation of probability
More informationµ X (A) = P ( X 1 (A) )
1 STOCHASTIC PROCESSES This appendix provides a very basic introduction to the language of probability theory and stochastic processes. We assume the reader is familiar with the general measure and integration
More informationInterconnected Systems of Fliess Operators
Interconnecte Systems of Fliess Operators W. Steven Gray Yaqin Li Department of Electrical an Computer Engineering Ol Dominion University Norfolk, Virginia 23529 USA Abstract Given two analytic nonlinear
More informationAn Introduction to Stochastic Processes in Continuous Time
An Introduction to Stochastic Processes in Continuous Time Flora Spieksma adaptation of the text by Harry van Zanten to be used at your own expense May 22, 212 Contents 1 Stochastic Processes 1 1.1 Introduction......................................
More information19 Eigenvalues, Eigenvectors, Ordinary Differential Equations, and Control
19 Eigenvalues, Eigenvectors, Orinary Differential Equations, an Control This section introuces eigenvalues an eigenvectors of a matrix, an iscusses the role of the eigenvalues in etermining the behavior
More informationMATH 6605: SUMMARY LECTURE NOTES
MATH 6605: SUMMARY LECTURE NOTES These notes summarize the lectures on weak convergence of stochastic processes. If you see any typos, please let me know. 1. Construction of Stochastic rocesses A stochastic
More informationMath 115 Section 018 Course Note
Course Note 1 General Functions Definition 1.1. A function is a rule that takes certain numbers as inputs an assigns to each a efinite output number. The set of all input numbers is calle the omain of
More informationLecture 22: Variance and Covariance
EE5110 : Probability Foundations for Electrical Engineers July-November 2015 Lecture 22: Variance and Covariance Lecturer: Dr. Krishna Jagannathan Scribes: R.Ravi Kiran In this lecture we will introduce
More informationPROBABILITY THEORY II
Ruprecht-Karls-Universität Heidelberg Institut für Angewandte Mathematik Prof. Dr. Jan JOHANNES Outline of the lecture course PROBABILITY THEORY II Summer semester 2016 Preliminary version: April 21, 2016
More informationELEMENTS OF PROBABILITY THEORY
ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable
More informationExercises. T 2T. e ita φ(t)dt.
Exercises. Set #. Construct an example of a sequence of probability measures P n on R which converge weakly to a probability measure P but so that the first moments m,n = xdp n do not converge to m = xdp.
More informationthe convolution of f and g) given by
09:53 /5/2000 TOPIC Characteristic functions, cont d This lecture develops an inversion formula for recovering the density of a smooth random variable X from its characteristic function, and uses that
More informationThe derivative of a function f(x) is another function, defined in terms of a limiting expression: f(x + δx) f(x)
Y. D. Chong (2016) MH2801: Complex Methos for the Sciences 1. Derivatives The erivative of a function f(x) is another function, efine in terms of a limiting expression: f (x) f (x) lim x δx 0 f(x + δx)
More informationLower bounds on Locality Sensitive Hashing
Lower bouns on Locality Sensitive Hashing Rajeev Motwani Assaf Naor Rina Panigrahy Abstract Given a metric space (X, X ), c 1, r > 0, an p, q [0, 1], a istribution over mappings H : X N is calle a (r,
More informationLecture 10: October 30, 2017
Information an Coing Theory Autumn 2017 Lecturer: Mahur Tulsiani Lecture 10: October 30, 2017 1 I-Projections an applications In this lecture, we will talk more about fining the istribution in a set Π
More informationCHAPTER 1 : DIFFERENTIABLE MANIFOLDS. 1.1 The definition of a differentiable manifold
CHAPTER 1 : DIFFERENTIABLE MANIFOLDS 1.1 The efinition of a ifferentiable manifol Let M be a topological space. This means that we have a family Ω of open sets efine on M. These satisfy (1), M Ω (2) the
More informationEstimation of arrival and service rates for M/M/c queue system
Estimation of arrival and service rates for M/M/c queue system Katarína Starinská starinskak@gmail.com Charles University Faculty of Mathematics and Physics Department of Probability and Mathematical Statistics
More informationProduct measure and Fubini s theorem
Chapter 7 Product measure and Fubini s theorem This is based on [Billingsley, Section 18]. 1. Product spaces Suppose (Ω 1, F 1 ) and (Ω 2, F 2 ) are two probability spaces. In a product space Ω = Ω 1 Ω
More informationWeak convergence and Brownian Motion. (telegram style notes) P.J.C. Spreij
Weak convergence and Brownian Motion (telegram style notes) P.J.C. Spreij this version: December 8, 2006 1 The space C[0, ) In this section we summarize some facts concerning the space C[0, ) of real
More informationLECTURE NOTES ON DVORETZKY S THEOREM
LECTURE NOTES ON DVORETZKY S THEOREM STEVEN HEILMAN Abstract. We present the first half of the paper [S]. In particular, the results below, unless otherwise state, shoul be attribute to G. Schechtman.
More informationOn permutation-invariance of limit theorems
On permutation-invariance of limit theorems I. Beres an R. Tichy Abstract By a classical principle of probability theory, sufficiently thin subsequences of general sequences of ranom variables behave lie
More informationPROBLEMS. (b) (Polarization Identity) Show that in any inner product space
1 Professor Carl Cowen Math 54600 Fall 09 PROBLEMS 1. (Geometry in Inner Product Spaces) (a) (Parallelogram Law) Show that in any inner product space x + y 2 + x y 2 = 2( x 2 + y 2 ). (b) (Polarization
More informationPart IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Theorems Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationFE 5204 Stochastic Differential Equations
Instructor: Jim Zhu e-mail:zhu@wmich.edu http://homepages.wmich.edu/ zhu/ January 20, 2009 Preliminaries for dealing with continuous random processes. Brownian motions. Our main reference for this lecture
More informationLecture 12. F o s, (1.1) F t := s>t
Lecture 12 1 Brownian motion: the Markov property Let C := C(0, ), R) be the space of continuous functions mapping from 0, ) to R, in which a Brownian motion (B t ) t 0 almost surely takes its value. Let
More informationPart III Advanced Probability
Part III Advanced Probability Based on lectures by M. Lis Notes taken by Dexter Chua Michaelmas 2017 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after
More informationStochastic integration. P.J.C. Spreij
Stochastic integration P.J.C. Spreij this version: April 22, 29 Contents 1 Stochastic processes 1 1.1 General theory............................... 1 1.2 Stopping times...............................
More informationThe planar Chain Rule and the Differential Equation for the planar Logarithms
arxiv:math/0502377v1 [math.ra] 17 Feb 2005 The planar Chain Rule an the Differential Equation for the planar Logarithms L. Gerritzen (29.09.2004) Abstract A planar monomial is by efinition an isomorphism
More informationMarkovian N-Server Queues (Birth & Death Models)
Markovian -Server Queues (Birth & Death Moels) - Busy Perio Arrivals Poisson (λ) ; Services exp(µ) (E(S) = /µ) Servers statistically ientical, serving FCFS. Offere loa R = λ E(S) = λ/µ Erlangs Q(t) = number
More informationNotes 1 : Measure-theoretic foundations I
Notes 1 : Measure-theoretic foundations I Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Wil91, Section 1.0-1.8, 2.1-2.3, 3.1-3.11], [Fel68, Sections 7.2, 8.1, 9.6], [Dur10,
More informationLeast-Squares Regression on Sparse Spaces
Least-Squares Regression on Sparse Spaces Yuri Grinberg, Mahi Milani Far, Joelle Pineau School of Computer Science McGill University Montreal, Canaa {ygrinb,mmilan1,jpineau}@cs.mcgill.ca 1 Introuction
More information2 (Bonus). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?
MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due 9/5). Prove that every countable set A is measurable and µ(a) = 0. 2 (Bonus). Let A consist of points (x, y) such that either x or y is
More information