3 Independent Random Variables II: Examples

3.1 Some functions of independent r.v.'s

Let $X_1, X_2, \ldots$ be independent r.v.'s with known distributions. Then, one can compute the distribution of a r.v. of the form $f(X_1, X_2, \ldots)$. Let us look at some examples.

Example (Relation between gamma and beta distributions). Let $X$ and $Y$ be real r.v.'s with $P((X, Y) \in \cdot) = \gamma_{r,a} \otimes \gamma_{r,b}$ (cf. (1.22)). Then,
\[
P\Big( \big(X+Y, \tfrac{X}{X+Y}\big) \in \cdot \Big) = \gamma_{r,a+b} \otimes \beta_{a,b}. \tag{3.1}
\]
In particular, $P(X+Y \in \cdot) = \gamma_{r,a+b}$ and $P\big(\tfrac{X}{X+Y} \in \cdot\big) = \beta_{a,b}$.

Let us prove (3.1). The well-known formula
\[
B(a,b) = \frac{\Gamma(a)\Gamma(b)}{\Gamma(a+b)} \tag{3.2}
\]
will also be reproduced in the course of the proof. We first note the following simple equality for an interval $J \subset (0,1)$:
\[
(1)\qquad \frac{1}{z^{a+b-1} B(a,b)} \int_{zJ} x^{a-1} (z-x)^{b-1}\,dx = \beta_{a,b}(J), \quad \text{where } zJ = \{zx \,;\, x \in J\}.
\]
(3.1) is equivalent to
\[
(2)\qquad P\Big( \big(X+Y, \tfrac{X}{X+Y}\big) \in I \times J \Big) = \gamma_{r,a+b}(I)\,\beta_{a,b}(J)
\]
for all intervals $I \subset (0,\infty)$ and $J \subset (0,1)$. We first show that
\[
(3)\qquad \text{LHS of (2)} = \frac{B(a,b)\Gamma(a+b)}{\Gamma(a)\Gamma(b)}\,\gamma_{r,a+b}(I)\,\beta_{a,b}(J).
\]
Let us write $D = \{(x,y) \in (0,\infty)^2 \,;\, (x+y, \tfrac{x}{x+y}) \in I \times J\}$. Then,
\[
\text{LHS of (2)} = \gamma_{r,a} \otimes \gamma_{r,b}(D)
= \frac{1}{\Gamma(a)\Gamma(b)} \iint_D (rx)^{a-1} (ry)^{b-1} e^{-(x+y)r} r^2 \,dx\,dy
\]
\[
\stackrel{z = x+y}{=} \frac{1}{\Gamma(a)\Gamma(b)} \int_I r^{a+b} e^{-zr}\,dz \int_{zJ} x^{a-1} (z-x)^{b-1}\,dx
\stackrel{(1)}{=} \frac{B(a,b)}{\Gamma(a)\Gamma(b)} \int_I (rz)^{a+b-1} e^{-zr}\, r\,dz \;\beta_{a,b}(J)
\]
\[
\stackrel{(1.22)}{=} \frac{B(a,b)\Gamma(a+b)}{\Gamma(a)\Gamma(b)}\,\gamma_{r,a+b}(I)\,\beta_{a,b}(J) = \text{RHS of (3)}.
\]
Letting $I = (0,\infty)$ and $J = (0,1)$ in (3), we get $1 = \frac{B(a,b)\Gamma(a+b)}{\Gamma(a)\Gamma(b)}$, i.e., (3.2). Finally, plugging this back into (3), we arrive at (2).
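The identities (3.1) and (3.2) lend themselves to a quick numerical sanity check. The following sketch (the parameter values $a$, $b$, $r$ are arbitrary choices) compares $B(a,b)$ with $\Gamma(a)\Gamma(b)/\Gamma(a+b)$ and checks the means of $X+Y$ and $X/(X+Y)$ by Monte Carlo:

```python
import math
import random

# Numerical sanity check of (3.1)-(3.2); a sketch, with a, b, r chosen arbitrarily.
a, b, r = 2.0, 3.0, 1.5

# (3.2): B(a, b) = Gamma(a) * Gamma(b) / Gamma(a + b)
beta_ab = math.gamma(a) * math.gamma(b) / math.gamma(a + b)

# Monte Carlo check of (3.1): with X ~ gamma_{r,a} and Y ~ gamma_{r,b} independent,
# X + Y should be gamma_{r,a+b} (mean (a+b)/r) and X/(X+Y) should be
# beta_{a,b} (mean a/(a+b)).
rng = random.Random(0)
n = 100_000
sum_s = sum_t = 0.0
for _ in range(n):
    x = rng.gammavariate(a, 1.0 / r)  # gammavariate(shape, scale); scale = 1/rate
    y = rng.gammavariate(b, 1.0 / r)
    sum_s += x + y
    sum_t += x / (x + y)
mean_s, mean_t = sum_s / n, sum_t / n
print(beta_ab, mean_s, mean_t)
```

With these parameters $B(2,3) = 1/12$, and the two sample means should sit close to $(a+b)/r$ and $a/(a+b)$ respectively.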
Exercise. Let $X, Y$ and $Z$ be r.v.'s with $(X, Y) \stackrel{\text{law}}{=} \gamma_{r,a} \otimes \gamma_{s,b}$ and $Z \stackrel{\text{law}}{=} \beta_{a,b}$. Prove then that
\[
P(X/Y \in A) = P\Big( \frac{s}{r}\,\frac{Z}{1-Z} \in A \Big) = \frac{(r/s)^a}{B(a,b)} \int_A \frac{x^{a-1}}{(1 + rx/s)^{a+b}}\,dx, \quad A \in \mathcal{B}((0,\infty)).
\]
When $r = a = m/2$ and $s = b = n/2$ ($m, n \in \mathbb{N}$), the above distribution is called the $F^m_n$ distribution and is used in statistics. Hint: Let $P((X_1, Y_1) \in \cdot) = \gamma_{1,a} \otimes \gamma_{1,b}$. Then, $P((X, Y) \in \cdot) = P((X_1/r, Y_1/s) \in \cdot)$ and $\frac{X_1}{Y_1} = \frac{X_1/(X_1+Y_1)}{1 - X_1/(X_1+Y_1)}$. Then use (3.1).

Exercise. Prove the following extension of the previous exercise. Let $X_j > 0$, $j = 1, \ldots, n+1$, be independent r.v.'s with $P(X_j \in \cdot) = \gamma_{r,a_j}$ and $S \stackrel{\text{def}}{=} X_1 + \cdots + X_{n+1}$. Then, $S$ and $T \stackrel{\text{def}}{=} (X_j/S)_{j=1}^n$ are independent r.v.'s such that $P(S \in \cdot) = \gamma_{r, a_1 + \cdots + a_{n+1}}$ and
\[
P(T \in B) = \frac{\Gamma(a_1 + \cdots + a_{n+1})}{\Gamma(a_1) \cdots \Gamma(a_{n+1})} \int_B x_1^{a_1 - 1} \cdots x_n^{a_n - 1} \Big( 1 - \sum_{j=1}^n x_j \Big)^{a_{n+1} - 1} dx_1 \cdots dx_n
\]
for any Borel set $B \subset \{x \in (0,1)^n \,;\, \sum_{j=1}^n x_j < 1\}$.

Exercise. Let $e$ and $U$ be independent r.v.'s such that $P(e \in \cdot) = \gamma_{1,1}$ and $U$ is uniformly distributed on $(0, 2\pi)$. Prove then that $\sqrt{2e}\,(\cos U, \sin U)$ has the standard normal distribution on $\mathbb{R}^2$.

Exercise (⋆). Let $S_n = X_1^2 + \cdots + X_n^2$, where $(X_j)_{j \geq 1}$ are real i.i.d. with $P(X_j \in \cdot) = \nu_v$ ($v > 0$). Prove then that for $m, n = 1, 2, \ldots$,
\[
P\Big( \Big(S_{m+n}, \frac{S_m}{S_{m+n}}\Big) \in \cdot \Big) = \gamma_{1/(2v),\,(m+n)/2} \otimes \beta_{m/2,\,n/2},
\]
\[
P\Big( \Big(S_n, \frac{(S_{m+n} - S_n)/m}{S_n/n}\Big) \in \cdot \Big) = \gamma_{1/(2v),\,n/2} \otimes F^m_n
\]
(cf. the first exercise of this subsection). Hint: Exercise 1.3.7, together with the example and exercises above.

Example (Poisson process). Let $\tau_1, \tau_2, \ldots$ be i.i.d. with $P(\tau_j \in \cdot) = \gamma_{r,1}$ (cf. (1.22)) and $T_n = \tau_1 + \cdots + \tau_n$. For $t \geq 0$, we define
\[
N_t = \sup\{n \in \mathbb{N} \,;\, T_n \leq t\}, \tag{3.3}
\]
where $\sup \emptyset \stackrel{\text{def}}{=} 0$. Then, $P(N_t \in \cdot) = \pi_{rt}$ (cf. (1.19)). This can be seen as follows. It is enough to prove that
\[
(1)\qquad P(N_t \geq n) = e^{-rt} \sum_{m \geq n} \frac{(rt)^m}{m!}.
\]
Since this is obvious for $n = 0$, we assume that $n \geq 1$. Then,
\[
P(N_t \geq n) \stackrel{(3.3)}{=} P(T_n \leq t) \stackrel{(3.1)}{=} \gamma_{r,n}((0,t]) \stackrel{(1.22)}{=} \frac{r^n}{(n-1)!} \int_0^t x^{n-1} e^{-xr}\,dx \stackrel{y = xr}{=} \frac{1}{(n-1)!} \int_0^{rt} y^{n-1} e^{-y}\,dy.
\]
We will conclude (1) by showing that:
\[
(2)\qquad \frac{1}{(n-1)!} \int_0^s y^{n-1} e^{-y}\,dy = e^{-s} \sum_{m \geq n} \frac{s^m}{m!}, \quad s \geq 0.
\]
Let $f(s)$ and $g(s)$ be the LHS and RHS of (2) respectively. Then, $f(0) = g(0) = 0$, since $n \geq 1$. Moreover,
\[
g'(s) = -e^{-s} \sum_{m \geq n} \frac{s^m}{m!} + e^{-s} \sum_{m \geq n} \frac{s^{m-1}}{(m-1)!} = e^{-s}\,\frac{s^{n-1}}{(n-1)!} = f'(s),
\]
and hence $f \equiv g$. $(N_t)_{t \geq 0}$ is called the Poisson process with the parameter $r$. $N_t$ has, for example, the following interpretation: $T_n$ is the time when the $n$-th customer arrives at the COOP cafeteria in a day and $N_t$ is the number of customers who visited the cafeteria up to time $t$.

Exercise. Let $\{X_i\}_{i=1}^n$ be r.v.'s with $P((X_i)_{i=1}^n \in \cdot) = \otimes_{i=1}^n \gamma_{r_i,1}$ (cf. (1.22)) and $M_n = \min_{i=1,\ldots,n} X_i$. Prove then that for any $j = 1, \ldots, n$ and $x \geq 0$,
\[
P(M_n = X_j \text{ and } X_j > x) = \frac{r_j}{\sum_{i=1}^n r_i} \exp\Big( -x \sum_{i=1}^n r_i \Big).
\]
In particular, $P(M_n \in \cdot) = \gamma_{r_1 + \cdots + r_n,\,1}$.

Exercise (Thinning of a Poisson r.v.). Let $N$ be a r.v. with $P(N \in \cdot) = \pi_r$ and let $(X_n)_{n \geq 1}$ be i.i.d. with values in a finite set $S$. We suppose that $N$ and $(X_n)_{n \geq 1}$ are independent. Prove then that $N_s = \sum_{j=1}^N \mathbf{1}\{X_j = s\}$ ($s \in S$) are independent and that $P(N_s \in \cdot) = \pi_{p(s)r}$, where $p(s) = P(X_1 = s)$.

Exercise (geometric distribution). Let $G = \inf\{n \geq 1 \,;\, X_n = 1\}$, where $(X_n)_{n \geq 1}$ are $\{0,1\}$-valued i.i.d. with $P(X_n = 1) = p$. Then, show that $P(G = n) = p(1-p)^{n-1}$, $E[G] = 1/p$, and $\mathrm{var}(G) = (1-p)/p^2$. The distribution of $G$ is called the $p$-geometric distribution. The geometric distribution can be thought of as a discrete analogue of the exponential distribution.

Exercise. Let $G, \tau_1, \tau_2, \ldots$ be independent r.v.'s such that $P(G = n) = p(1-p)^{n-1}$ ($n = 1, 2, \ldots$) and $P(\tau_j \in \cdot) = \gamma_{r,1}$ (cf. (1.22)). Prove then that $P(\tau_1 + \cdots + \tau_G \in \cdot) = \gamma_{pr,1}$.

Exercise (binomial distribution). Let $S_n = X_1 + \cdots + X_n$, where $(X_n)_{n \geq 1}$ are $\{0,1\}$-valued i.i.d. with $P(X_n = 1) = p$. Prove then that $E S_n = np$, $\mathrm{var}\,S_n = np(1-p) \leq n/4$, and
\[
P(S_n = r) = \binom{n}{r} p^r (1-p)^{n-r} \quad \text{for } r = 0, 1, \ldots, n.
\]
The distribution of $S_n$ is called the $p$-binomial distribution. Hint: $\mathrm{var}\,S_n = \mathrm{var}\,X_1 + \cdots + \mathrm{var}\,X_n$.
Exercise. Let $X = (X_j)_{j=1}^n$ and $S_n = X_1 + \cdots + X_n$, where $X_1, \ldots, X_n$ are $\{0,1\}$-valued i.i.d. with $P(X_j = 1) = p \in (0,1)$. Prove the following:

i) $P(X = x \mid S_n = m) = 1/\binom{n}{m}$, regardless of the value of $p$, for any $m = 0, 1, \ldots, n$ and $x = (x_j)_{j=1}^n \in \{0,1\}^n$ with $x_1 + \cdots + x_n = m$.

ii) $\dfrac{d}{dp} E f(X) = \dfrac{\mathrm{cov}(f(X), S_n)}{p(1-p)}$ for any $f : \{0,1\}^n \to \mathbb{R}$.

Exercise (⋆) (Relation between geometric and binomial distributions). Let $G_1, G_2, \ldots$ be i.i.d. such that $P(G_1 = n) = p(1-p)^{n-1}$ ($p$-geometric distribution) and let $S_n = \sup\{r \geq 0 \,;\, G_1 + \cdots + G_r \leq n\}$ for $n \in \mathbb{N}$. Prove then that $X_n = S_n - S_{n-1}$, $n \in \mathbb{N}$, are $\{0,1\}$-valued i.i.d. such that $P(X_n = 1) = p$. In particular, $S_n$ has the $p$-binomial distribution (cf. the previous exercise). The r.v.'s $(S_n)_{n \geq 1}$ above can be thought of as a discrete-time analogue of the Poisson process (cf. the example above).

Example (⋆). Let $d \in \mathbb{N}$, $\delta, r > 0$, and $\lambda > -d$. We consider an $\mathbb{R}^d$-valued r.v. $X$ with
\[
X \stackrel{\text{law}}{=} \frac{\delta r^a}{A_d\,\Gamma(a)}\, |x|^\lambda \exp(-r|x|^\delta)\,dx,
\quad \text{where } a = \frac{\lambda + d}{\delta} \text{ and } A_d = \frac{2\pi^{d/2}}{\Gamma(d/2)}
\]
(the area of the unit sphere in $\mathbb{R}^d$, cf. the example above). Let $Y \stackrel{\text{law}}{=} \gamma_{s,b}$ be a r.v. independent of $X$, cf. (1.22). We will show that
\[
Y^{-1/\delta} X \stackrel{\text{law}}{=} \frac{\delta r^a s^b}{A_d\,B(a,b)}\, \frac{|x|^\lambda}{(s + r|x|^\delta)^{a+b}}\,dx. \tag{3.4}
\]
There are two important special cases:

1) $(\delta, r, \lambda) = (2, \frac{1}{2c^2}, 0)$ and $(s, b) = (1/2, 1/2)$: In this case, $X \stackrel{\text{law}}{=} N(0, c^2 I_d)$. On the other hand, we see from (1.23) that $Y^{1/2} \stackrel{\text{law}}{=} |Z|$, where $Z \stackrel{\text{law}}{=} N(0,1)$. Moreover, it is easy to see that the right-hand side of (3.4) is the $c$-Cauchy distribution. Therefore, (3.4) says that if r.v.'s $X \stackrel{\text{law}}{=} N(0, c^2 I_d)$ and $Z \stackrel{\text{law}}{=} N(0,1)$ are independent, then
\[
X/|Z| \stackrel{\text{law}}{=} \text{$c$-Cauchy distribution}. \tag{3.5}
\]

2) $(d, \delta, r, \lambda) = (1, 2, 1/2, 0)$ and $(s, b) = (n/2, n/2)$ with $n \in \mathbb{N}$: In this case, the distribution given by (3.4) is called the $T_n$-distribution and is used in statistics.

The proof of (3.4) goes as follows:
\[
P(Y^{-1/\delta} X \in B) = \int_0^\infty P(Y \in dy) \int_{\mathbb{R}^d} P(X \in dx)\, \mathbf{1}_B(y^{-1/\delta} x)
\]
\[
= \frac{\delta r^a s^b}{A_d\,\Gamma(a)\Gamma(b)} \int_0^\infty y^{b-1} e^{-sy}\,dy \int_{\mathbb{R}^d} \mathbf{1}_B(y^{-1/\delta} x)\, |x|^\lambda e^{-r|x|^\delta}\,dx
\]
\[
\stackrel{z = y^{-1/\delta} x}{=} \frac{\delta r^a s^b}{A_d\,\Gamma(a)\Gamma(b)} \int_0^\infty y^{a+b-1} e^{-sy}\,dy \int_B |z|^\lambda e^{-ry|z|^\delta}\,dz
= \frac{\delta r^a s^b}{A_d\,\Gamma(a)\Gamma(b)} \int_B |z|^\lambda\,dz \int_0^\infty y^{a+b-1} e^{-(s + r|z|^\delta) y}\,dy.
\]
We easily see from the definition of the Gamma function that
\[
\int_0^\infty y^{a+b-1} e^{-(s + r|z|^\delta) y}\,dy = \frac{\Gamma(a+b)}{(s + r|z|^\delta)^{a+b}}.
\]
Thus, together with (3.2), we conclude that
\[
P(Y^{-1/\delta} X \in B) = \frac{\delta r^a s^b}{A_d\,B(a,b)} \int_B \frac{|z|^\lambda}{(s + r|z|^\delta)^{a+b}}\,dz.
\]

3.2 A proof of the Weierstrass approximation theorem

Example (Weierstrass approximation theorem). Let $I = [0,1]$ and $f \in C(I \to \mathbb{R})$. Then, there exist polynomials $f_n : \mathbb{R} \to \mathbb{R}$ ($n \geq 1$) such that
\[
(1)\qquad \lim_{n \to \infty} \max_{\theta \in I} |f_n(\theta) - f(\theta)| = 0.
\]
To prove this, we fix $\theta \in I$ and $n \in \mathbb{N}$ for a moment and let $S_n$ be a r.v. such that
\[
P(S_n = r) = \binom{n}{r} \theta^r (1-\theta)^{n-r} \quad \text{for } r = 0, \ldots, n.
\]
Then,
\[
f_n(\theta) \stackrel{\text{def}}{=} E f(n^{-1} S_n) = \sum_{r=0}^n f(n^{-1} r)\, P(S_n = r)
\]
is a polynomial in $\theta$. On the other hand, we see from the binomial exercise above that
\[
(2)\qquad \mathrm{var}\,S_n \leq n/4.
\]
The key to proving (1) is the following estimate (a special case of the weak law of large numbers):
\[
(3)\qquad P(|n^{-1} S_n - \theta| \geq \varepsilon) \leq \frac{1}{4\varepsilon^2 n} \quad \text{for any } \varepsilon > 0.
\]
In fact, using Chebyshev's inequality and (2),
\[
P(|n^{-1} S_n - \theta| \geq \varepsilon) \stackrel{\text{Chebyshev}}{\leq} \varepsilon^{-2} E\big[ |n^{-1} S_n - \theta|^2 \big] = \varepsilon^{-2} n^{-2}\,\mathrm{var}\,S_n \stackrel{(2)}{\leq} \frac{1}{4\varepsilon^2 n}.
\]
We now conclude (1) from (3) as follows:
\[
|f_n(\theta) - f(\theta)| \leq E\,|f(n^{-1} S_n) - f(\theta)|
\]
\[
= E\big[ |f(n^{-1} S_n) - f(\theta)|\,\mathbf{1}\{|n^{-1} S_n - \theta| \geq n^{-1/3}\} \big]
+ E\big[ |f(n^{-1} S_n) - f(\theta)|\,\mathbf{1}\{|n^{-1} S_n - \theta| < n^{-1/3}\} \big]
\]
\[
\stackrel{(3)}{\leq} 2 \sup_{\theta \in I} |f(\theta)| \cdot \frac{1}{4 n^{1/3}}
+ \sup_{\substack{\theta, \theta' \in I \\ |\theta - \theta'| < n^{-1/3}}} |f(\theta) - f(\theta')|
\longrightarrow 0 \quad \text{as } n \to \infty,
\]
uniformly in $\theta$, where in the last line we have used the uniform continuity of $f$.
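The polynomials $f_n$ above are the classical Bernstein polynomials, and the convergence in (1) can be observed directly. A minimal sketch (the test function $\sin(2\pi x)$ is an arbitrary choice, not from the notes), showing the maximal error shrinking as $n$ grows:

```python
import math

# Bernstein polynomial f_n(theta) = E f(S_n / n), S_n ~ Binomial(n, theta).
# A sketch; the test function below is an arbitrary illustrative choice.
def bernstein(f, n, theta):
    return sum(
        f(r / n) * math.comb(n, r) * theta ** r * (1 - theta) ** (n - r)
        for r in range(n + 1)
    )

f = lambda x: math.sin(2 * math.pi * x)
grid = [k / 200 for k in range(201)]

def max_err(n):
    # Approximation of max_{theta in [0,1]} |f_n(theta) - f(theta)| on a grid.
    return max(abs(bernstein(f, n, t) - f(t)) for t in grid)

print(max_err(10), max_err(100))  # the error should decrease with n
```

For a twice-differentiable $f$ the error at each point decays like $f''(\theta)\,\theta(1-\theta)/(2n)$, consistent with the $1/(4\varepsilon^2 n)$ rate driving the proof.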
Exercise (Weierstrass approximation theorem in higher dimensions). Let $I = [0,1]^d$ and $f \in C(I \to \mathbb{R})$. Prove that there exist polynomials $f_n : \mathbb{R}^d \to \mathbb{R}$ ($n \geq 1$) such that $\lim_{n \to \infty} \max_{\theta \in I} |f_n(\theta) - f(\theta)| = 0$. Hint: Fix $\theta = (\theta^\nu)_{\nu=1}^d \in I$ and $n \in \mathbb{N}$ for a moment. Let $S_n = (S_n^\nu)_{\nu=1}^d$, where $S_n^1, \ldots, S_n^d$ are independent r.v.'s with
\[
P(S_n^\nu = r) = \binom{n}{r} (\theta^\nu)^r (1 - \theta^\nu)^{n-r}, \quad 1 \leq \nu \leq d.
\]
Then, $P(S_n = x) = \prod_{\nu=1}^d \binom{n}{x^\nu} (\theta^\nu)^{x^\nu} (1 - \theta^\nu)^{n - x^\nu}$.

Exercise. (i) Let $f \in C_b([0,\infty))$ and
\[
f_n(x) = e^{-nx} \sum_{k \geq 0} \frac{(nx)^k}{k!}\, f\Big(\frac{k}{n}\Big), \quad n \in \mathbb{N},\ x \geq 0.
\]
Prove then that $\lim_{n \to \infty} f_n(x) = f(x)$ for all $x \geq 0$. Hint: We may assume $x > 0$, since $f_n(0) = f(0)$. Let $S_n$ be a r.v. with $P(S_n \in \cdot) = \pi_{nx}$ (cf. (1.19)). Then, $f_n(x) = E[f(\frac{S_n}{n})]$.

(ii) (Injectivity of the Laplace transform) Let $\mu_1, \mu_2 \in \mathcal{P}([0,\infty))$ be such that
\[
\int_{[0,\infty)} e^{-sx}\,d\mu_1(x) = \int_{[0,\infty)} e^{-sx}\,d\mu_2(x) \quad \text{for all } s \geq 0.
\]
Use (i) to show that $\mu_1 = \mu_2$. Hint: Show that $\int_{[0,\infty)} f_n\,d\mu_1 = \int_{[0,\infty)} f_n\,d\mu_2$ for any $f \in C_b([0,\infty))$.

Exercise (⋆). Show the following:

(i) For any $n \in \mathbb{N}$ and $z \in \mathbb{C} \setminus \{0\}$,
\[
Q_n(z) \stackrel{\text{def}}{=} \frac{1}{n} \cdot \frac{2 - z^n - z^{-n}}{2 - z - z^{-1}} = \frac{1}{n} \sum_{0 \leq l, m < n} z^{l-m}, \tag{3.6}
\]
where we define $Q_n(1) = n$. Hint: Let $s_n(z) = 1 + z + \cdots + z^{n-1}$. Then,
\[
\frac{2 - z^n - z^{-n}}{2 - z - z^{-1}} = \frac{(1 - z^n)(1 - z^{-n})}{(1 - z)(1 - z^{-1})} = s_n(z)\, s_n(z^{-1}).
\]

(ii) $F_n(\theta) \stackrel{\text{def}}{=} Q_n(e^{2\pi i \theta}) \geq 0$ for all $\theta \in \mathbb{R}$, and $\int_0^1 F_n(\theta)\,d\theta = 1$. These show that $F_n$ is a density of a probability measure on $[0,1]$ with respect to the Lebesgue measure. $F_n$ is called the Fejér kernel.

Exercise (⋆) (Uniform approximation by trigonometric polynomials). A function $Q : \mathbb{R} \to \mathbb{C}$ is called a trigonometric polynomial if it is a finite linear combination of $\{\theta \mapsto e^{2\pi i n \theta}\}_{n \in \mathbb{Z}}$. Let $f \in C(\mathbb{R} \to \mathbb{C})$ be of period 1 and
\[
f_n(\theta) = \int_0^1 f(\theta - \varphi)\, F_n(\varphi)\,d\varphi,
\]
where $F_n$ is the Fejér kernel (previous exercise). Prove then that $f_n$ is a trigonometric polynomial and that $\lim_{n \to \infty} \sup_{0 \leq \theta \leq 1} |f_n(\theta) - f(\theta)| = 0$. Hint: $f_n(\theta) = \int_0^1 f(\varphi)\, F_n(\theta - \varphi)\,d\varphi$ by the periodicity. Then, use (3.6) to see that $f_n$ is a trigonometric polynomial.
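The two defining properties of the Fejér kernel in (ii), nonnegativity and unit mass, can be checked numerically from the factorization $Q_n(z) = \frac{1}{n} s_n(z) s_n(z^{-1})$ in the hint. A sketch (the values of $n$ and the grid size are arbitrary choices):

```python
import cmath

# Fejer kernel F_n(theta) = Q_n(e^{2 pi i theta}); via the hint,
# Q_n(z) = (1/n) * s_n(z) * s_n(z^{-1}) = (1/n) * |s_n(z)|^2 on the unit circle,
# which makes F_n >= 0 manifest. A sketch checking this and the unit integral.
def fejer(n, theta):
    s = sum(cmath.exp(2j * cmath.pi * l * theta) for l in range(n))
    return abs(s) ** 2 / n

n, steps = 5, 2000
vals = [fejer(n, (k + 0.5) / steps) for k in range(steps)]
integral = sum(vals) / steps  # midpoint Riemann sum over [0, 1]
print(min(vals), integral)    # min is >= 0; integral is (essentially) 1
```

Since $F_n$ is a trigonometric polynomial of degree $< n$, the midpoint sum over a grid much finer than $n$ integrates it exactly up to floating-point error, which is why the check can use a tight tolerance.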
3.3 (⋆) Decimal fractions as i.i.d.

In this subsection, we consider a probability space $(\Omega, \mathcal{F}, P)$ and a r.v. $U$ with the uniform distribution on $(0,1)$, i.e., $P(U \in B) = \int_B dt$ for all $B \in \mathcal{B}((0,1))$.

Example (Decimal fractions are i.i.d.). Suppose that $q \geq 2$ is an integer. For $n \geq 1$ and $s_1, \ldots, s_n \in \{0, \ldots, q-1\}$, we define $I_{s_1 \ldots s_n} \subset [0,1)$ and $d_n : \Omega \to \{0, \ldots, q-1\}$ by
\[
I_{s_1 \ldots s_n} = \Big\{ \sum_{1 \leq k \leq n} q^{-k} s_k + x \,;\, x \in [0, q^{-n}) \Big\},
\]
\[
d_n(\omega) = s \quad \text{if } U(\omega) \in \bigcup_{s_1, \ldots, s_{n-1}} I_{s_1 \ldots s_{n-1} s}.
\]
Note that $\{I_{s_1 \ldots s_{n-1} s}\}_{s=0}^{q-1}$ are obtained by dividing $I_{s_1 \ldots s_{n-1}}$ into $q$ smaller intervals with equal length ($q^{-n}$) and that the interval $I_{s_1 \ldots s_{n-1} s}$ is the $(s+1)$-th one from the left. This means that $d_n(\omega)$ is nothing but the $n$-th digit in the $q$-adic expansion of the number $U(\omega) \in [0,1)$, and therefore that
\[
U(\omega) = \sum_{k \geq 1} q^{-k} d_k(\omega) \quad \text{for all } \omega \in \Omega. \tag{3.7}
\]
We will prove that $(d_n)_{n \geq 1}$ are i.i.d. with
\[
P(d_n = s) = q^{-1}, \quad s = 0, \ldots, q-1. \tag{3.8}
\]
We see from the definition above that for all $s_1, \ldots, s_n \in \{0, \ldots, q-1\}$,
\[
\bigcap_{j=1}^n \{\omega \,;\, d_j(\omega) = s_j\} = \{\omega \,;\, U(\omega) \in I_{s_1 \ldots s_n}\},
\]
and hence that
\[
(1)\qquad P\Big( \bigcap_{j=1}^n \{d_j = s_j\} \Big) = P(U \in I_{s_1 \ldots s_n}) = |I_{s_1 \ldots s_n}| = q^{-n}.
\]
Moreover, this implies
\[
(2)\qquad P(d_n = s_n) = q^{-1} \quad \text{for all } n \geq 1 \text{ and } s_n \in \{0, \ldots, q-1\},
\]
since
\[
P(d_n = s_n) = \sum_{s_1, \ldots, s_{n-1}} P\Big( \bigcap_{j=1}^n \{d_j = s_j\} \Big) \stackrel{(1)}{=} q^{n-1} \cdot q^{-n} = q^{-1}.
\]
We now conclude (3.8) from (1) and (2).

Example (Construction of a sequence of independent random variables with discrete state spaces). Let $\mu_n \in \mathcal{P}(S_n, \mathcal{B}_n)$ ($n = 1, 2, \ldots$) be a sequence of probability measures, where for each $n \geq 1$, $S_n$ is a countable set and $\mathcal{B}_n$ is the collection of all subsets of $S_n$. We will
construct a sequence $X_n : (\Omega, \mathcal{F}) \to (S_n, \mathcal{B}_n)$ of independent r.v.'s such that $P(X_n \in \cdot) = \mu_n$ for all $n \geq 1$. The construction is just a slight extension of the previous example. We first construct a sequence $I_{s_1 \ldots s_n}$ of sub-intervals of $[0,1)$ inductively as follows, where $n \geq 1$ and $(s_1, \ldots, s_n) \in S_1 \times \cdots \times S_n$. We split $[0,1)$ into disjoint intervals $\{I_s\}_{s \in S_1}$ with length $|I_s| = \mu_1(s)$ for each $s \in S_1$. Suppose that we have disjoint intervals $I_{s_1 \ldots s_{n-1}}$ such that
\[
|I_{s_1 \ldots s_{n-1}}| = \mu_1(s_1) \cdots \mu_{n-1}(s_{n-1}) \quad \text{for } (s_1, \ldots, s_{n-1}) \in S_1 \times \cdots \times S_{n-1}.
\]
We then split each $I_{s_1 \ldots s_{n-1}}$ into disjoint intervals $\{I_{s_1 \ldots s_{n-1} s_n}\}_{s_n \in S_n}$ so that
\[
|I_{s_1 \ldots s_{n-1} s_n}| = \mu_1(s_1) \cdots \mu_{n-1}(s_{n-1})\, \mu_n(s_n) \quad \text{for each } s_n \in S_n.
\]
We now define
\[
X_n(\omega) = s \quad \text{if } U(\omega) \in \bigcup_{s_1, \ldots, s_{n-1}} I_{s_1 \ldots s_{n-1} s}.
\]
We see from the definition that
\[
\bigcap_{j=1}^n \{\omega \,;\, X_j(\omega) = s_j\} = \{\omega \,;\, U(\omega) \in I_{s_1 \ldots s_n}\},
\]
and hence that
\[
(1)\qquad P\Big( \bigcap_{j=1}^n \{X_j = s_j\} \Big) = |I_{s_1 \ldots s_n}| = \mu_1(s_1) \cdots \mu_n(s_n).
\]
Moreover, this implies:
\[
(2)\qquad P(X_n = s_n) = \mu_n(s_n) \quad \text{for all } n \geq 1,
\]
since
\[
P(X_n = s_n) = \sum_{s_1, \ldots, s_{n-1}} P\Big( \bigcap_{j=1}^n \{X_j = s_j\} \Big) \stackrel{(1)}{=} \sum_{s_1, \ldots, s_{n-1}} \mu_1(s_1) \cdots \mu_{n-1}(s_{n-1})\, \mu_n(s_n) = \mu_n(s_n).
\]
We conclude from (1) and (2) that $(X_n)_{n \geq 1}$ are independent and that $P(X_n \in \cdot) = \mu_n$.
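The nested interval splitting above can be sketched as code: the values $X_1(\omega), X_2(\omega), \ldots$ are read off a single value $u$ of $U$. The measures below are arbitrary illustrative choices; note that taking every $\mu_n$ uniform on $\{0, \ldots, q-1\}$ makes the same routine produce the $q$-adic digits $d_n$ of (3.7):

```python
def sample_sequence(u, measures):
    """Read independent discrete r.v.'s X_1, X_2, ... off a single u in [0, 1).

    measures: list of pmfs, each given as a list of (state, probability) pairs.
    A sketch of the interval construction in the example above.
    """
    lo, length, out = 0.0, 1.0, []
    for mu in measures:
        # Split the current interval [lo, lo + length) proportionally to mu
        # and pick the sub-interval containing u.
        acc = lo
        for s, p in mu:
            if u < acc + p * length:
                out.append(s)
                lo, length = acc, p * length
                break
            acc += p * length
    return out

# Uniform measures on {0, 1} recover the binary digits of u (here 0.6875 = 0.1011_2).
uniform = [(0, 0.5), (1, 0.5)]
print(sample_sequence(0.6875, [uniform] * 4))  # -> [1, 0, 1, 1]
```

Non-uniform measures work the same way, e.g. `sample_sequence(0.6, [[("a", 0.5), ("b", 0.5)], [(0, 0.25), (1, 0.75)]])` walks first into the interval for `"b"` and then into the sub-interval for `0`.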
More informationMathematical Statistics
Mathematical Statistics Chapter Three. Point Estimation 3.4 Uniformly Minimum Variance Unbiased Estimator(UMVUE) Criteria for Best Estimators MSE Criterion Let F = {p(x; θ) : θ Θ} be a parametric distribution
More informationSupplementary Notes for W. Rudin: Principles of Mathematical Analysis
Supplementary Notes for W. Rudin: Principles of Mathematical Analysis SIGURDUR HELGASON In 8.00B it is customary to cover Chapters 7 in Rudin s book. Experience shows that this requires careful planning
More informationThe main results about probability measures are the following two facts:
Chapter 2 Probability measures The main results about probability measures are the following two facts: Theorem 2.1 (extension). If P is a (continuous) probability measure on a field F 0 then it has a
More informationSection Taylor and Maclaurin Series
Section.0 Taylor and Maclaurin Series Ruipeng Shen Feb 5 Taylor and Maclaurin Series Main Goal: How to find a power series representation for a smooth function us assume that a smooth function has a power
More informationTHEOREMS, ETC., FOR MATH 515
THEOREMS, ETC., FOR MATH 515 Proposition 1 (=comment on page 17). If A is an algebra, then any finite union or finite intersection of sets in A is also in A. Proposition 2 (=Proposition 1.1). For every
More informationNotes on uniform convergence
Notes on uniform convergence Erik Wahlén erik.wahlen@math.lu.se January 17, 2012 1 Numerical sequences We begin by recalling some properties of numerical sequences. By a numerical sequence we simply mean
More informationProbability Models. 4. What is the definition of the expectation of a discrete random variable?
1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions
More informationCMSC Discrete Mathematics FINAL EXAM Tuesday, December 5, 2017, 10:30-12:30
CMSC-37110 Discrete Mathematics FINAL EXAM Tuesday, December 5, 2017, 10:30-12:30 Name (print): Email: This exam contributes 40% to your course grade. Do not use book, notes, scrap paper. NO ELECTRONIC
More informationExercises in stochastic analysis
Exercises in stochastic analysis Franco Flandoli, Mario Maurelli, Dario Trevisan The exercises with a P are those which have been done totally or partially) in the previous lectures; the exercises with
More informationMASTERS EXAMINATION IN MATHEMATICS
MASTERS EXAMINATION IN MATHEMATICS PURE MATHEMATICS OPTION FALL 2007 Full points can be obtained for correct answers to 8 questions. Each numbered question (which may have several parts) is worth the same
More informationPCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities
PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets
More informationMathematics 324 Riemann Zeta Function August 5, 2005
Mathematics 324 Riemann Zeta Function August 5, 25 In this note we give an introduction to the Riemann zeta function, which connects the ideas of real analysis with the arithmetic of the integers. Define
More informationRiemann integral and volume are generalized to unbounded functions and sets. is an admissible set, and its volume is a Riemann integral, 1l E,
Tel Aviv University, 26 Analysis-III 9 9 Improper integral 9a Introduction....................... 9 9b Positive integrands................... 9c Special functions gamma and beta......... 4 9d Change of
More informationNorthwestern University Department of Electrical Engineering and Computer Science
Northwestern University Department of Electrical Engineering and Computer Science EECS 454: Modeling and Analysis of Communication Networks Spring 2008 Probability Review As discussed in Lecture 1, probability
More informationHomework 11. Solutions
Homework 11. Solutions Problem 2.3.2. Let f n : R R be 1/n times the characteristic function of the interval (0, n). Show that f n 0 uniformly and f n µ L = 1. Why isn t it a counterexample to the Lebesgue
More informationElementary Probability. Exam Number 38119
Elementary Probability Exam Number 38119 2 1. Introduction Consider any experiment whose result is unknown, for example throwing a coin, the daily number of customers in a supermarket or the duration of
More information6.1 Moment Generating and Characteristic Functions
Chapter 6 Limit Theorems The power statistics can mostly be seen when there is a large collection of data points and we are interested in understanding the macro state of the system, e.g., the average,
More informationMath Homework 2
Math 73 Homework Due: September 8, 6 Suppose that f is holomorphic in a region Ω, ie an open connected set Prove that in any of the following cases (a) R(f) is constant; (b) I(f) is constant; (c) f is
More informationl(y j ) = 0 for all y j (1)
Problem 1. The closed linear span of a subset {y j } of a normed vector space is defined as the intersection of all closed subspaces containing all y j and thus the smallest such subspace. 1 Show that
More informationImmerse Metric Space Homework
Immerse Metric Space Homework (Exercises -2). In R n, define d(x, y) = x y +... + x n y n. Show that d is a metric that induces the usual topology. Sketch the basis elements when n = 2. Solution: Steps
More informationInteresting Probability Problems
Interesting Probability Problems Jonathan Mostovoy - 4665 University of Toronto August 9, 6 Contents Chapter Questions a).8.7............................................ b)..............................................
More information13. Examples of measure-preserving tranformations: rotations of a torus, the doubling map
3. Examples of measure-preserving tranformations: rotations of a torus, the doubling map 3. Rotations of a torus, the doubling map In this lecture we give two methods by which one can show that a given
More information