Avd. Matematisk statistik

TENTAMEN I SF2940 SANNOLIKHETSTEORI / EXAM IN SF2940 PROBABILITY THEORY, WEDNESDAY OCTOBER 26, 2016.

Examinator: Boualem Djehiche, boualem@kth.se

Tillåtna hjälpmedel / Permitted means of assistance: Appendix 2 in A. Gut: An Intermediate Course in Probability, Formulas for probability theory SF2940, L. Råde & B. Westergren: Mathematics Handbook for Science and Engineering, and pocket calculator.

All notation used must be explained and defined. Reasoning and calculations must be detailed enough to be easy to follow. Each problem yields at most 10 p. You may apply results stated in one part of an exam question to another part of the same question even if you have not solved the first part. 25 points will guarantee a passing result. If you have received 5 bonus points from the home assignments, you may skip Problem 1(a). If you have received 10 bonus points, you may skip the whole of Problem 1. Solutions to the exam questions will be available starting from Friday 28 October 2016.

Good luck!

Problem 1

(a) The random variables X_1, X_2, ... are non-negative, integer-valued, independent and identically distributed (i.i.d.). The random variable N ∈ Po(a), a > 0, is independent of X_1, X_2, .... Set S_N = X_1 + X_2 + ... + X_N. If we know that S_N ∈ Po(b), where 0 < b < a, show that for each k = 1, 2, 3, ..., X_k ∈ Be(b/a), i.e. Bernoulli-distributed with parameter b/a. (5 p)

(b) The random variables X_1 and X_2 are independent and N(0, 1)-distributed. Show that the random variables X_1/X_2 and X_1^2 + X_2^2 are independent and determine their distributions. (5 p)
(Hint: Use polar coordinates.)

Problem 2

X_1, X_2, ... is a sequence of independent and identically distributed random variables with mean zero and variance σ^2. The random variable N is independent of the sequence (X_n)_{n ≥ 1} and N ∈ Po(λ), λ > 0. Show that

(a) N/λ → 1 in probability, when λ → ∞. (2 p)

(b) (N − λ)/√λ → N(0, 1) in distribution, when λ → ∞. (2 p)

(c) (X_1 + X_2 + ... + X_N)/√N → N(0, σ^2) in distribution, when λ → ∞. (6 p)

Problem 3

Let X and Y be random variables such that

Y | X = x ∈ N(x, a^2), with X ∈ N(µ, b^2).

(a) Determine the distribution of the vector (X, Y). (5 p)

(b) Determine the distribution of X | Y = y. (5 p)

Problem 4

Let X and Y be two random variables with finite second moment (square-integrable), defined on the probability space (Ω, F, P). Let G be a sub-σ-algebra of F.

(a) Prove (or find a counter-example) that if X and Y are equally distributed (have the same probability distribution), then E[X | G] = E[Y | G] almost surely. (3 p)

(b) Prove that if E[X | Y] = Y and E[Y | X] = X, then X = Y almost surely. (3 p)

(c) Let X_1, X_2, ..., X_n be identically distributed and positive random variables. Show that for any k ≤ n

E[(X_1 + X_2 + ... + X_k)/(X_1 + X_2 + ... + X_n)] = k/n. (4 p)

Problem 5

On the space of random variables on (Ω, F, P) we set

d(X, Y) := E[ |X − Y| / (1 + |X − Y|) ].

(a) Show that d is a metric, that is, d(X, X) = 0, d(X, Y) = d(Y, X) and d(X, Y) ≤ d(X, Z) + d(Z, Y). (2 p)

Let X_1, X_2, ... be a sequence of random variables on the probability space (Ω, F, P).

(b) Show that if d(X_n, X) → 0 as n → ∞, then X_n converges to X in probability as n → ∞. (4 p)

(c) Let X_1, X_2, ... be random variables such that

P(X_n = 1) = 1 − 1/n^{1/5}, P(X_n = n) = 1/n^{1/5}, n ≥ 2.

Determine the limit X such that d(X_n, X) → 0, and show that X_n does not converge to X in L^1, i.e. E[|X_n − X|] does not tend to 0 as n → ∞. (4 p)
(Hint: the function f(t) := t/(1 + t), t ≥ 0, may be useful.)
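Before turning to the solutions, the identity claimed in Problem 4(c) is easy to sanity-check numerically. The sketch below is only an illustration, not part of the exam: it assumes NumPy is available, and the Exp(1) distribution, the values of n and k, the seed and the number of repetitions are arbitrary choices.

import numpy as np

# Monte Carlo check of Problem 4(c): for positive i.i.d. X_1, ..., X_n,
# E[(X_1 + ... + X_k) / (X_1 + ... + X_n)] should equal k/n.
rng = np.random.default_rng(0)
n, k, reps = 10, 3, 200_000
X = rng.exponential(scale=1.0, size=(reps, n))    # arbitrary choice of positive i.i.d. variables
ratio = X[:, :k].sum(axis=1) / X.sum(axis=1)
print(ratio.mean(), k / n)                        # the two numbers should nearly coincide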

Avd. Matematisk statistik

Suggested solutions to the exam Wednesday October 26, 2016.

Problem 1

(a) Let φ_{X_1}(t) denote the characteristic function of X_1 and let g_N(s) = exp(a(s − 1)) be the probability generating function of N ∈ Po(a), a > 0. Using the composition formula with characteristic functions ((5.18) in the Lecture Notes), we have

φ_{S_N}(t) = g_N(φ_{X_1}(t)) = exp(a(φ_{X_1}(t) − 1)).

Now, if S_N ∈ Po(b), we should have φ_{S_N}(t) = exp(b(e^{it} − 1)). Hence

exp(a(φ_{X_1}(t) − 1)) = exp(b(e^{it} − 1)),

which implies that (note that 0 < b/a < 1)

φ_{X_1}(t) = (b/a) e^{it} + (1 − b/a),

that is, X_1, X_2, ... are all Bernoulli Be(b/a)-distributed.

(b) Introduce the transformation (X_1, X_2) := φ(R, Θ), R > 0, Θ ∈ (0, 2π), defined by

X_1 = R cos Θ, X_2 = R sin Θ.

Then R = √(X_1^2 + X_2^2) and cot Θ = X_1/X_2. Furthermore, the corresponding Jacobian is

J_φ := det( ∂x/∂r  ∂x/∂θ ; ∂y/∂r  ∂y/∂θ ) = det( cos θ  −r sin θ ; sin θ  r cos θ ) = r.

We have

f_{(R,Θ)}(r, θ) = f_{(X_1,X_2)}(r cos θ, r sin θ) |J_φ| = (1/(2π)) r e^{−r^2/2} = f_Θ(θ) f_R(r), r > 0, θ ∈ (0, 2π),

where f_Θ(θ) = 1/(2π), θ ∈ (0, 2π), which is the density of the uniform distribution over (0, 2π), and f_R(r) = r e^{−r^2/2}, r > 0, which is the density of the Rayleigh R(2) distribution. Thus Θ ∈ U(0, 2π) and R ∈ R(2), and they are independent. Since X_1/X_2 = cot Θ is a function of Θ alone and X_1^2 + X_2^2 = R^2 is a function of R alone, these two variables are independent as well; moreover X_1/X_2 is standard Cauchy-distributed and X_1^2 + X_2^2 ∈ Exp(2) (i.e. χ^2(2)).
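Both parts of Problem 1 can be sanity-checked by a short simulation. The sketch below assumes NumPy; the values a = 5 and b = 2, the events used in the independence spot check, the seed and the sample sizes are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(1)

# Problem 1(a), read backwards: if N ~ Po(a) and the X_k are i.i.d. Be(b/a),
# then S_N = X_1 + ... + X_N should indeed be Po(b)-distributed (Poisson thinning).
a, b, reps = 5.0, 2.0, 200_000
N = rng.poisson(a, size=reps)
S = rng.binomial(N, b / a)               # same distribution as a sum of N i.i.d. Be(b/a) variables
print(S.mean(), S.var())                 # both ≈ b = 2, as for a Po(2) variable

# Problem 1(b): X1/X2 and X1^2 + X2^2 should be independent,
# with X1^2 + X2^2 ~ Exp(2) and X1/X2 standard Cauchy.
m = 500_000
x1, x2 = rng.standard_normal(m), rng.standard_normal(m)
ratio, r2 = x1 / x2, x1**2 + x2**2
A, B = np.abs(ratio) < 1.0, r2 < 2.0
print((A & B).mean(), A.mean() * B.mean())   # nearly equal if the two variables are independent
print(r2.mean(), np.median(np.abs(ratio)))   # ≈ 2 (Exp(2) mean) and ≈ 1 (Cauchy: P(|C| < 1) = 1/2)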

Problem 2

The random variable N ∈ Po(λ) has E[N] = Var(N) = λ and its characteristic function is

φ_N(t) = exp(λ(e^{it} − 1)).

(a) This question can be solved either using Chebyshev's inequality or the Cramér–Slutsky theorem.

Using Chebyshev's inequality we have, for every fixed ε > 0,

P(|N/λ − 1| > ε) = P(|N − λ| > ελ) ≤ (1/(ε^2 λ^2)) Var(N) = λ/(ε^2 λ^2) = 1/(λ ε^2) → 0, as λ → ∞.

Using the Cramér–Slutsky theorem: when λ → ∞, N/λ → 1 in probability if and only if N/λ → 1 in distribution, if and only if φ_{N/λ}(t) → e^{it}. We have

φ_{N/λ}(t) = φ_N(t/λ) = exp(λ(e^{it/λ} − 1)).

But, by a lemma in the course lecture notes we have, for every real x,

e^{ix} = Σ_{k=0}^{n} (ix)^k/k! + R_n(x),    (1)

where

|R_n(x)| ≤ min( |x|^{n+1}/(n + 1)!, 2|x|^n/n! ).    (2)

In particular, when x → 0, we write

e^{ix} = Σ_{k=0}^{n} (ix)^k/k! + o(x^n),    (3)

where o(x^n) denotes the rest term R_n(x). Hence, when λ → ∞, we obtain (taking n = 2)

e^{it/λ} = 1 + it/λ − t^2/(2λ^2) + o(t^2/λ^2),

so that

lim_{λ→∞} φ_{N/λ}(t) = lim_{λ→∞} exp( it − t^2/(2λ) + o(t^2/λ) ) = e^{it}.

(b) We have to show that lim_{λ→∞} φ_{(N−λ)/√λ}(t) = e^{−t^2/2}. Indeed,

φ_{(N−λ)/√λ}(t) = e^{−it√λ} φ_N(t/√λ) = exp( λ(e^{it/√λ} − 1) − it√λ ).

We may use (3) to obtain

λ(e^{it/√λ} − 1) = it√λ − t^2/2 + λ·o(t^2/λ), λ → ∞.

Noting that, by (2), λ·o(t^2/λ) → 0 as λ → ∞, we finally obtain

lim_{λ→∞} φ_{(N−λ)/√λ}(t) = e^{−t^2/2}.
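A numerical illustration of parts (a) and (b), assuming NumPy (λ, the tolerance 0.05, the seed and the number of repetitions are arbitrary choices):

import numpy as np

# For large λ, N/λ concentrates near 1 and (N - λ)/sqrt(λ) behaves like a standard normal variable.
rng = np.random.default_rng(2)
lam, reps = 10_000.0, 200_000
N = rng.poisson(lam, size=reps)
print(np.mean(np.abs(N / lam - 1) > 0.05))     # ≈ 0, well below the Chebyshev bound 1/(λ ε^2) = 0.04
Z = (N - lam) / np.sqrt(lam)
print(Z.mean(), Z.var(), np.mean(Z <= 1.0))    # ≈ 0, ≈ 1, ≈ Φ(1) ≈ 0.84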

(c) Set S_N = X_1 + X_2 + ... + X_N and let φ_X(t) denote the characteristic function of the i.i.d. variables X_1, X_2, .... We have

S_N/√N = (S_N/√λ) · √(λ/N).

Since N/λ → 1 in probability when λ → ∞, by the Cramér–Slutsky theorem it suffices to show that S_N/√λ → N(0, σ^2) in distribution as λ → ∞, i.e. we have to show that

lim_{λ→∞} φ_{S_N/√λ}(t) = e^{−t^2 σ^2/2}.

Since N ∈ Po(λ), we have

φ_{S_N/√λ}(t) = φ_{S_N}(t/√λ) = g_N(φ_X(t/√λ)) = exp( λ(φ_X(t/√λ) − 1) ).

Now, since X has finite variance σ^2, we use the fact that, when λ → ∞,

φ_X(t/√λ) = 1 + i(t/√λ) E[X] − (t^2/(2λ)) E[X^2] + o(t^2/λ),

and noting that E[X] = 0 and E[X^2] = Var(X) + (E[X])^2 = σ^2, we obtain

lim_{λ→∞} λ(φ_X(t/√λ) − 1) = −(t^2/2) σ^2,

or

lim_{λ→∞} φ_{S_N/√λ}(t) = e^{−t^2 σ^2/2}.

Problem 3

(a) We compute the characteristic function of the vector (X, Y). We have

φ_{(X,Y)}(s, t) = E[e^{isX + itY}] = E[ E[e^{isX + itY} | X] ] = E[ e^{isX} E[e^{itY} | X] ].

Since Y | X ∈ N(X, a^2) we have E[e^{itY} | X] = e^{itX − (t^2/2) a^2}. Thus

φ_{(X,Y)}(s, t) = E[ e^{isX} e^{itX − (t^2/2) a^2} ] = e^{−(t^2/2) a^2} E[ e^{i(s + t)X} ].

Using the assumption that X ∈ N(µ, b^2) we further obtain

φ_{(X,Y)}(s, t) = exp( isµ + itµ − (1/2)( s^2 b^2 + 2st b^2 + (a^2 + b^2) t^2 ) ).

Thus

(X, Y) ∈ N( (µ, µ), ( b^2  b^2 ; b^2  a^2 + b^2 ) ),

i.e. (X, Y) is bivariate normal with mean vector (µ, µ) and covariance matrix with entries Var(X) = b^2, Cov(X, Y) = b^2 and Var(Y) = a^2 + b^2.
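Both results above lend themselves to quick simulation checks. The sketch below assumes NumPy; λ, the choice X_i = Exp(1) − 1 (mean 0, variance 1), the parameter values µ, a, b, the seed and the sample sizes are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(3)

# Problem 2(c): with N ~ Po(λ) and X_i i.i.d. with mean 0 and variance σ^2 = 1
# (here X_i = Exp(1) - 1, an arbitrary non-normal choice), S_N/sqrt(N) should be
# approximately N(0, 1) for large λ.
lam, reps = 1_000.0, 5_000
N = rng.poisson(lam, size=reps)
Z = np.array([rng.exponential(1.0, size=n).sum() - n for n in N]) / np.sqrt(N)
print(Z.mean(), Z.std())                      # ≈ 0 and ≈ 1

# Problem 3(a): X ~ N(µ, b^2), Y | X ~ N(X, a^2); the covariance matrix of (X, Y)
# should be [[b^2, b^2], [b^2, a^2 + b^2]].
mu, a, b, m = 1.0, 2.0, 0.5, 500_000
X = rng.normal(mu, b, size=m)
Y = rng.normal(X, a)                          # draw Y conditionally on X
print(np.cov(X, Y))                           # ≈ [[0.25, 0.25], [0.25, 4.25]]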

(b) We have X | Y = y ∈ N(µ_{1|2}, σ^2_{1|2}), where

µ_{1|2} = µ + (b^2/(a^2 + b^2))(y − µ),    σ^2_{1|2} = b^2( 1 − b^4/(b^2(a^2 + b^2)) ) = b^2 − b^4/(a^2 + b^2) = a^2 b^2/(a^2 + b^2).

Note that σ^2_{1|2} > 0.

Problem 4

(a) Recall the definition of the conditional expectation of an integrable random variable X w.r.t. a sub-σ-algebra G: Z := E[X | G] is the unique G-measurable random variable for which

E[Zξ] = E[Xξ] for all bounded G-measurable random variables ξ.

But, if X and Y have the same distribution, it is not always true that E[Xξ] = E[Y ξ]. In particular, it is not always true that

E[ E[X | G] ξ ] = E[Xξ] = E[Y ξ] = E[ E[Y | G] ξ ]

for all bounded G-measurable random variables ξ. Here is a counter-example. We know that if X ∈ U[0, 1] then Y := 1 − X ∈ U[0, 1]. Let ζ be a bounded random variable such that E[ζ E[X | G]] ≠ 0 but E[ζ] = 0, and let G = σ(ζ), i.e. the σ-algebra generated by ζ. If E[X | G] = E[Y | G] almost surely, then E[Xζ] = E[Y ζ] = E[ζ] − E[Xζ], so that

0 = E[ζ] = 2E[Xζ] = 2E[ζ E[X | G]] ≠ 0,

leading to a contradiction. (In a previous version, the suggested solution was wrong.)

(b) To show that X = Y almost surely it suffices to show that E[(X − Y)^2] = 0, because both are square-integrable. We have

E[(X − Y)^2] = E[X^2 − 2XY + Y^2] = E[X^2] + E[Y^2] − E[XY] − E[XY].

But, using the assumption that X = E[Y | X] we obtain

E[XY] = E[ E[XY | X] ] = E[ X E[Y | X] ] = E[X^2],

and using that Y = E[X | Y] we also obtain

E[XY] = E[ E[XY | Y] ] = E[ Y E[X | Y] ] = E[Y^2].

Thus

E[(X − Y)^2] = E[X^2] + E[Y^2] − E[XY] − E[XY] = E[X^2] + E[Y^2] − E[X^2] − E[Y^2] = 0.
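As a rough numerical check of the conditional mean and variance obtained in Problem 3(b) above, one can condition approximately by keeping only the samples with Y close to a fixed y. A sketch assuming NumPy (y, the tolerance, the parameter values, the seed and the sample size are arbitrary choices):

import numpy as np

# Among samples with Y ≈ y, the X-values should have mean µ + (b^2/(a^2 + b^2))(y − µ)
# and variance a^2 b^2/(a^2 + b^2).
rng = np.random.default_rng(4)
mu, a, b, reps, y, tol = 1.0, 2.0, 0.5, 2_000_000, 2.0, 0.05
X = rng.normal(mu, b, size=reps)
Y = rng.normal(X, a)
sel = np.abs(Y - y) < tol                                  # crude conditioning on Y ≈ y
print(X[sel].mean(), mu + b**2 / (a**2 + b**2) * (y - mu))
print(X[sel].var(), a**2 * b**2 / (a**2 + b**2))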

(c) Set, for k ≤ n, S_k = X_1 + X_2 + ... + X_k. Since X_1, X_2, ..., X_n are identically distributed it is easy to see that

E[X_1 | S_n] = E[X_2 | S_n] = ... = E[X_n | S_n] almost surely.

But, since S_n = E[S_n | S_n] = Σ_{j=1}^{n} E[X_j | S_n] = n E[X_1 | S_n], it follows that

E[X_j | S_n] = S_n/n almost surely, j = 1, ..., n.

This in turn yields that

E[S_k | S_n] = Σ_{j=1}^{k} E[X_j | S_n] = (k/n) S_n almost surely.

Thus

E[S_k/S_n] = E[ E[S_k/S_n | S_n] ] = E[ E[S_k | S_n]/S_n ] = (k/n) E[S_n/S_n] = k/n.

Problem 5

The following properties of the function f(t) = t/(1 + t), t ≥ 0, turn out to be useful for solving the problem.

(i) 0 ≤ f(t) < 1 and f is strictly increasing, since f′(t) = 1/(1 + t)^2 > 0.

(ii) f is invertible, because it is continuous (in fact differentiable) and strictly increasing.

(iii) f(t + s) ≤ f(t) + f(s). Indeed, noting that 1 + t + s ≥ 1 + t and 1 + t + s ≥ 1 + s, we have

f(t + s) = (t + s)/(1 + t + s) = t/(1 + t + s) + s/(1 + t + s) ≤ t/(1 + t) + s/(1 + s) = f(t) + f(s).

(iv) From (i) and (iii) it follows that if a, b and c are non-negative and satisfy a ≤ b + c, then f(a) ≤ f(b) + f(c).

(a) We have d(X, X) = E[ |X − X|/(1 + |X − X|) ] = 0. Since |X − Y| = |Y − X| we get d(X, Y) = d(Y, X). Lastly, by the triangle inequality it holds that |X − Y| ≤ |X − Z| + |Z − Y|. We may apply (iv) to obtain

|X − Y|/(1 + |X − Y|) ≤ |X − Z|/(1 + |X − Z|) + |Z − Y|/(1 + |Z − Y|).

Taking expectations gives

d(X, Y) ≤ d(X, Z) + d(Z, Y).

(b) If d(X_n, X) → 0 as n → ∞, we want to show that for every fixed ε > 0, P(|X_n − X| > ε) → 0 as n → ∞. Indeed, using the fact that f is strictly increasing and invertible, together with Markov's inequality, we have

P(|X_n − X| > ε) = P( f(|X_n − X|) > f(ε) ) ≤ (1/f(ε)) E[ f(|X_n − X|) ] = (1/f(ε)) d(X_n, X) → 0,

as n → ∞, noting that since ε > 0, f(ε) > 0.
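The metric d of Problem 5 is easy to estimate by Monte Carlo, and the subadditivity argument in (a) can be spot-checked on concrete random variables. A sketch assuming NumPy (the three variables X, Y, Z, the seed and the sample size are arbitrary choices):

import numpy as np

rng = np.random.default_rng(5)
m = 500_000
X = rng.standard_normal(m)
Y = X + rng.standard_normal(m)            # a dependent companion of X
Z = rng.exponential(1.0, size=m)          # an arbitrary third variable

def d(U, V):
    # Monte Carlo estimate of E[|U - V|/(1 + |U - V|)] from paired samples
    W = np.abs(U - V)
    return np.mean(W / (1.0 + W))

print(d(X, X))                            # = 0
print(d(X, Y), d(Y, X))                   # symmetric in its arguments
print(d(X, Y) <= d(X, Z) + d(Z, Y))       # triangle inequality: expect True

Because the pointwise bound f(|X − Y|) ≤ f(|X − Z|) + f(|Z − Y|) holds on every sample path, the last comparison holds exactly, not only up to Monte Carlo error.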

(c) We first show that d(X_n, 1) → 0 as n → ∞. Indeed, we have

d(X_n, 1) = E[ |X_n − 1|/(1 + |X_n − 1|) ] = ((n − 1)/(1 + (n − 1))) · (1/n^{1/5}) = ((n − 1)/n) · (1/n^{1/5}) → 0,

as n → ∞. Hence X = 1. On the other hand,

E[|X_n − 1|] = (n − 1) · (1/n^{1/5}) → +∞, as n → ∞.

Thus X_n does not converge to 1 in L^1.
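The contrast between the two modes of convergence in (c) is already visible from the exact expressions above; no simulation is needed. A small sketch in Python (the grid of n-values is an arbitrary choice):

# d(X_n, 1) -> 0 while E[|X_n - 1|] -> infinity, using the exact formulas derived above
for n in (10, 10**3, 10**6, 10**9):
    p = n ** (-0.2)                   # P(X_n = n) = n^(-1/5); otherwise X_n = 1
    d_n = (n - 1) / n * p             # d(X_n, 1) = ((n - 1)/n) * n^(-1/5)
    l1_n = (n - 1) * p                # E[|X_n - 1|] = (n - 1) * n^(-1/5)
    print(f"n = {n:>10}   d(X_n, 1) = {d_n:.4f}   E|X_n - 1| = {l1_n:.1f}")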
