1 Exercises for lecture 1

Exercise 1
a) Show that if F is symmetric with respect to µ and E(|X|) < ∞, then E(X) = µ. Moreover, if F admits a unimodal density, then the mean, the median and the mode coincide.
b) Show that if F is symmetric with respect to µ and all absolute central moments E(|X − µ|^k) exist, then the central moments µ_k = E((X − µ)^k) vanish for all odd k (e.g., µ_3 = 0).

Exercise 2
Provide an example of an asymmetric density whose skewness coefficient α = µ_3/σ³ equals 0.

Exercise 3
Show that µ_4/σ⁴ = 3 (and hence the excess kurtosis β = 0) for the normal distribution N(µ, σ²).

Exercise 4
Let (ξ_n) and (η_n) be two sequences of r.v. Prove the following statements:
1°. If a ∈ R is a constant, then ξ_n → a in distribution if and only if ξ_n → a in probability, as n → ∞.
2°. (Slutsky's theorem) If ξ_n → a in distribution and η_n → η in distribution as n → ∞, where a ∈ R is a constant, then ξ_n + η_n → a + η in distribution as n → ∞. Show that if a is replaced by a general r.v., these two relations do not hold (construct a counterexample).
3°. Let ξ_n → a in probability and η_n → η in distribution as n → ∞, where a ∈ R is a constant and η is a random variable. Then ξ_n η_n → aη in distribution as n → ∞. Would this result continue to hold if a were a general random variable?

Exercise 5
A r.v. Y takes the values 1, 3 and 4 with probabilities P(Y = 1) = 3/5, P(Y = 3) = 1/5 and P(Y = 4) = 1/5. How would you generate Y given a r.v. U ~ U(0, 1)?

Exercise 6
Let U ~ U(0, 1).
1. Explain how to simulate a fair six-sided die given U.
2. Let Y = [6U + 1], where [a] denotes the integer part of a. What are the possible values of Y and the corresponding probabilities?
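For Exercises 5 and 6, a minimal Python sketch of the inverse-transform idea (the use of NumPy, the seed and the sample size are my own choices, not part of the exercises):

    import numpy as np

    rng = np.random.default_rng(0)

    def draw_Y(u):
        """Exercise 5: map u in [0,1) to Y with P(Y=1)=3/5, P(Y=3)=1/5, P(Y=4)=1/5."""
        if u < 3/5:
            return 1
        elif u < 4/5:
            return 3
        return 4

    def draw_die(u):
        """Exercise 6: Y = [6U + 1] takes each value 1, ..., 6 with probability 1/6."""
        return int(6 * u) + 1

    u = rng.random(100_000)
    ys = np.array([draw_Y(x) for x in u])
    dice = np.array([draw_die(x) for x in u])
    print("empirical law of Y:", {v: float(np.mean(ys == v)) for v in (1, 3, 4)})
    print("empirical law of the die:", np.bincount(dice)[1:] / len(u))

The printed frequencies can be compared with the probabilities asked for in the two exercises.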

Exercise 7
Two balanced dice are thrown. Find the joint probability distribution of X and Y if:
1. X is the maximum of the two values and Y is their sum;
2. X is the value of the first die and Y is the maximum of the two;
3. X and Y are, respectively, the smallest and the largest of the two values.

Exercise 8
Suppose that X and Y are two independent Bernoulli B(1/2) random variables. Let U = X + Y and V = X − Y.
1. What are the joint and marginal probability distributions of U and V, and the conditional distribution of U given V = 0 and given V = 1?
2. Are the r.v. U and V independent?

Exercise 9
Let ξ_1, ..., ξ_n be independent r.v., and let
ξ_min = min(ξ_1, ..., ξ_n),  ξ_max = max(ξ_1, ..., ξ_n).
1) Show that
P(ξ_min ≥ x) = ∏_{i=1}^n P(ξ_i ≥ x),  P(ξ_max < x) = ∏_{i=1}^n P(ξ_i < x).
2) Suppose, furthermore, that ξ_1, ..., ξ_n are identically distributed with the uniform distribution U[0, a]. Compute E(ξ_min), E(ξ_max), Var(ξ_min) and Var(ξ_max) (a simulation check is sketched after Exercise 11).

Exercise 10
Let ξ_1, ..., ξ_n be independent Bernoulli r.v. with
P(ξ_i = 1) = λ_i Δ,  P(ξ_i = 0) = 1 − λ_i Δ,
where λ_i > 0 and Δ > 0 is small. Show that
P(∑_{i=1}^n ξ_i = 1) = (∑_{i=1}^n λ_i) Δ + O(Δ²),  P(∑_{i=1}^n ξ_i > 1) = O(Δ²).

Exercise 11
1) Prove that inf_{−∞<a<∞} E((ξ − a)²) is attained at a = E(ξ), so that inf_{−∞<a<∞} E((ξ − a)²) = Var(ξ).
2) Let ξ be a nonnegative r.v. with c.d.f. F and finite expectation. Prove that E(ξ) = ∫_0^∞ (1 − F(x)) dx.
3) Using the result of 2), show that if M is the median of the c.d.f. F of ξ, then inf_{−∞<a<∞} E(|ξ − a|) = E(|ξ − M|).
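A small Python sketch for checking the answers to part 2) of Exercise 9 by simulation (the values a = 2, n = 5 and the number of repetitions are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(1)
    a, n, reps = 2.0, 5, 200_000

    # reps independent samples of (ξ_1, ..., ξ_n), each ξ_i ~ U[0, a]
    xi = rng.uniform(0.0, a, size=(reps, n))
    xi_min, xi_max = xi.min(axis=1), xi.max(axis=1)

    # Empirical moments, to compare with the formulas derived in Exercise 9, part 2
    print("E(min) ≈", xi_min.mean(), "  E(max) ≈", xi_max.mean())
    print("Var(min) ≈", xi_min.var(), "  Var(max) ≈", xi_max.var())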

Exercise 12
Let X_1 and X_2 be two independent r.v. with the exponential distribution E(λ). Show that min(X_1, X_2) and |X_1 − X_2| are r.v. with distributions E(2λ) and E(λ), respectively.

Exercise 13
Let X be the number of sixes obtained in 12000 independent throws of a fair die. Using the Central Limit Theorem, estimate the probability that 1800 < X ≤ 2100 (hint: the values of Φ(√6) and Φ(2√6) are needed). Compare this approximation to that obtained using the Chebyshev inequality.

Exercise 14
Suppose that the r.v. ξ_1, ..., ξ_n are mutually independent and identically distributed with c.d.f. F. For x ∈ R, define the random variable F_n(x) = µ_n/n, where µ_n is the number of ξ_1, ..., ξ_n which satisfy ξ_k < x. Show that for any x, F_n(x) → F(x) in probability (the function F_n is called the empirical distribution function).

Exercise 15 [Monte Carlo method]
We want to compute the integral I = ∫_0^1 f(x) dx. Let X be a U[0, 1] random variable; then E(f(X)) = ∫_0^1 f(x) dx = I. Let X_1, ..., X_n be i.i.d. r.v. uniformly distributed on [0, 1], consider the quantity
f̄_n = (1/n) ∑_{i=1}^n f(X_i),
and suppose that σ² = Var(f(X)) < ∞. Prove that E(f̄_n) = I and that f̄_n → I in probability as n → ∞. Estimate P(|f̄_n − I| < ɛ) using the CLT.

Exercise 16
The Weibull distribution is often used in survival and reliability analysis. An example of a distribution from this family is given by the c.d.f.
F(x) = 0 for x < 0,  F(x) = 1 − e^{−5x²} for x ≥ 0.
Explain how to generate a r.v. Z ~ F given a uniform r.v. U.

Exercise 17
Write down an algorithm for simulating a Poisson r.v. by inversion. Hint: there is no simple closed-form expression for the Poisson c.d.f., and the set of values is infinite. However, the Poisson c.d.f. can easily be computed recursively. Observe that if X is a Poisson r.v. with parameter λ,
P(X = k) = e^{−λ} λ^k / k! = (λ/k) P(X = k − 1).
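A minimal Python sketch of the inverse-transform constructions asked for in Exercises 16 and 17, one possible implementation using only the standard library (the test values are arbitrary):

    import math
    import random

    def weibull_by_inversion(u):
        """Exercise 16: solve F(x) = u for the c.d.f. F(x) = 1 - exp(-5 x^2), x >= 0."""
        return math.sqrt(-math.log(1.0 - u) / 5.0)

    def poisson_by_inversion(lam, u):
        """Exercise 17: return the smallest k with F(k) >= u, computing
        P(X = k) = (lam / k) * P(X = k - 1) recursively from P(X = 0) = exp(-lam)."""
        k = 0
        p = math.exp(-lam)   # P(X = 0)
        cdf = p
        while cdf < u:
            k += 1
            p *= lam / k     # P(X = k) obtained from P(X = k - 1)
            cdf += p
        return k

    random.seed(0)
    print(weibull_by_inversion(random.random()))
    print(poisson_by_inversion(3.0, random.random()))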

2 Exercises for lecture 2

Exercise 18
Let X and Z be two independent r.v. with exponential distributions, X ~ E(λ) and Z ~ E(1). Let Y = X + Z. Compute the regression function g(y) = E(X | Y = y).

Exercise 19
Suppose that the joint distribution of X and Y is given by
F(x, y) = 1 − e^{−2x} − e^{−y} + e^{−(2x+y)} if x > 0, y > 0, and F(x, y) = 0 otherwise.
1. Find the marginal distributions of X and Y.
2. Find the joint density of X and Y.
3. Compute the marginal densities of X and Y, and the conditional density of X given Y = y.
4. Are X and Y independent?

Exercise 20
Consider the joint density function of X and Y given by
f(x, y) = (6/7)(x² + xy/2),  0 ≤ x ≤ 1,  0 ≤ y ≤ 2.
1. Verify that f is a joint density.
2. Find the density of X and the conditional density f_{Y|X}(y|x).
3. Compute P(Y > 1/2 | X < 1/2).

Exercise 21
The joint density of X and Y is given by
f(x, y) = e^{−(x+y)},  0 ≤ x < ∞,  0 ≤ y < ∞.
Compute
1. P(X < Y);
2. P(X < a).

Exercise 22
Two points are chosen at random on opposite sides of the midpoint of an interval of length L. In other words, the two points X and Y are independent random variables such that X is uniformly distributed over [0, L/2) and Y is uniformly distributed over [L/2, L]. Find the probability that the distance |X − Y| is larger than L/3.
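For Exercise 22, a quick Monte Carlo check of the answer (the choice L = 1 and the sample size are arbitrary):

    import numpy as np

    rng = np.random.default_rng(2)
    L, reps = 1.0, 1_000_000

    x = rng.uniform(0.0, L / 2, size=reps)   # X ~ U[0, L/2)
    y = rng.uniform(L / 2, L, size=reps)     # Y ~ U[L/2, L]

    # Estimate P(|X - Y| > L/3); compare with the value computed analytically.
    print("P(|X - Y| > L/3) ≈", np.mean(np.abs(x - y) > L / 3))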

Exercise 23
Let U_1 and U_2 be two independent r.v., both uniformly distributed on [0, a]. Let V = min{U_1, U_2} and Z = max{U_1, U_2}. Show that the joint c.d.f. F of V and Z is given by
F(s, t) = P(V ≤ s, Z ≤ t) = (t² − (t − s)²)/a²  for 0 ≤ s ≤ t ≤ a.
Hint: note that V ≤ s and Z ≤ t iff both U_1 ≤ t and U_2 ≤ t, but not both s < U_1 ≤ t and s < U_2 ≤ t.

Exercise 24
Given two independent r.v. X_1 and X_2 with exponential distributions with parameters λ_1 and λ_2, find the distribution of Z = X_1/X_2. Compute P(X_1 < X_2) (a simulation sketch is given after Exercise 28).

Exercise 25
Let X and Y be i.i.d. r.v. Use the definition of the conditional expectation to show that E(X | X + Y) = E(Y | X + Y) (a.s.), and thus E(X | X + Y) = E(Y | X + Y) = (X + Y)/2 (a.s.).

Exercise 26
Let X, Y_1 and Y_2 be independent r.v., with Y_1 and Y_2 normal N(0, 1), and let
Z = (Y_1 + X Y_2)/√(1 + X²).
Using the conditional distribution P(Z < u | X = x), show that Z ~ N(0, 1).

Exercise 27
Let X and Y be two square integrable r.v. on (Ω, F, P). Prove that
Var(Y) = E(Var(Y | X)) + Var(E(Y | X)).

Exercise 28
Let X_1, ..., X_n be independent r.v. such that X_i ~ P(λ_i) (the Poisson distribution with parameter λ_i, i.e. P(X_i = k) = e^{−λ_i} λ_i^k / k!).
1°. Find the distribution of X = ∑_{i=1}^n X_i.
2°. Show that the conditional distribution of (X_1, ..., X_n) given X = r is multinomial M(r, p_1, ..., p_n) (compute the corresponding parameters). Recall that integer-valued r.v. (X_1, ..., X_k) with values in {0, ..., r} have the multinomial distribution M(r, p_1, ..., p_k) if
P(X_1 = n_1, ..., X_k = n_k) = (r!/(n_1! ⋯ n_k!)) p_1^{n_1} ⋯ p_k^{n_k},  with ∑_{i=1}^k n_i = r.
This is the distribution of (X_1, ..., X_k) where X_i is the number of Y's equal to i in r independent trials Y_1, ..., Y_r with probabilities P(Y_1 = i) = p_i, i = 1, ..., k. Note that if k = 2, P(X_1 = n_1, X_2 = r − n_1) = P(X_1 = n_1), and the distribution is denoted M(r, p) = B(r, p).
3°. Compute E(X_1 | X_1 + X_2).
4°. Show that if X_n is binomially distributed, X_n ~ B(n, λ/n), then for all k, P(X_n = k) tends to e^{−λ} λ^k / k! as n → ∞.
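For Exercise 24, a small Python sketch estimating P(X_1 < X_2) by simulation, to compare with the analytic answer (λ_1 = 1 and λ_2 = 2 are arbitrary test values):

    import numpy as np

    rng = np.random.default_rng(3)
    lam1, lam2, reps = 1.0, 2.0, 1_000_000

    # NumPy parametrizes the exponential distribution by its scale 1/λ.
    x1 = rng.exponential(scale=1.0 / lam1, size=reps)
    x2 = rng.exponential(scale=1.0 / lam2, size=reps)

    print("P(X1 < X2) ≈", np.mean(x1 < x2))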

Recall that the binomial distribution describes the number X of successes in n independent Bernoulli trials,
P(X = k) = C_n^k p^k (1 − p)^{n−k}.

Exercise 29
Show that
1. Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z);
2. Cov(∑_{i=1}^n X_i, ∑_{j=1}^n Y_j) = ∑_{i=1}^n ∑_{j=1}^n Cov(X_i, Y_j).
3. Prove that if Var(X_i) = σ² and Cov(X_i, X_j) = γ for all 1 ≤ i ≠ j ≤ n, then Var(X_1 + ... + X_n) = nσ² + n(n − 1)γ.
4. Let ξ_1 and ξ_2 be i.i.d. random variables such that 0 < Var(ξ_1) < ∞. Show that the r.v. η_1 = ξ_1 − ξ_2 and η_2 = ξ_1 + ξ_2 are uncorrelated.

Exercise 30
Let X be the number of 1's and Y the number of 2's in n throws of an honest (balanced) die. Compute Cov(X, Y). Before computing this quantity, would you be able to predict whether Cov(X, Y) ≥ 0 or Cov(X, Y) < 0? Hint: use relationship 2) of Exercise 29. (A simulation check is sketched below, after Exercise 32.)

Exercise 31
1°. Let ξ and η be r.v. with E(ξ) = E(η) = 0, Var(ξ) = Var(η) = 1 and correlation coefficient ρ. Show that
E(max(ξ², η²)) ≤ 1 + √(1 − ρ²).
Hint: observe that max(ξ², η²) = (ξ² + η² + |ξ² − η²|)/2.
2°. Let ρ be the correlation of η and ξ. Show the inequality
P(|ξ − E(ξ)| ≥ ɛ √Var(ξ) or |η − E(η)| ≥ ɛ √Var(η)) ≤ (1 + √(1 − ρ²))/ɛ².

Exercise 32
Let (X, Y) be a random vector of dimension 2. Suppose that Y ~ N(m, τ²) and that the distribution of X given Y = y is N(y, σ²).
1°. What is the distribution of Y given X = x?
2°. What is the distribution of X?
3°. What is the distribution of E(Y | X)?
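For Exercise 30, a quick Monte Carlo estimate of Cov(X, Y), to compare with the analytic value (n = 60 and the number of repetitions are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(4)
    n, reps = 60, 200_000

    throws = rng.integers(1, 7, size=(reps, n))   # reps experiments of n die throws each
    X = (throws == 1).sum(axis=1)                 # number of 1's in each experiment
    Y = (throws == 2).sum(axis=1)                 # number of 2's in each experiment

    print("empirical Cov(X, Y) ≈", np.cov(X, Y)[0, 1])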

Exercise 33
Let X and N be r.v. such that N takes values in {1, 2, ...} and E(|X|) < ∞, E(N) < ∞. Consider a sequence X_1, X_2, ... of independent r.v. with the same distribution as X. Show the Wald identity: if N is independent of the X_i, then
E(∑_{i=1}^N X_i) = E(N) E(X).
(A simulation sketch is given after Exercise 35.)

Exercise 34
Suppose that the salary of an individual satisfies Y = Xb + σε, where σ > 0, b ∈ R, X is a r.v. with bounded second-order moments representing the capacities of the individual, and ε is a standard normal variable independent of X, ε ~ N(0, 1). If Y is larger than the minimum-wage (SMIC) value S, the received salary equals Y; otherwise it equals S. Compute the conditional expectation of the received salary given X. Is this expectation linear in X?

Exercise 35
Show that if φ is the characteristic function of some r.v., then the complex conjugate φ̄, |φ|² and Re(φ) are also characteristic functions (of certain r.v.). Hint: for Re(φ), consider two independent random variables X and Y, where Y takes the values 1 and −1 with probabilities 1/2 and X has characteristic function φ; then compute the characteristic function of XY.
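For Exercise 33, a small simulation of the Wald identity (the geometric law for N and the exponential law for the X_i are arbitrary test choices):

    import numpy as np

    rng = np.random.default_rng(5)
    p, mean_x, reps = 0.25, 2.0, 100_000

    N = rng.geometric(p, size=reps)               # N takes values in {1, 2, ...}, E(N) = 1/p
    sums = np.array([rng.exponential(mean_x, size=n).sum() for n in N])

    print("E(sum of X_1..X_N) ≈", sums.mean())
    print("E(N) * E(X)        =", (1 / p) * mean_x)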

3 Exercises for lecture 3

Exercise 36
Two random vectors X ∈ R^p and Y ∈ R^q are independent iff the characteristic function φ_Z(u) of the vector Z = (X, Y)^T can be represented, for any u = (a, b)^T with a ∈ R^p and b ∈ R^q, as φ_Z(u) = φ_X(a) φ_Y(b). Verify this characterization in the continuous case.

Exercise 37
Let the joint density of the r.v. X and Y satisfy
f(x, y) = (1/(2π)) e^{−x²/2} e^{−y²/2} [1 + xy I{−1 ≤ x, y ≤ 1}].
What is the distribution of X? Of Y?

Exercise 38
Prove that any linear transformation of a normal vector is again a normal vector: if X ~ N_p(µ, Σ) and Y = AX + c, where A ∈ R^{q×p} and c ∈ R^q are a fixed (non-random) matrix and vector, then
Y ~ N_q(Aµ + c, AΣA^T).

Exercise 39
Compute the density of the χ²_p distribution.

Exercise 40
Compute the density of the Fisher F_{p,q} distribution.

Exercise 41
Compute the density of the Student t_q distribution.

Exercise 42
Let Q be a q × p matrix with q > p and rank p.
1°. Show that P = Q(Q^T Q)^{−1} Q^T is a projector.
2°. On what subspace L does P project?

Exercise 43
Let (X, Y) be a random vector with density f(x, y) = C exp(−x² + xy − y²/2).
1°. Show that (X, Y) is a normal vector. Compute the expectation, the covariance matrix and the characteristic function of (X, Y). Compute the correlation coefficient ρ_XY of X and Y.
2°. What is the distribution of X? Of Y? Of 2X − Y?
3°. Show that X and Y − X are independent random variables with the same distribution.
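For Exercise 42, a short numerical illustration of the claimed properties of P (the dimensions q = 5, p = 2 and the random choice of Q are arbitrary):

    import numpy as np

    rng = np.random.default_rng(6)
    q, p = 5, 2

    Q = rng.normal(size=(q, p))                     # with probability 1, rank(Q) = p
    P = Q @ np.linalg.inv(Q.T @ Q) @ Q.T

    print("P symmetric:  ", np.allclose(P, P.T))
    print("P idempotent: ", np.allclose(P @ P, P))
    print("P Q = Q:      ", np.allclose(P @ Q, Q))  # hints at the subspace L in part 2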

Exercise 44
Let X ~ N(0, 1) and let Z be a r.v. taking the values 1 and −1, each with probability 1/2. Suppose that X and Z are independent, and set Y = ZX.
1°. Show that the distribution of Y is N(0, 1).
2°. Compute the covariance and the correlation of X and Y.
3°. Compute P(X + Y = 0).
4°. Is (X, Y) a normal vector?

Exercise 45
Let ξ and η be independent r.v. with the uniform distribution U[0, 1]. Prove that
X = √(−2 ln ξ) cos(2πη),  Y = √(−2 ln ξ) sin(2πη)
satisfy Z = (X, Y)^T ~ N_2(0, I). Hint: let (X, Y) ~ N_2(0, I) and change to polar coordinates. (A numerical sketch is given after Exercise 47.)

Exercise 46
Let Z = (Z_1, Z_2, Z_3)^T be a normal vector with density f,
f(z_1, z_2, z_3) = (1/(4(2π)^{3/2})) exp(−(6z_1² + … z_2² + … z_3² + … z_1 z_2)).
1°. What is the distribution of (Z_2, Z_3) given Z_1 = z_1?
Let X and Y be the random vectors defined by X = Z and Y = (…) Z.
2°. Is the vector (X, Y), of dimension 6, Gaussian? Does the vector X have a density? Does the vector Y have a density?
3°. Are the vectors X and Y independent?
4°. What are the distributions of the components of Z?

Exercise 47
Let (X, Y, Z)^T be a normal vector with zero mean and covariance matrix Σ = (…).
1°. We set U = X + Y + Z, V = X − Y + Z, W = X + Y − Z. What is the distribution of the vector (U, V, W)^T?
2°. What is the density of the random variable T = U² + V² + W²?
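For Exercise 45, a minimal Python sketch of the Box-Muller construction together with a crude empirical check (sample size and seed are arbitrary):

    import numpy as np

    rng = np.random.default_rng(7)
    n = 500_000

    xi = 1.0 - rng.random(n)    # ξ ~ U(0, 1], so log(ξ) is finite
    eta = rng.random(n)         # η ~ U[0, 1)

    X = np.sqrt(-2 * np.log(xi)) * np.cos(2 * np.pi * eta)
    Y = np.sqrt(-2 * np.log(xi)) * np.sin(2 * np.pi * eta)

    # The empirical mean should be close to (0, 0) and the covariance close to the identity.
    print("mean ≈", X.mean(), Y.mean())
    print("cov  ≈\n", np.cov(X, Y))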

Exercise 48
Let the vector (X, Y) be normal N_2(µ, Σ) with mean vector µ = (…) and covariance matrix Σ = (…).
1°. What is the distribution of X + 4Y?
2°. What is the joint distribution of Y − 2X and X + 4Y?

Exercise 49
Let X be a zero-mean normal vector of dimension n with covariance matrix Σ > 0. What is the distribution of the r.v. X^T Σ^{−1} X?

Exercise 50
We model the height H of a male person in population P by the Gaussian distribution N(172, 49) (units: cm). In this model:
1°. What is the probability for a man to be of height at least 160 cm?
2°. We assume that there are approximately 15 million adult men in P; provide an estimate of the number of men of height at least 200 cm.
3°. What is the probability for 10 men randomly drawn from P to all have heights in the interval [168, 188] cm?
The height H′ of females of P is modeled by the Gaussian distribution N(162, 49) (units: cm).
4°. What is the probability for a male chosen at random to be taller than a randomly chosen female?
We model the heights (H, H′) of a man and a woman in a couple by a normal vector, the correlation coefficient ρ between the two heights being 0.4 (respectively, −0.4).
5°. Compute the probability p (respectively, p′) that the height of the man in a couple is larger than that of the woman (before making the computation, what would be your guess: in which order should p and p′ be ranked?). A numerical sketch for parts 3° and 4° is given after Exercise 51.

Exercise 51
Let Y = (η_1, ..., η_n)^T be a normal vector, Y ~ N_n(µ, σ²I), let H_{n−J} be a subspace of R^n of dimension n − J, J > 0, and let H_{n−J−M} be a subspace of H_{n−J} of dimension n − J − M, M > 0. We set
d_J = min_{y ∈ H_{n−J}} ‖Y − y‖  and  d_{J+M} = min_{y ∈ H_{n−J−M}} ‖Y − y‖.
Verify that
1. if µ ∈ H_{n−J}, then the random variable d²_J/σ² follows the χ²_J distribution (with J degrees of freedom);
2. if, in addition, µ ∈ H_{n−J−M}, then
(J/M) · (d²_{J+M} − d²_J)/d²_J ~ F_{M,J}  (the Fisher distribution with (M, J) degrees of freedom).
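A small Python sketch for checking parts 3° and 4° of Exercise 50; it uses only the standard library, and part 4° assumes the two heights are independent:

    from math import erf, sqrt

    def Phi(x):
        """Standard normal c.d.f. expressed via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    mu_m, mu_f, sigma = 172.0, 162.0, 7.0    # N(172, 49) and N(162, 49), so σ = 7

    # 3°: probability that one man is in [168, 188], raised to the power 10 (independence).
    p_one = Phi((188 - mu_m) / sigma) - Phi((168 - mu_m) / sigma)
    print("P(10 men all in [168, 188]) ≈", p_one ** 10)

    # 4°: H - H' ~ N(10, 98) for independent heights, so P(H > H') = Φ(10 / sqrt(98)).
    print("P(H > H') ≈", Phi((mu_m - mu_f) / (sqrt(2) * sigma)))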
