MAS223 Statistical Inference and Modelling Exercises
- Helen Burke
The exercises are grouped into sections, corresponding to chapters of the lecture notes. Within each section, exercises are divided into warm-up questions, ordinary questions, and challenge questions. Note that there are no exercises accompanying Chapter 8. The vast majority of exercises are ordinary questions. Ordinary questions will be used in homeworks and tutorials; they cover the material content of the course. Warm-up questions are typically easier, often nothing more than revision of relevant material from first year courses. Challenge questions are typically harder and test ingenuity.

Contents

1. Univariate Distribution Theory
2. Standard Univariate Distributions
3. Transformations of Univariate Random Variables
4. Multivariate Distribution Theory
5. Transformations of Multivariate Distributions
6. Covariance Matrices and Multivariate Normal Distributions
7. Likelihood and Maximum Likelihood
1 Univariate Distribution Theory

Warm-up Questions

1.1 Let X be a random variable taking values in {1, 2, 3}, with P[X = 1] = P[X = 2] = 0.4. Find P[X = 3], and calculate both E[X] and Var(X).

1.2 Let Y be a random variable with probability density function (pdf) f(y) given by f(y) = y/2 for 0 ≤ y < 2, and f(y) = 0 otherwise. Find the probability that Y is between 1/2 and 1. Calculate E[Y] and Var(Y).

Ordinary Questions

1.3 Define F : R → [0, 1] by F(y) = 0 for y ≤ 0; F(y) = y^2 for y ∈ (0, 1); F(y) = 1 for y ≥ 1.
(a) Sketch the function F, and check that it is a distribution function.
(b) If Y is a random variable with distribution function F, calculate the pdf of Y.

1.4 Let X be a discrete random variable, taking values in {0, 1, 2}, where P[X = n] = 1/3 for n ∈ {0, 1, 2}. Sketch the distribution function F_X : R → R.

1.5 Define f : R → [0, ∞) by f(x) = 0 for x < 0, and f(x) = e^{-x} for x ≥ 0.
(a) Show that f is a probability density function.
(b) Find the corresponding distribution function and evaluate P[1 < X < 2].

1.6 Sketch graphs of each of the following two functions, and explain why each of them is not a distribution function.
(a) F(x) = 0 for x ≤ 0; F(x) = x for x > 0.
(b) F(x) = 0 for x < 0; F(x) = x + (1/4) sin(2πx) for 0 ≤ x < 1; F(x) = 1 for x ≥ 1.
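Question 1.1 can be double-checked numerically. The short Python sketch below (an added illustration, not part of the original sheet) computes E[X] and Var(X) directly from the probability mass function.

```python
# Question 1.1: X takes values 1, 2, 3 with P[X=1] = P[X=2] = 0.4,
# so P[X=3] = 1 - 0.4 - 0.4 = 0.2.
pmf = {1: 0.4, 2: 0.4, 3: 0.2}

mean = sum(x * p for x, p in pmf.items())        # E[X]
second = sum(x * x * p for x, p in pmf.items())  # E[X^2]
var = second - mean ** 2                         # Var(X) = E[X^2] - E[X]^2
```

This gives E[X] = 1.8 and Var(X) = 0.56, which is what the hand calculation should produce.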
1.7 Let k ∈ R and define f : R → R by f(x) = k(x − x^2) for x ∈ (0, 1), and f(x) = 0 otherwise. Find the value of k for which f is a probability density function, and calculate the probability that X is greater than 1/2.

1.8 The probability density function f(x) is given by f(x) = 1 + x for −1 ≤ x ≤ 0; f(x) = 1 − x for 0 < x < 1; and f(x) = 0 otherwise. Find the corresponding distribution function F(x) for all real x.

1.9 Let F(x) = e^x / (1 + e^x) for all real x.
(a) Show that F is a distribution function, and find the corresponding pdf f.
(b) Show that f(−x) = f(x).
(c) If X is a random variable with this distribution, evaluate P[|X| > 2].

1.10 Show that f(x) = 1/(π(1 + x^2)), defined for all x ∈ R, is a probability density function.

1.11 (a) For which values of r ∈ [0, ∞) is ∫_1^∞ x^{-r} dx finite?
(b) Show that f(x) = 1/x^2 if x > 1, and f(x) = 0 otherwise, is a probability density function.
(c) Show that the expectation of a random variable X, with the probability density function f given in (b), is not defined.
(d) Give an example of a random variable Y for which E[Y] < ∞ but E[Y^2] is not defined.

1.12 The discrete random variable X has the probability function P[X = x] = 1/(x(x + 1)) for x ∈ N.
(a) Use the partial fractions of 1/(x(x + 1)) to show that P[X ≤ x] = 1 − 1/(x + 1), for all x ∈ N.
(b) Write down the distribution function F(x) of X, for x ∈ R. Sketch its graph. What are the values of F(2) and F(3/2)?
(c) Evaluate P[10 ≤ X ≤ 20].
(d) Is E[X] defined? If so, what is E[X]? If not, why not?
Challenge Questions

1.13 Show that there is no random variable X, with range N, such that P[X = n] is constant for all n ∈ N.

1.14 Recall the meaning of inscribing a cube within a sphere: the cube sits inside the sphere, with each vertex of the cube positioned on the surface of the sphere. (It is not especially easy to illustrate this on a two-dimensional page; the figure accompanying this question is omitted here.) Suppose that ten percent of the surface of the sphere is coloured blue, and the rest of the surface is coloured red. Show that, regardless of which parts are coloured blue, it is always possible to inscribe a cube within the sphere in such a way that all vertices of the cube are red.

Hint: a cube has eight corners. Suppose that the position of the cube is sampled uniformly from the set of possible positions. What is the expected number of corners that are red?
2 Standard Univariate Distributions

Warm-up Questions

2.1 (a) A standard fair dice is rolled 5 times. Let X be the number of sixes rolled. Which distribution (and which parameters) would you use to model X?
(b) A fair coin is flipped until the first head is shown. Let X be the total number of flips, including the final flip on which the first head appears. Which distribution (and which parameter) would you use to model X?

Ordinary Questions

2.2 Let λ > 0. Write down the pdf f of the random variable X, where X ~ Exp(λ), and calculate its distribution function F. Hence, show that f(t)/(1 − F(t)) is constant for t > 0.

2.3 Let λ > 0 and let X be a random variable with Exp(λ) distribution. Let Z = ⌊X⌋, that is, let Z be X rounded down to the nearest integer. Show that Z is geometrically distributed with parameter p = e^{-λ}.

2.4 Let µ ∈ R. Let X_1 and X_2 be independent random variables with distributions N(µ, 1) and N(µ, 4), respectively. Let T_1, T_2 and T_3 be defined by T_1 = (X_1 + X_2)/2, T_2 = 2X_1 − X_2, T_3 = (4X_1 + X_2)/5. Find the mean and variance of T_1, T_2 and T_3. Which of T_1, T_2 and T_3 would you prefer to use as an estimator of µ?

2.5 Let X be a random variable with Ga(α, β) distribution.
(a) Let k ∈ N. Show that E[X^k] = α(α + 1)⋯(α + k − 1)/β^k. Hence, calculate µ = E[X] and σ^2 = Var(X) and verify that these formulas match the ones given in lectures.
(b) Show that E[((X − µ)/σ)^3] = 2/√α.

2.6 (a) Using R, you can obtain a plot of, for example, the pdf of a Ga(3, 2) random variable between 0 and 10 with the command

curve(dgamma(x,shape=3,scale=2),from=0,to=10)

Use R to investigate how the shape of the pdf of a Gamma distribution varies with the different parameter values. In particular, fix a value of β and see how the shape changes as you vary α.
(b) Investigate the effect that changing parameter values has on the shape of the pdf of the Beta distribution. To produce, for example, a plot of the pdf of Be(4, 5), use

curve(dbeta(x,shape1=4,shape2=5),from=-1,to=2)

2.7 Suggest which standard discrete distributions (or combinations of them) we should use to model the following situations.
(a) Organisms, independently, possess a given characteristic with probability p. A sample of k organisms with the characteristic is required. How many organisms will need to be tested to achieve this sample?
(b) In Texas Hold'em Poker, players make the best hand they can by combining two cards in their hand with five community cards that are placed face up on the table. At the start of the game, a player can only see their own hand. The community cards are then turned over, one by one. A player has two hearts in her hand. Three of the community cards have been turned over, and only one of them is a heart. How many hearts will appear in the remaining two community cards? Use a computer to find the probability of seeing k = 0, 1, 2 hearts.

2.8 Let X be a N(0, 1) random variable. Use integration by parts to show that E[X^{n+2}] = (n + 1)E[X^n] for any n = 0, 1, 2, .... Hence, show that E[X^n] = 0 if n is odd, and E[X^n] = (1)(3)(5)⋯(n − 1) if n is even.

2.9 Let X ~ N(µ, σ^2). Show that E[e^X] = e^{µ + σ^2/2}.

Challenge Questions

2.10 Let X be a random variable with a continuous distribution, and a strictly increasing distribution function F. Show that F(X) has a uniform distribution on (0, 1). Suggest how we might use this result to simulate samples from standard distributions.

2.11 Prove that Γ(1/2) = √π.
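Question 2.3 lends itself to a quick simulation check: rounding Exp(λ) samples down to integers should produce geometric frequencies. The Python sketch below is an added illustration (λ = 0.7 is an arbitrary choice), not part of the original sheet; it compares the empirical value of P[⌊X⌋ = 0] with the value 1 − e^{-λ} predicted by the geometric distribution.

```python
import math
import random

random.seed(1)
lam = 0.7
n = 100_000

# Z = floor(X) for X ~ Exp(lam); the question predicts
# P[Z = k] = (e^-lam)^k (1 - e^-lam), so P[Z = 0] = 1 - e^-lam.
zs = [math.floor(random.expovariate(lam)) for _ in range(n)]
emp_p0 = sum(1 for z in zs if z == 0) / n
theory_p0 = 1 - math.exp(-lam)
```

The two numbers agree to within Monte Carlo error (about 0.503 for λ = 0.7).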
3 Transformations of Univariate Random Variables

Warm-up Questions

3.1 Let g : R → R be given by g(x) = x^3 + 1.
(a) Sketch the graph of g, show that g is strictly increasing, and find its inverse function g^{-1}.
(b) Let R = [0, 2]. Find g(R).

Ordinary Questions

3.2 Let X be a random variable with pdf f_X(x) = 1/x^2 for x > 1, and f_X(x) = 0 otherwise. Define g(x) = e^x and let Y = g(X).
(a) Show that g(x) is strictly increasing. Find its inverse function g^{-1}(y), and dg^{-1}(y)/dy.
(b) Identify the set R_X on which f_X(x) > 0. Sketch g and show that g(R_X) = (e, ∞).
(c) Deduce from (a) and (b) that Y has pdf f_Y(y) = 1/(y(log y)^2) for y > e, and f_Y(y) = 0 otherwise.

3.3 Let X be a random variable with the uniform distribution on (0, 1), and let Y = −(log X)/λ, where λ > 0. Show that Y has an Exp(λ) distribution.

3.4 Let α, β > 0.
(a) Show that B(α, β) = B(β, α).
(b) Let X be a random variable with the Be(α, β) distribution. Show that Y = 1 − X has the Be(β, α) distribution.

3.5 Let α > 0.
(a) Show that B(α, 1) = 1/α.
(b) Let X have the Be(α, 1) distribution. Let Y = X^{1/r} for some positive integer r. Show that Y also has a Beta distribution, and find its parameters.

3.6 Let X have a uniform distribution on [−1, 1]. Find the pdf of |X| and identify the distribution of |X|.

3.7 Let α, β > 0 and let X ~ Be(α, β). Let c > 0 and set Y = c/X. Find the pdf of Y.
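Question 3.3 is the simulation idea of Question 2.10 in action: applying the inverse of the Exp(λ) distribution function to a uniform sample yields an Exp(λ) sample. A Python sketch (added illustration, not part of the sheet):

```python
import math
import random

random.seed(2)
lam = 2.0
n = 200_000

# For Exp(lam), F(x) = 1 - e^{-lam x}, so F^{-1}(u) = -log(1 - u)/lam.
# Using 1 - random.random() keeps the argument of log strictly positive.
xs = [-math.log(1.0 - random.random()) / lam for _ in range(n)]
sample_mean = sum(xs) / n  # should be close to E[X] = 1/lam = 0.5
```

Since 1 − U is also uniform on (0, 1), this matches the transformation Y = −(log X)/λ of Question 3.3.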
3.8 Let Θ be an angle chosen according to a uniform distribution on (−π/2, π/2), and let X = tan Θ. Show that X has the Cauchy distribution.

3.9 Let X be a random variable with the pdf f(x) = 1 + x for −1 < x < 0; f(x) = 1 − x for 0 < x < 1; and f(x) = 0 otherwise. Find the probability density functions of
(a) Y = 5X + 3
(b) Z = |X|

3.10 Let X have the uniform distribution on [a, b].
(a) For [a, b] = [−1, 1], find the pdf of Y = X^2.
(b) For [a, b] = [−1, 2], find the pdf of Y = X^2.

3.11 Let X have a uniform distribution on [−1, 1] and define g : R → R by g(x) = 0 for x ≤ 0, and g(x) = x^2 for x > 0. Find the distribution function of g(X).

3.12 Let X be a random variable with the Cauchy distribution. Show that 1/X also has the Cauchy distribution.

Challenge Questions

3.13 If we were to pretend that g(x) = 1/x was strictly monotone, we could (incorrectly) apply Lemma 3.1 and use the formula f_Y(y) = f_X(g^{-1}(y)) |dg^{-1}/dy| to solve 3.12. We would still arrive at the correct answer. Can you explain why? Can you construct another example of a case in which the relationship f_Y(y) = f_X(g^{-1}(y)) |dg^{-1}/dy| holds, but where the function g is not monotone?

3.14 Let Y and α, β be as in Question 3.7.
(a) If α > 1, show that E[Y] = c(α + β − 1)/(α − 1).
(b) If α ≤ 1, show that E[Y] is not defined.
4 Multivariate Distribution Theory

Warm-up questions

4.1 Let T = {(x, y) : 0 < x < y}. Define f : R^2 → R by f(x, y) = e^{-2x-y} for (x, y) ∈ T, and f(x, y) = 0 otherwise. Sketch the region T. Calculate ∫∫ f(x, y) dx dy in both orders of integration, and verify that the two iterated integrals are equal.

4.2 Sketch the following regions of R^2.
(a) S = {(x, y) : x ∈ [0, 1], y ∈ [0, 1]}
(b) T = {(x, y) : 0 < x < y}
(c) U = {(x, y) : x ∈ [0, 1], y ∈ [0, 1], 2y > x}

Ordinary Questions

4.3 Let (X, Y) be a random vector with joint probability density function f_{X,Y}(x, y) = k e^{-(x+y)} if 0 < y < x, and f_{X,Y}(x, y) = 0 otherwise.
(a) Using that P[(X, Y) ∈ R^2] = 1, find the value of k.
(b) For each of the regions S, T, U in 4.2, calculate the probability that (X, Y) is inside the given region.
(c) Find the marginal pdf of Y, and hence identify the distribution of Y.

4.4 Let S = [0, 1] × [0, 1], and let U and V have joint probability density function f_{U,V}(u, v) = (4u + 2v)/3 for (u, v) ∈ S, and f_{U,V}(u, v) = 0 otherwise.
(a) Find P[U + V ≤ 1].
(b) Find P[V ≤ U^2].

4.5 For the random variables U and V in Exercise 4.4:
(a) Find the marginal pdf f_U(u) of U.
(b) Find the marginal pdf f_V(v) of V.
(c) For v such that f_V(v) > 0, find the conditional pdf f_{U|V=v}(u) of U given V = v.
(d) Check that each of f_U, f_V and f_{U|V=v} integrates over R to 1.
(e) Calculate the two forms of conditional expectation, E[U | V = v] and E[U | V].
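Before doing the integral in Question 4.4(a) by hand, it can be sanity-checked numerically. The midpoint-rule sketch below (an added illustration, not part of the sheet) sums f(u, v) = (4u + 2v)/3 over the part of the unit square with u + v ≤ 1; the exact answer works out to 1/3.

```python
# Midpoint-rule approximation of P[U + V <= 1] for f(u, v) = (4u + 2v)/3
# on the unit square [0,1] x [0,1].
m = 400          # subdivisions per axis
h = 1.0 / m
total = 0.0
for i in range(m):
    for j in range(m):
        u = (i + 0.5) * h  # midpoint of cell (i, j)
        v = (j + 0.5) * h
        if u + v <= 1:
            total += (4 * u + 2 * v) / 3 * h * h
```

The running total converges to 1/3 as m grows, which is the value the hand calculation should give.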
4.6 Let (X, Y) be a random vector with joint probability density function f_{X,Y}(x, y) = (y − x)/2 for x ∈ [−1, 0], y ∈ [0, 1]; f_{X,Y}(x, y) = (x + y)/2 for x ∈ [0, 1], y ∈ [0, 1]; and f_{X,Y}(x, y) = 0 otherwise. Find the marginal pdf of X. Show that the correlation coefficient of X and Y is zero. Show also that X and Y are not independent.

4.7 Let X be a random variable. Let Z be a random variable, independent of X, such that P[Z = 1] = P[Z = −1] = 1/2. Let Y = XZ.
(a) Show that X and Y are uncorrelated.
(b) Give an example in which X and Y are not independent.

4.8 Let X and Y be independent random variables, with 0 < Var(X) = Var(Y) < ∞. Let U = X + Y and V = XY. Show that U and V are uncorrelated if and only if E[X] + E[Y] = 0.

4.9 Let λ > 0. Let X have an Exp(λ) distribution, and conditionally given X = x let U have a uniform distribution on [0, x]. Calculate E[U] and Var(U).

4.10 Let k ∈ R and let (X, Y) have joint probability density function f_{X,Y}(x, y) = kx sin(xy) for x ∈ (0, 1), y ∈ (0, π), and f_{X,Y}(x, y) = 0 otherwise.
(a) Find the value of k.
(b) For x ∈ (0, 1), find the conditional probability density function of Y given X = x.
(c) Find E[Y | X].

4.11 Let U have a uniform distribution on (0, 1), and conditionally given U = u let X have a uniform distribution on (0, u).
(a) Find the joint pdf of (X, U) and the marginal pdf of X.
(b) Show that E[U | X = x] = (x − 1)/log x.

4.12 Let (X, Y) have a bivariate distribution with joint pdf f_{X,Y}(x, y). Let y_0 ∈ R be such that f_Y(y_0) > 0. Show that f_{X|Y=y_0}(x) is a probability density function.

Challenge Questions

4.13 Give an example of random variables (X, Y, Z) such that P[X < Y] = P[Y < Z] = P[Z < X] = 2/3.
5 Transformations of Multivariate Distributions

Warm-up Questions

5.1 (a) Define u = x and v = 2y. Sketch the images of the regions S, T and U from question 4.2 in the (u, v) plane.
(b) Define u = x + y and v = x − y. Sketch the images of the regions S, T and U from question 4.2 in the (u, v) plane.

Ordinary Questions

5.2 The random variables X and Y have joint pdf given by f_{X,Y}(x, y) = (x/2) e^{-y} if x ∈ (0, 2), y ∈ (0, ∞), and f_{X,Y}(x, y) = 0 otherwise. Define u = u(x, y) = x + y and v = v(x, y) = 2y. Let U = u(X, Y) and V = v(X, Y).
(a) Find the inverse transformation x = x(u, v) and y = y(u, v), and calculate the value of J = det( ∂x/∂u  ∂x/∂v ; ∂y/∂u  ∂y/∂v ).
(b) Sketch the set of (x, y) for which f_{X,Y}(x, y) is non-zero. Find the image of this set in the (u, v) plane.
(c) Deduce from (a) and (b) that the joint pdf of (U, V) is f_{U,V}(u, v) = (1/4)(u − v/2) e^{-v/2} for v > 0, u ∈ (v/2, v/2 + 2), and f_{U,V}(u, v) = 0 otherwise.

5.3 The random variables X and Y have joint pdf given by f_{X,Y}(x, y) = (1/2)(x + y) e^{-(x+y)} for x, y ≥ 0, and f_{X,Y}(x, y) = 0 elsewhere. Let U = X + Y and W = X.
(a) Find the joint pdf of (U, W) and the marginal pdf of U.
(b) Recognise U as a standard distribution and, using the result of Question 2.5(a), evaluate E[(X + Y)^5].

5.4 Let X and Y be a pair of independent random variables, both with the standard normal distribution. Show that the joint pdf of (U, V), where U = X^2 and V = X^2 + Y^2, is given by f_{U,V}(u, v) = (1/(2π)) e^{-v/2} u^{-1/2} (v − u)^{-1/2} for 0 < u < v, and f_{U,V}(u, v) = 0 otherwise.
5.5 Let (X, Y) be a random vector with joint pdf f_{X,Y}(x, y) = 2e^{-(x+y)} for x > y > 0, and f_{X,Y}(x, y) = 0 otherwise.
(a) If U = X − Y and V = Y/2, find the joint pdf of (U, V).
(b) Show that U and V are independent, and recognise their (marginal) distributions as standard distributions.

5.6 Let X and Y be a pair of independent and identically distributed random variables. Let U = X + Y and V = X − Y.
(a) Show that Cov(U, V) = 0, and give an example to show that U and V are not necessarily independent.
(b) Show that U and V are independent in the special case where X and Y are standard normals.

5.7 Let X and Y be independent random variables with distributions Ga(α_1, β) and Ga(α_2, β) respectively. Show that the random variables U = X/(X + Y) and V = X + Y are independent, with distributions Be(α_1, α_2) and Ga(α_1 + α_2, β) respectively.

5.8 As part of Question 5.7, we showed that if X and Y are independent random variables with X ~ Ga(α_1, β) and Y ~ Ga(α_2, β), then X + Y ~ Ga(α_1 + α_2, β).
(a) Use induction to show that for n ≥ 2, if X_1, X_2, ..., X_n are independent random variables with X_i ~ Ga(α_i, β), then sum_{i=1}^n X_i ~ Ga(sum_{i=1}^n α_i, β).
(b) Hence show that for n ≥ 1, if Z_1, Z_2, ..., Z_n are independent standard normal random variables, then sum_{i=1}^n Z_i^2 ~ χ^2_n. You may use the result of Example 2, which showed that this was true in the case n = 1 (and recall that the χ^2 distribution is a special case of the Gamma distribution).

5.9 Let X and Y be a pair of independent random variables, both with the standard normal distribution. Show that U = X/Y has the Cauchy distribution, with pdf f_U(u) = 1/(π(1 + u^2)) for all u ∈ R.
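Question 5.9 can be explored by simulation. Since the Cauchy pdf is 1/(π(1 + u^2)), we have P[|U| ≤ 1] = (2/π) arctan(1) = 1/2, so about half of the simulated ratios X/Y should land in [−1, 1]. A Python sketch (added illustration, not part of the sheet):

```python
import random

random.seed(5)
n = 200_000
inside = 0
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    y = random.gauss(0.0, 1.0)
    if abs(x / y) <= 1.0:  # y exactly 0.0 has negligible probability
        inside += 1
frac = inside / n  # should be close to 0.5
```

Note that a sample mean of the ratios would not settle down: the Cauchy distribution has no expectation, which ties back to Question 1.11.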
5.10 Let n ∈ N. The t distribution (often known as Student's t distribution) is the distribution of the univariate random variable X with pdf

f_X(x) = Γ((n + 1)/2) / (√(nπ) Γ(n/2)) × (1 + x^2/n)^{-(n+1)/2}, for all x ∈ R.

Here, n is a parameter, known as the number of degrees of freedom. Let Z be a standard normal random variable and let W be a chi-squared random variable with n degrees of freedom, where Z and W are independent. Show that X = Z/√(W/n) has the t distribution with n degrees of freedom.

Challenge Questions

5.11 The formula (5.2), from the typed lecture notes, holds whenever the transformation u = u(x, y), v = v(x, y) is both one-to-one and onto, and for which the Jacobian matrix of derivatives exists. Use this fact to provide an alternative proof of Lemma 3.1 (i.e. of the univariate transformation formula).

5.12 Let X ~ Exp(λ_1) and Y ~ Exp(λ_2) be independent. Show that U = min(X, Y) has distribution Exp(λ_1 + λ_2), and that P[min(X, Y) = X] = λ_1/(λ_1 + λ_2). Extend this result, by induction, to handle the minimum of finitely many exponential random variables. Let W = max(X, Y). Show that U and W − U are independent.
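The two claims of Question 5.12 are also easy to check by simulation. With λ_1 = 1 and λ_2 = 3 (arbitrary choices for this added illustration, not part of the sheet), min(X, Y) should have mean 1/(λ_1 + λ_2) = 0.25, and X should achieve the minimum with probability λ_1/(λ_1 + λ_2) = 0.25.

```python
import random

random.seed(7)
l1, l2 = 1.0, 3.0
n = 200_000
total_min = 0.0
x_wins = 0
for _ in range(n):
    x = random.expovariate(l1)
    y = random.expovariate(l2)
    total_min += min(x, y)
    if x < y:
        x_wins += 1
mean_min = total_min / n  # ~ 1/(l1 + l2)
p_x_min = x_wins / n      # ~ l1/(l1 + l2)
```

Both empirical quantities come out near 0.25, as predicted.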
6 Covariance Matrices and Multivariate Normal Distributions

Warm-up Questions

6.1 Let A = [ 1 2 ; 1 1 ] and Σ = [ 4 1 ; 1 9 ].
(a) Calculate det(A) and find both AΣ and AΣA^T.
(b) Let A be the vector (2, −1). Show that AΣA^T = 21.

Ordinary Questions

6.2 Let X = (X, Y)^T be a random vector with E[X] = (0, 1)^T and Cov(X) = [ 2 1 ; 1 1 ]. Let U = X + Y and V = 2X − 2Y + 1, and set U = (U, V)^T.
(a) Write down a square matrix A and a vector b ∈ R^2 such that U = AX + b.
(b) Show that the mean vector and covariance matrix of U are given by E[U] = (1, −1)^T and Cov(U) = [ 5 2 ; 2 4 ].
(c) Show that the correlation coefficient ρ of U and V is equal to 1/√5.

6.3 Let X and Y be independent standard normal random variables.
(a) Write down the covariance matrix of the random vector X = (X, Y)^T.
(b) Let R be the rotation matrix [ cos θ  −sin θ ; sin θ  cos θ ]. Show that RX has the same covariance matrix as X.

6.4 Three (univariate) random variables X, Y and Z have means 3, 4 and 6 respectively, and variances 1, 1 and 25 respectively. Further, X and Y are uncorrelated; the correlation coefficient between X and Z is 1/5, and that between Y and Z is 1/5. Let U = X + Y − Z and W = 2X + Z − 4, and set U = (U, W)^T.
(a) Find the mean vector and covariance matrix of X = (X, Y, Z)^T.
(b) Write down a matrix A and a vector b such that U = AX + b.
(c) Find the mean vector and covariance matrix of U.
(d) Evaluate E[(2X + Z − 6)^2].

6.5 Suppose that the random vector X = (X_1, X_2)^T follows the bivariate normal distribution with E[X_1] = E[X_2] = 0, Var(X_1) = 1, Cov(X_1, X_2) = 2 and Var(X_2) = 5.
(a) Calculate the correlation coefficient of X_1 and X_2. Are X_1 and X_2 independent?
(b) Find the mean and the covariance matrices of Y = (1 2)X and Z = (2 −1)X. What are the distributions of Y and Z?

6.6 Let X_1 and X_2 be bivariate normally distributed random variables, each with mean 0 and variance 1, and with correlation coefficient ρ.
(a) By integrating out the variable x_2 in the joint pdf, verify that the marginal distribution of X_1 is indeed that of a standard univariate normal random variable. Hint: use the fact that the integral of a N(µ, σ^2) pdf is equal to 1.
(b) Show, using the usual formula for the conditional pdf, that the conditional pdf of X_2 given X_1 = x_1 is N(ρx_1, 1 − ρ^2).

6.7 Let X = (X_1, X_2)^T have a N_2(µ, Σ) distribution with mean vector µ = (µ_1, µ_2)^T and covariance matrix Σ = [ σ_1^2  σ_12 ; σ_12  σ_2^2 ]. Let A = [ 1  0 ; −σ_12/σ_1^2  1 ]. Find the distribution of Y = AX, and deduce that any bivariate normal random vector can be transformed by a linear transformation into a vector of independent normal random variables.

6.8 The random vector X = (X_1, X_2, X_3)^T has an N_3(µ, Σ) distribution, where µ = (1, 1, 2)^T and Σ is a given 3 × 3 covariance matrix.
(a) Find the correlation coefficients between X_1 and X_2, between X_1 and X_3, and between X_2 and X_3.
(b) Let Y_1 = X_1 + X_3 and Y_2 = X_2 − X_1. Find the distribution of Y = (Y_1, Y_2)^T, and hence find the correlation coefficient between Y_1 and Y_2.

6.9 (a) Let X be a (univariate) standard normal random variable and let Y = |X|. Does the random vector X = (X, Y)^T have a bivariate normal distribution?
(b) Let X and Y be two independent (univariate) standard normal random variables, and let Z be a random variable, independent of X and Y, such that P[Z = 1] = P[Z = −1] = 1/2. Are XZ and YZ independent, and does the random vector X = (XZ, YZ)^T have a bivariate normal distribution?
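Most of the calculations in this chapter rest on the identity Cov(AX + b) = A Cov(X) A^T. The Python sketch below (an added illustration; the matrices are made-up examples, not those from the questions above) computes A Σ A^T directly.

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

A = [[1, 1], [2, -2]]         # linear part of U = AX + b (example values)
Sigma = [[1, 0.5], [0.5, 2]]  # example covariance matrix for X
cov_u = mat_mul(mat_mul(A, Sigma), transpose(A))  # covariance matrix of U
```

For these example values, cov_u works out to [[4.0, −2.0], [−2.0, 8.0]]; note that it is symmetric, as every covariance matrix must be, and that the constant vector b plays no role.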
Challenge Questions

6.10 Recall that an orthogonal matrix R is one for which R^{-1} = R^T, and recall that if an n × n matrix R is orthogonal, x is an n-dimensional vector, and y = Rx, then sum_{i=1}^n y_i^2 = y·y = x·x = sum_{i=1}^n x_i^2. Let X = (X_1, X_2, ..., X_n) be a vector of independent normal random variables with common mean 0 and variance σ^2. Let R be an orthogonal matrix and let Y = RX.
(a) Show that Y is also a vector of independent normal random variables, with common mean 0 and variance σ^2.
(b) Suppose that all the elements in the first row of R are equal to 1/√n (you may assume that an orthogonal matrix exists with this property). Show that Y_1 = √n X̄, where X̄ is the sample mean of X, and that sum_{i=2}^n Y_i^2 = sum_{i=1}^n X_i^2 − n X̄^2.
(c) Hence, use Question 5.8 to deduce that, if s^2 = (1/(n − 1))(sum_{i=1}^n X_i^2 − n X̄^2) is the sample variance of X, then (n − 1)s^2/σ^2 ~ χ^2_{n−1}, and that it is independent of X̄.
(d) Let µ ∈ R. Deduce that the result of part (c) also holds if the X_i have mean µ (so that they are iid N(µ, σ^2) random variables).

6.11 Let Z_1, Z_2, ... be independent identically distributed normal random variables with mean µ and variance σ^2. We regard n samples of these as a set of data. Write Z̄ and s^2 respectively for the sample mean and variance. Combine 6.10(d) and 5.10 to show that the statistic X = √n (Z̄ − µ)/s has the t distribution with n − 1 degrees of freedom.
7 Likelihood and Maximum Likelihood

Warm-up Questions

7.1 Define f : R → R by f(θ) = e^{−θ^2 + 4θ}.
(a) Find the first derivative of f, and hence identify its turning point(s).
(b) Calculate the value of the second derivative of f at these turning point(s). Hence, deduce whether the turning point(s) are local maxima or local minima.

7.2 Let (a_i)_{i=1}^n be a sequence with a_i ∈ (0, ∞). Show that log(prod_{i=1}^n a_i) = sum_{i=1}^n log a_i.

Ordinary Questions

7.3 A sample of 3 is obtained from a geometric distribution X with unknown parameter θ. That is, P[X = x] = θ^x (1 − θ) for x ∈ {0, 1, 2, ...}.
(a) Given this sample, find the likelihood function L(θ; 3), and state its domain Θ.
(b) Find the maximum likelihood estimate of θ.

7.4 A sample of 4 is obtained from a Bi(n, θ) distribution.
(a) Assuming that n is known to be 9 and that θ is unknown, write down the likelihood function of θ, for θ ∈ [0, 1], given the (single) data point 4. Use a software package of your choice to plot a graph of it.
(b) Find the maximum likelihood estimate of θ, given this data point.

7.5 A sample (x_1, x_2, x_3) of three observations from a Poisson distribution with parameter λ, where λ is known to be in Λ = {1, 2, 3}, gives the values x_1 = 4, x_2 = 0, x_3 = 3. Find the likelihood of each of the possible values of λ, and hence find the maximum likelihood estimate.

7.6 Given the set of iid samples x = (x_1, x_2, ..., x_n), for some n > 0, write down the likelihood function L(θ; x) in each of the following cases. In each case you should give both the function, in simplified form where possible, and the parameter set Θ. The data are iid samples from:
(a) the exponential distribution Exp(λ) with λ = 1/θ.
(b) the binomial Bi(m, θ) distribution, where m is known.
(c) the normal N(µ, θ), where µ is known.
(d) the gamma distribution Ga(θ, 4).
(e) the beta distribution Be(θ, θ).

7.7 For each of 7.6(a)-(e), find the log likelihood, and simplify it as much as you can.
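For Question 7.3, the single observation x = 3 gives likelihood L(θ) = θ^3 (1 − θ) on Θ = (0, 1), and a crude grid search locates its maximum numerically, which is a useful cross-check on the calculus. A Python sketch (added illustration, not part of the sheet):

```python
# Grid search for the maximiser of L(theta) = theta^3 (1 - theta) on (0, 1).
def likelihood(theta):
    return theta ** 3 * (1 - theta)

grid = [i / 10_000 for i in range(1, 10_000)]
theta_hat = max(grid, key=likelihood)  # should sit at 3/4
```

Differentiating the log likelihood, 3 log θ + log(1 − θ), gives 3/θ − 1/(1 − θ) = 0, i.e. θ̂ = 3/4, in agreement with the grid search.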
7.8 The IGa(α, β) distribution is the distribution of 1/U if U ~ Ga(α, β). Repeat 7.6 in the case where the data are iid samples from the inverted gamma distribution IGa(1, θ). Find and simplify the corresponding log likelihood.

7.9 A set of iid samples x = (x_1, x_2, ..., x_n)^T is taken from a geometric distribution with unknown parameter θ, so that the probability function of X_i is p(x_i) = P[X_i = x_i] = (1 − θ)^{x_i} θ, for x_i = 0, 1, 2, ... and 0 < θ < 1. Find the maximum likelihood estimate of θ based on the above sample.

7.10 Find the maximum likelihood estimate of θ when the data x = (x_1, x_2, ..., x_n) are iid samples from the binomial Bi(m, θ) distribution, where m is known, as in question 7.6/7.7(b).

7.11 Find the maximum likelihood estimate of θ when the data x = (x_1, x_2, ..., x_n) are iid samples from the normal N(µ, θ), where µ is known, as in question 7.6/7.7(c). Show that this estimator is unbiased (i.e. that E[θ̂] = σ^2).

7.12 Find the maximum likelihood estimate of θ ∈ (0, ∞) when the data x = (x_1, x_2, ..., x_n) are iid samples from the gamma distribution Ga(3, θ). If x̄ = (1/n) sum_{i=1}^n x_i = 3, calculate the maximum likelihood estimate θ̂ and show that it does not depend on the sample size n.

7.13 Suppose that a set of iid samples x = (x_1, ..., x_n) is taken from a negative binomial distribution, so that each X_i has probability function p(x_i) = P(X_i = x_i) = C(x_i + r − 1, x_i) θ^r (1 − θ)^{x_i}, for some success probability θ satisfying 0 ≤ θ ≤ 1, where x_i = 0, 1, 2, ... and r is the total number of successes. If r is known, find the maximum likelihood estimate of θ.

7.14 As in question 7.4, an observation from a Bi(n, θ) gives the value x = 4.
(a) If θ is known to be 3/4 but n is unknown, with possible range n ∈ {4, 5, 6, ...}, write down a formula for the likelihood function of n and calculate its values for n = 4, 5, 6 and 7.
(b) Find the maximum likelihood estimator of n.

7.15 Suppose we have data x = (x_1, ..., x_n), which are iid samples from a N(µ, σ^2) distribution, where µ is unknown and σ^2 is known.
(a) Find the log-likelihood function of µ.
(b) Show that the maximum likelihood estimator of µ is µ̂ = (1/n) sum_{i=1}^n x_i.
(c) Let k ∈ (0, 1). Find the k-likelihood region for µ.
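Question 7.5 can likewise be answered numerically: evaluate the Poisson likelihood of the data 4, 0, 3 at each candidate λ ∈ {1, 2, 3} and take the largest. A Python sketch (added illustration, not part of the sheet):

```python
import math

data = [4, 0, 3]

def likelihood(lam):
    # Product of Poisson probabilities e^{-lam} lam^x / x! over the sample.
    return math.prod(math.exp(-lam) * lam ** x / math.factorial(x) for x in data)

values = {lam: likelihood(lam) for lam in (1, 2, 3)}
lam_hat = max(values, key=values.get)  # the maximum likelihood estimate
```

Since the likelihood is proportional to e^{-3λ} λ^7 here, λ̂ = 2 comes out on top.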
7.16 As in question 7.4(a), an observation from a Bi(n, θ) distribution with n = 9 gives the value x = 4.
(a) Find the range of values for which the log likelihood is within 2 of its maximum value. [You may do this either by inspection of a plot, or by using a computer package to solve the inequality numerically.]
(b) A traditional approximate 95% confidence interval here would be of the form θ̂ ± 1.96 √(θ̂(1 − θ̂)/n), where θ̂ = x/n. Compare your answer to (a) to what this would give.
(c) Repeat this analysis with n = 90 and x = 40, and comment on your results.

7.17 A set of iid samples x = (x_1, x_2, ..., x_n)^T is taken from an inverse Gaussian distribution (known also as the Wald distribution), with pdf f(x) = √(θ/(2πx^3)) exp(−θ(x − µ)^2/(2µ^2 x)) when x > 0, and f(x) = 0 for x ≤ 0, with parameters µ, θ > 0.
(a) Assuming that µ is known, find the maximum likelihood estimate of θ based on the above sample.
(b) If both µ and θ are unknown, find the maximum likelihood estimate of (µ, θ) based on the sample given.

Challenge Questions

7.18 The Pareto distribution has parameters α > 0 and β > 0, and pdf f(x; θ) = αβ^α / x^{α+1} when x ≥ β, and f(x; θ) = 0 when x < β. A set of n iid samples x = (x_1, x_2, ..., x_n)^T is taken from a Pareto distribution, where the parameters θ = (α, β) are both unknown. Find the maximum likelihood estimate of θ based on the above sample.
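For Question 7.18, the likelihood increases in β up to the smallest observation, giving β̂ = min x_i, and with β̂ fixed the usual calculation gives α̂ = n / sum log(x_i/β̂). The Python sketch below checks these standard closed forms on a made-up sample (an added illustration, not part of the sheet).

```python
import math

x = [2.0, 4.0, 8.0]  # made-up sample values
n = len(x)

beta_hat = min(x)  # the density is zero unless beta <= every observation
alpha_hat = n / sum(math.log(xi / beta_hat) for xi in x)
```

Here β̂ = 2 and α̂ = 3/(log 2 + log 4) = 1/log 2 ≈ 1.44.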
More informationEXAMINATIONS OF THE HONG KONG STATISTICAL SOCIETY GRADUATE DIPLOMA, Statistical Theory and Methods I. Time Allowed: Three Hours
EXAMINATIONS OF THE HONG KONG STATISTICAL SOCIETY GRADUATE DIPLOMA, 008 Statistical Theory and Methods I Time Allowed: Three Hours Candidates should answer FIVE questions. All questions carry equal marks.
More informationChapter 5. Chapter 5 sections
1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions
More informationSTA2603/205/1/2014 /2014. ry II. Tutorial letter 205/1/
STA263/25//24 Tutorial letter 25// /24 Distribution Theor ry II STA263 Semester Department of Statistics CONTENTS: Examination preparation tutorial letterr Solutions to Assignment 6 2 Dear Student, This
More informationE X A M. Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours. Number of pages incl.
E X A M Course code: Course name: Number of pages incl. front page: 6 MA430-G Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours Resources allowed: Notes: Pocket calculator,
More informationJoint Distributions. (a) Scalar multiplication: k = c d. (b) Product of two matrices: c d. (c) The transpose of a matrix:
Joint Distributions Joint Distributions A bivariate normal distribution generalizes the concept of normal distribution to bivariate random variables It requires a matrix formulation of quadratic forms,
More information1 Review of Probability and Distributions
Random variables. A numerically valued function X of an outcome ω from a sample space Ω X : Ω R : ω X(ω) is called a random variable (r.v.), and usually determined by an experiment. We conventionally denote
More information1 Introduction. P (n = 1 red ball drawn) =
Introduction Exercises and outline solutions. Y has a pack of 4 cards (Ace and Queen of clubs, Ace and Queen of Hearts) from which he deals a random of selection 2 to player X. What is the probability
More informationSTAT 414: Introduction to Probability Theory
STAT 414: Introduction to Probability Theory Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical Exercises
More informationconditional cdf, conditional pdf, total probability theorem?
6 Multiple Random Variables 6.0 INTRODUCTION scalar vs. random variable cdf, pdf transformation of a random variable conditional cdf, conditional pdf, total probability theorem expectation of a random
More informationMath 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14
Math 325 Intro. Probability & Statistics Summer Homework 5: Due 7/3/. Let X and Y be continuous random variables with joint/marginal p.d.f. s f(x, y) 2, x y, f (x) 2( x), x, f 2 (y) 2y, y. Find the conditional
More information, find P(X = 2 or 3) et) 5. )px (1 p) n x x = 0, 1, 2,..., n. 0 elsewhere = 40
Assignment 4 Fall 07. Exercise 3.. on Page 46: If the mgf of a rom variable X is ( 3 + 3 et) 5, find P(X or 3). Since the M(t) of X is ( 3 + 3 et) 5, X has a binomial distribution with n 5, p 3. The probability
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationChapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued
Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional
More information4. Distributions of Functions of Random Variables
4. Distributions of Functions of Random Variables Setup: Consider as given the joint distribution of X 1,..., X n (i.e. consider as given f X1,...,X n and F X1,...,X n ) Consider k functions g 1 : R n
More informationPerhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.
Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage
More informationAlgorithms for Uncertainty Quantification
Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example
More information01 Probability Theory and Statistics Review
NAVARCH/EECS 568, ROB 530 - Winter 2018 01 Probability Theory and Statistics Review Maani Ghaffari January 08, 2018 Last Time: Bayes Filters Given: Stream of observations z 1:t and action data u 1:t Sensor/measurement
More informationMaster s Written Examination - Solution
Master s Written Examination - Solution Spring 204 Problem Stat 40 Suppose X and X 2 have the joint pdf f X,X 2 (x, x 2 ) = 2e (x +x 2 ), 0 < x < x 2
More information2 Random Variable Generation
2 Random Variable Generation Most Monte Carlo computations require, as a starting point, a sequence of i.i.d. random variables with given marginal distribution. We describe here some of the basic methods
More informationStatistics Ph.D. Qualifying Exam: Part I October 18, 2003
Statistics Ph.D. Qualifying Exam: Part I October 18, 2003 Student Name: 1. Answer 8 out of 12 problems. Mark the problems you selected in the following table. 1 2 3 4 5 6 7 8 9 10 11 12 2. Write your answer
More informationPart IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Theorems Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationMA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems
MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions
More informationMATH c UNIVERSITY OF LEEDS Examination for the Module MATH2715 (January 2015) STATISTICAL METHODS. Time allowed: 2 hours
MATH2750 This question paper consists of 8 printed pages, each of which is identified by the reference MATH275. All calculators must carry an approval sticker issued by the School of Mathematics. c UNIVERSITY
More information1 Presessional Probability
1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional
More information3. Probability and Statistics
FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important
More informationChapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued
Chapter 3 sections Chapter 3 - continued 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions
More information2. Variance and Covariance: We will now derive some classic properties of variance and covariance. Assume real-valued random variables X and Y.
CS450 Final Review Problems Fall 08 Solutions or worked answers provided Problems -6 are based on the midterm review Identical problems are marked recap] Please consult previous recitations and textbook
More informationSTAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS
STAT/MA 4 Answers Homework November 5, 27 Solutions by Mark Daniel Ward PROBLEMS Chapter Problems 2a. The mass p, corresponds to neither of the first two balls being white, so p, 8 7 4/39. The mass p,
More informationUniversity of Regina. Lecture Notes. Michael Kozdron
University of Regina Statistics 252 Mathematical Statistics Lecture Notes Winter 2005 Michael Kozdron kozdron@math.uregina.ca www.math.uregina.ca/ kozdron Contents 1 The Basic Idea of Statistics: Estimating
More informationProbability Theory and Statistics. Peter Jochumzen
Probability Theory and Statistics Peter Jochumzen April 18, 2016 Contents 1 Probability Theory And Statistics 3 1.1 Experiment, Outcome and Event................................ 3 1.2 Probability............................................
More informationEXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY
EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY GRADUATE DIPLOMA, 2016 MODULE 1 : Probability distributions Time allowed: Three hours Candidates should answer FIVE questions. All questions carry equal marks.
More informationExpectation. DS GA 1002 Statistical and Mathematical Models. Carlos Fernandez-Granda
Expectation DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall16 Carlos Fernandez-Granda Aim Describe random variables with a few numbers: mean, variance,
More informationStat 426 : Homework 1.
Stat 426 : Homework 1. Moulinath Banerjee University of Michigan Announcement: The homework carries 120 points and contributes 10 points to the total grade. (1) A geometric random variable W takes values
More informationExpectation. DS GA 1002 Probability and Statistics for Data Science. Carlos Fernandez-Granda
Expectation DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Aim Describe random variables with a few numbers: mean,
More information3 Conditional Expectation
3 Conditional Expectation 3.1 The Discrete case Recall that for any two events E and F, the conditional probability of E given F is defined, whenever P (F ) > 0, by P (E F ) P (E)P (F ). P (F ) Example.
More informationSolutions to Homework Set #6 (Prepared by Lele Wang)
Solutions to Homework Set #6 (Prepared by Lele Wang) Gaussian random vector Given a Gaussian random vector X N (µ, Σ), where µ ( 5 ) T and 0 Σ 4 0 0 0 9 (a) Find the pdfs of i X, ii X + X 3, iii X + X
More informationStatistics 1B. Statistics 1B 1 (1 1)
0. Statistics 1B Statistics 1B 1 (1 1) 0. Lecture 1. Introduction and probability review Lecture 1. Introduction and probability review 2 (1 1) 1. Introduction and probability review 1.1. What is Statistics?
More informationMultivariate distributions
CHAPTER Multivariate distributions.. Introduction We want to discuss collections of random variables (X, X,..., X n ), which are known as random vectors. In the discrete case, we can define the density
More information4. CONTINUOUS RANDOM VARIABLES
IA Probability Lent Term 4 CONTINUOUS RANDOM VARIABLES 4 Introduction Up to now we have restricted consideration to sample spaces Ω which are finite, or countable; we will now relax that assumption We
More informationContinuous Random Variables
Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability. Often, there is interest in random variables
More informationMultivariate Analysis and Likelihood Inference
Multivariate Analysis and Likelihood Inference Outline 1 Joint Distribution of Random Variables 2 Principal Component Analysis (PCA) 3 Multivariate Normal Distribution 4 Likelihood Inference Joint density
More informationx. Figure 1: Examples of univariate Gaussian pdfs N (x; µ, σ 2 ).
.8.6 µ =, σ = 1 µ = 1, σ = 1 / µ =, σ =.. 3 1 1 3 x Figure 1: Examples of univariate Gaussian pdfs N (x; µ, σ ). The Gaussian distribution Probably the most-important distribution in all of statistics
More informationThe Multivariate Normal Distribution 1
The Multivariate Normal Distribution 1 STA 302 Fall 2017 1 See last slide for copyright information. 1 / 40 Overview 1 Moment-generating Functions 2 Definition 3 Properties 4 χ 2 and t distributions 2
More informationContinuous Random Variables
1 / 24 Continuous Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 27, 2013 2 / 24 Continuous Random Variables
More informationSTAT 418: Probability and Stochastic Processes
STAT 418: Probability and Stochastic Processes Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical
More informationMATH 151, FINAL EXAM Winter Quarter, 21 March, 2014
Time: 3 hours, 8:3-11:3 Instructions: MATH 151, FINAL EXAM Winter Quarter, 21 March, 214 (1) Write your name in blue-book provided and sign that you agree to abide by the honor code. (2) The exam consists
More informationStat 5101 Notes: Brand Name Distributions
Stat 5101 Notes: Brand Name Distributions Charles J. Geyer September 5, 2012 Contents 1 Discrete Uniform Distribution 2 2 General Discrete Uniform Distribution 2 3 Uniform Distribution 3 4 General Uniform
More information1. Consider a random independent sample of size 712 from a distribution with the following pdf. c 1+x. f(x) =
1. Consider a random independent sample of size 712 from a distribution with the following pdf f(x) = c 1+x 0
More informationSOLUTION FOR HOMEWORK 11, ACTS 4306
SOLUTION FOR HOMEWORK, ACTS 36 Welcome to your th homework. This is a collection of transformation, Central Limit Theorem (CLT), and other topics.. Solution: By definition of Z, Var(Z) = Var(3X Y.5). We
More informationFormulas for probability theory and linear models SF2941
Formulas for probability theory and linear models SF2941 These pages + Appendix 2 of Gut) are permitted as assistance at the exam. 11 maj 2008 Selected formulae of probability Bivariate probability Transforms
More informationRandom Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R
In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample
More informationNotes for Math 324, Part 19
48 Notes for Math 324, Part 9 Chapter 9 Multivariate distributions, covariance Often, we need to consider several random variables at the same time. We have a sample space S and r.v. s X, Y,..., which
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems
More information[Chapter 6. Functions of Random Variables]
[Chapter 6. Functions of Random Variables] 6.1 Introduction 6.2 Finding the probability distribution of a function of random variables 6.3 The method of distribution functions 6.5 The method of Moment-generating
More information8 - Continuous random vectors
8-1 Continuous random vectors S. Lall, Stanford 2011.01.25.01 8 - Continuous random vectors Mean-square deviation Mean-variance decomposition Gaussian random vectors The Gamma function The χ 2 distribution
More informationStat410 Probability and Statistics II (F16)
Stat4 Probability and Statistics II (F6 Exponential, Poisson and Gamma Suppose on average every /λ hours, a Stochastic train arrives at the Random station. Further we assume the waiting time between two
More informationFundamentals. CS 281A: Statistical Learning Theory. Yangqing Jia. August, Based on tutorial slides by Lester Mackey and Ariel Kleiner
Fundamentals CS 281A: Statistical Learning Theory Yangqing Jia Based on tutorial slides by Lester Mackey and Ariel Kleiner August, 2011 Outline 1 Probability 2 Statistics 3 Linear Algebra 4 Optimization
More informationJoint Probability Distributions and Random Samples (Devore Chapter Five)
Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 1 Joint Probability Distributions 2 1.1 Two Discrete
More informationA Probability Review
A Probability Review Outline: A probability review Shorthand notation: RV stands for random variable EE 527, Detection and Estimation Theory, # 0b 1 A Probability Review Reading: Go over handouts 2 5 in
More informationReview of Probability Theory II
Review of Probability Theory II January 9-3, 008 Exectation If the samle sace Ω = {ω, ω,...} is countable and g is a real-valued function, then we define the exected value or the exectation of a function
More informationLesson 4: Stationary stochastic processes
Dipartimento di Ingegneria e Scienze dell Informazione e Matematica Università dell Aquila, umberto.triacca@univaq.it Stationary stochastic processes Stationarity is a rather intuitive concept, it means
More informationStatistical Methods in Particle Physics
Statistical Methods in Particle Physics Lecture 3 October 29, 2012 Silvia Masciocchi, GSI Darmstadt s.masciocchi@gsi.de Winter Semester 2012 / 13 Outline Reminder: Probability density function Cumulative
More informationCME 106: Review Probability theory
: Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:
More informationNon-parametric Inference and Resampling
Non-parametric Inference and Resampling Exercises by David Wozabal (Last update. Juni 010) 1 Basic Facts about Rank and Order Statistics 1.1 10 students were asked about the amount of time they spend surfing
More informationMultivariate probability distributions and linear regression
Multivariate probability distributions and linear regression Patrik Hoyer 1 Contents: Random variable, probability distribution Joint distribution Marginal distribution Conditional distribution Independence,
More informationThis exam is closed book and closed notes. (You will have access to a copy of the Table of Common Distributions given in the back of the text.
TEST #3 STA 5326 December 4, 214 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. (You will have access to
More informationStatement: With my signature I confirm that the solutions are the product of my own work. Name: Signature:.
MATHEMATICAL STATISTICS Homework assignment Instructions Please turn in the homework with this cover page. You do not need to edit the solutions. Just make sure the handwriting is legible. You may discuss
More informationPart IB Statistics. Theorems with proof. Based on lectures by D. Spiegelhalter Notes taken by Dexter Chua. Lent 2015
Part IB Statistics Theorems with proof Based on lectures by D. Spiegelhalter Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly)
More informationSpring 2012 Math 541B Exam 1
Spring 2012 Math 541B Exam 1 1. A sample of size n is drawn without replacement from an urn containing N balls, m of which are red and N m are black; the balls are otherwise indistinguishable. Let X denote
More informationSTA 2101/442 Assignment 3 1
STA 2101/442 Assignment 3 1 These questions are practice for the midterm and final exam, and are not to be handed in. 1. Suppose X 1,..., X n are a random sample from a distribution with mean µ and variance
More informationProbability & Statistics - FALL 2008 FINAL EXAM
550.3 Probability & Statistics - FALL 008 FINAL EXAM NAME. An urn contains white marbles and 8 red marbles. A marble is drawn at random from the urn 00 times with replacement. Which of the following is
More informationTest Problems for Probability Theory ,
1 Test Problems for Probability Theory 01-06-16, 010-1-14 1. Write down the following probability density functions and compute their moment generating functions. (a) Binomial distribution with mean 30
More informationSUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)
SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems
More information