Spring 2012 Math 541B Exam 1
Spring 2012 Math 541B Exam 1

1. A sample of size $n$ is drawn without replacement from an urn containing $N$ balls, $m$ of which are red and $N - m$ black; the balls are otherwise indistinguishable. Let $X$ denote the number of red balls in the sample of size $n$. In what follows we treat $N, n$ as known and $m$ as unknown.

(a) Find $P_m(X = x)$.

(b) Show that
$$\hat{m} = \min\{\lfloor X(N+1)/n \rfloor,\ N\} \qquad (1)$$
is an MLE of $m$.

(c) Define
$$\underline{x}_{m,\alpha} = \max\{x \in \mathbb{Z} : P_m(X \le x) \le \alpha\}, \qquad \overline{x}_{m,\alpha} = \min\{x \in \mathbb{Z} : P_m(X \le x) \ge \alpha\}.$$
Show that
$$\{m : \underline{x}_{m,\alpha/2} < X \le \overline{x}_{m,1-\alpha/2}\} \qquad (2)$$
is a $100(1-\alpha)\%$ confidence interval for $m$, possibly conservative. Hint: Invert a hypothesis test of $H_0: m = m_0$ vs. $H_1: m \ne m_0$, and note that one of the inequalities in (2) is strict.

2. (a) Let $S_1 \sim \mathrm{Bin}(n_1, p)$ and $S_2 \sim \mathrm{Bin}(n_2, p)$ be two independent binomial random variables, and let $S = S_1 + S_2$. Identify the distribution of $S_1$ conditional on $S = s$, and give its parameter values in terms of an urn model.

(b) Now let $S_1 \sim \mathrm{Bin}(n_1, p_1)$ and $S_2 \sim \mathrm{Bin}(n_2, p_2)$ be independent, and $S = S_1 + S_2$ as above. Fisher's Exact Test of $H_0: p_1 = p_2$ versus $H_1: p_1 > p_2$ rejects $H_0$ when $S_1$ is large.

i. Show that, under $H_0$, $S$ is a sufficient statistic.

ii. Write down an expression for the p-value of Fisher's Exact Test, conditional on $S = s$, in terms of the density $f(s_1 \mid s)$ of the distribution in part 2a.
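As a concrete illustration of problem 2: under $H_0$ the distribution of $S_1$ given $S = s$ is hypergeometric, and the one-sided p-value is its upper tail at the observed $s_1$. A minimal sketch (the counts $n_1, n_2, s, s_1$ below are made-up illustrative values):

```python
from math import comb

def hypergeom_pmf(k, n1, n2, s):
    """P(S1 = k | S = s) under H0: like drawing s balls without replacement
    from an urn with n1 'group 1' and n2 'group 2' balls."""
    return comb(n1, k) * comb(n2, s - k) / comb(n1 + n2, s)

def fisher_exact_pvalue(s1, n1, n2, s):
    """One-sided p-value of Fisher's Exact Test: P(S1 >= s1 | S = s) under H0."""
    hi = min(n1, s)                      # largest feasible value of S1
    start = max(s1, max(0, s - n2))      # smallest feasible value >= observed s1
    return sum(hypergeom_pmf(k, n1, n2, s) for k in range(start, hi + 1))

# Illustrative counts: 7 successes out of n1 = 10 trials in group 1,
# 3 out of n2 = 10 in group 2, so s = 10 total successes.
p = fisher_exact_pvalue(7, 10, 10, 10)
```

The hypergeometric pmf sums to one over its support, and the p-value at the smallest possible $s_1$ is 1, which gives a quick sanity check of the implementation.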
Fall 2012 Math 541B Exam 1

1. Let $X_1, \ldots, X_n$ be a random sample from a distribution with variance $\mathrm{Var}(X_1) = \sigma^2 < \infty$, and let $T_n = T_n(X_1, \ldots, X_n)$ be some statistic.

(a) Write down an expression for the jackknife estimator $V_n$ of $\mathrm{Var}(T_n)$ in terms of $T_{n-1,i} = T_{n-1}(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n)$, $i = 1, \ldots, n$.

(b) Now let $T_n = \bar{X}_n = n^{-1} \sum_{i=1}^n X_i$ be the sample mean. Show that:

i. $\mathrm{Var}(T_n) = \sigma^2/n$;

ii. $W_n = \dfrac{1}{n(n-1)} \displaystyle\sum_{i=1}^n (X_i - \bar{X}_n)^2$ is an unbiased estimator of $\mathrm{Var}(T_n)$;

iii. $V_n = W_n$.

2. Given are
- an index set $S$;
- a distribution $\pi = (\pi_i)$ on $S$, with $\pi_i > 0$ for all $i \in S$;
- a Markov chain on $S$ with transition matrix $Q = (q_{ij})$, where $q_{ij} > 0$ for all $i \ne j$ (the reference chain).

We construct a new Markov chain whose transition matrix $P = (p_{ij})$ is given, for $i \ne j$, by
$$p_{ij} = q_{ij}\, \frac{\pi_j q_{ji}}{\pi_i q_{ij} + \pi_j q_{ji}}.$$

(a) Show that

i. this new chain is reversible;

ii. the stationary distribution of this chain is $\pi$.

(b) Sketch an algorithm that generates random samples whose marginal distribution is $\pi$.
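One way to sketch part 2(b): propose moves from the reference chain $Q$ and accept a proposed move from $i$ to $j$ with probability $\pi_j q_{ji}/(\pi_i q_{ij} + \pi_j q_{ji})$ (a Barker-type acceptance rule). A toy sketch on a three-state space; the target $\pi$ and the uniform reference chain are made-up choices for illustration:

```python
import random

# Made-up target distribution pi and uniform reference chain Q on S = {0, 1, 2}.
pi = [0.2, 0.3, 0.5]
q = [[1/3, 1/3, 1/3] for _ in range(3)]

def p(i, j):
    """Off-diagonal entry p_ij = q_ij * pi_j q_ji / (pi_i q_ij + pi_j q_ji)."""
    return q[i][j] * pi[j] * q[j][i] / (pi[i] * q[i][j] + pi[j] * q[j][i])

def step(i):
    """One move of the new chain: propose j from Q, accept with the Barker-type probability."""
    j = random.randrange(3)
    if j == i:
        return i
    accept = pi[j] * q[j][i] / (pi[i] * q[i][j] + pi[j] * q[j][i])
    return j if random.random() < accept else i

random.seed(0)
counts = [0, 0, 0]
state = 0
for _ in range(50000):
    state = step(state)
    counts[state] += 1
freqs = [c / 50000 for c in counts]   # should approximate pi
```

Reversibility can be verified exactly: $\pi_i p_{ij} = \pi_j p_{ji}$ holds term-by-term because the numerator $\pi_i q_{ij} \pi_j q_{ji}$ is symmetric in $(i, j)$.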
Spring 2013 Math 541B Exam 1

1. (a) Let $f_\theta(x)$, $\theta \in \Theta \subseteq \mathbb{R}$, be a family of density functions with respect to some common measure. If we say that this family has the monotone likelihood ratio (MLR) property in the real-valued statistic $T = T(x)$, what two properties must hold?

(b) Taking $\Theta = (0, \infty)$, let $f_\theta(x)$, $x = (x_1, \ldots, x_n)$, be the joint density of a random sample of $n$ i.i.d. uniform $(0, \theta)$ observations:
$$f_\theta(x) = \begin{cases} \theta^{-n}, & \text{if } x_i < \theta \text{ for all } i = 1, \ldots, n \\ 0, & \text{otherwise.} \end{cases}$$
Show that this family has the MLR property, and give the statistic $T$.

(c) Given $\alpha \in (0, 1)$ and $\theta_0 > 0$, find a uniformly most powerful level-$\alpha$ test of $H_0: \theta \le \theta_0$ vs. $H_1: \theta > \theta_0$ in terms of $T(X)$. Find any critical values and randomization constants explicitly.

2. Recall that a log-normal distribution $\ln \mathcal{N}(x \mid \mu, \sigma^2)$ is the continuous probability distribution of a random variable whose logarithm is normally distributed $\mathcal{N}(x \mid \mu, \sigma^2)$. That is, if $X \sim \ln \mathcal{N}(x \mid \mu, \sigma^2)$, then $\log X \sim \mathcal{N}(x \mid \mu, \sigma^2)$. Suppose the only random number generator that you have is the one for log-normal distributions $\ln \mathcal{N}(x \mid \mu, \sigma^2)$. Propose an MCMC algorithm for estimating the integral
$$I = \int_0^\infty e^x\, \frac{e^{-x^4 - x^6 - x^8}}{\alpha}\, dx, \quad \text{where } \alpha = \int_0^\infty e^{-x^4 - x^6 - x^8}\, dx \text{ is unknown.}$$
Describe the algorithm in detail.
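One possible scheme for problem 2 (a sketch, not the unique answer): run an independence Metropolis sampler whose proposals come from the log-normal generator, targeting the density $\pi(x) \propto e^{-x^4 - x^6 - x^8}$ on $(0, \infty)$; the unknown normalizer $\alpha$ cancels in the acceptance ratio, and $I = E_\pi[e^X]$ is estimated by the sample average of $e^{X_t}$. The proposal parameters $\mu = 0$, $\sigma = 1$ are arbitrary assumptions:

```python
import math, random

def h(x):
    """Unnormalized target density exp(-x^4 - x^6 - x^8), x > 0; alpha is not needed."""
    return math.exp(-(x**4) - x**6 - x**8)

def lognormal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the ln N(mu, sigma^2) proposal."""
    return math.exp(-(math.log(x) - mu)**2 / (2 * sigma**2)) / (x * sigma * math.sqrt(2 * math.pi))

random.seed(1)
x = 0.5                       # arbitrary starting point in (0, infinity)
total, n_iter = 0.0, 50000
for _ in range(n_iter):
    y = random.lognormvariate(0.0, 1.0)   # draw from the log-normal generator
    # Independence-sampler acceptance ratio; the unknown normalizer alpha cancels.
    ratio = (h(y) * lognormal_pdf(x)) / (h(x) * lognormal_pdf(y))
    if random.random() < min(1.0, ratio):
        x = y
    total += math.exp(x)      # accumulate e^{X_t} to estimate I = E[e^X]
I_hat = total / n_iter
```

Since the target places essentially all its mass on $(0, 1)$ and $e^x \ge 1$ there, the estimate should land modestly above 1.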
Spring 2014 Math 541b Exam

1. Let $X_1, X_2, \ldots, X_n$ be independent identically distributed samples from the normal distribution $\mathcal{N}(\theta, \sigma^2)$ having mean $\theta$ and variance $\sigma^2$.

(a) Does a Uniformly Most Powerful, or UMP, level-$\alpha$ test of $H_0: \sigma^2 \le 1$ versus $H_1: \sigma^2 > 1$ exist if the mean $\theta$ is known? If so, find the form of the rejection region of the UMP test; if not, explain why not.

(b) Does a UMP level-$\alpha$ test of $H_0: \sigma^2 \le 1$ versus $H_1: \sigma^2 > 1$ exist if both $\theta$ and $\sigma^2$ are unknown, with the restriction $\theta/\sigma^2 = 2$?

2. Consider a vector $X = (X_1, X_2, X_3)$ of counts with distribution given by the multinomial distribution with probabilities
$$P(X = x) = \binom{n}{x_1, x_2, x_3} \prod_{i=1}^3 p_i^{x_i}$$
for $x = (x_1, x_2, x_3)$, a vector of non-negative integers summing to $n$, and
$$(p_1, p_2, p_3) = \left(\frac{1}{3} + \frac{\theta}{3},\ \frac{2\theta}{3},\ \frac{2}{3} - \theta\right)$$
for some $\theta \in (0, 1)$.

(a) Write out the equation that would need to be solved in order to obtain the maximum likelihood estimate of $\theta$.

(b) Show that if additional missing data is now introduced to form a full model, a simpler equation than that in part (a) results, and solve it explicitly. Hint: Consider the first cell.

(c) Specify the steps of an EM algorithm that takes advantage of the simplification obtained by treating the situation as a missing data problem as in part (b).
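For problem 2, the standard device is to split the first cell's count $x_1$ into latent counts from two sub-cells with probabilities $1/3$ and $\theta/3$; the complete-data likelihood then has a closed-form maximizer. A sketch with made-up observed counts $(x_1, x_2, x_3)$:

```python
# Made-up observed counts for cells with probabilities
# (1/3 + theta/3, 2*theta/3, 2/3 - theta).
x1, x2, x3 = 50, 20, 30

theta = 0.5                      # arbitrary starting value in (0, 1)
for _ in range(500):
    # E-step: expected count landing in the latent theta/3 sub-cell of cell 1.
    y = x1 * (theta / 3) / (1 / 3 + theta / 3)      # = x1 * theta / (1 + theta)
    # M-step: closed-form maximizer of the complete-data log-likelihood,
    # (y + x2) log(theta) + x3 log(2/3 - theta) + const.
    theta = (2 / 3) * (y + x2) / (y + x2 + x3)

# Score of the observed-data log-likelihood; it vanishes at the MLE.
score = x1 / (1 + theta) + x2 / theta - 3 * x3 / (2 - 3 * theta)
```

A converged EM iterate is a stationary point of the observed-data likelihood, so the score at the final `theta` should be numerically zero.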
Fall 2014 Math 541b Exam

1. (a) Let $q_{x,y}$ be a Markov transition function, and $\pi_x$ a probability distribution on a finite state space $S$. Show that the Markov chain that accepts moves made according to $q_{x,y}$ with probability
$$p_{x,y} = \min\left\{\frac{\pi_y q_{y,x}}{\pi_x q_{x,y}},\ 1\right\},$$
and otherwise remains at $x$, has stationary distribution $\pi_x$. Show that if $q_{x,y}$ and $\pi_x$ are positive for all $x, y \in S$ then the chain so described has unique stationary distribution $\pi_x$.

(b) Let $f(y)$ and $g(y)$ be two probability mass functions, both positive on $\mathbb{R}$. With $X_1$ generated according to $g$, consider the Markov chain $X_1, X_2, \ldots$ that at stage $n \ge 1$ generates an independent observation $Y_n$ from density $g$, and accepts this value as the new state $X_{n+1}$ with probability
$$\min\left\{\frac{f(Y_n)\, g(X_n)}{f(X_n)\, g(Y_n)},\ 1\right\}$$
and otherwise sets $X_{n+1}$ to be $X_n$. Prove that the chain converges in distribution to a random variable with distribution $f$.

(c) The accept/reject method. Let $f$ and $g$ be density functions on $\mathbb{R}$ such that the support of $f$ is a subset of the support of $g$, and suppose that there exists a constant $M$ such that $f(x) \le M g(x)$. Consider the procedure that generates a random variable $X$ with distribution $g$ and an independent random variable $U$ with the uniform distribution on $[0, 1]$, and sets $Y = X$ when $U \le f(X)/(M g(X))$. Show that $Y$ has density $f$.

2. Let $f$ be a real-valued function on $\mathbb{R}^n$, and $Z = f(X_1, \ldots, X_n)$ for $X_1, \ldots, X_n$ independent random variables.

(a) With $E^{(i)}(\cdot) = E(\cdot \mid X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n)$, show the following version of the Efron-Stein inequality:
$$\mathrm{Var}(Z) \le E\left(\sum_{i=1}^n \left(Z - E^{(i)} Z\right)^2\right). \qquad (1)$$
Hint: With $E_i(\cdot) = E(\cdot \mid X_1, \ldots, X_i)$, show that $Z - EZ = \sum_{i=1}^n \Delta_i$ where $\Delta_i = E_i Z - E_{i-1} Z$; compute the variance of $Z$ in this form, use properties of conditional expectation such as $E_i(E^{(i)}(\cdot)) = E_{i-1}(\cdot)$, and (conditional) Jensen's inequality.

(b) Letting $(X_1', \ldots, X_n')$ be an independent copy of $(X_1, \ldots, X_n)$, with $Z_i' = f(X_1, \ldots, X_{i-1}, X_i', X_{i+1}, \ldots, X_n)$, show that
$$\mathrm{Var}(Z) \le \frac{1}{2}\, E\left(\sum_{i=1}^n (Z - Z_i')^2\right).$$
Hint: Express the right-hand side of (1) in terms of conditional variances, and justify and use the conditional version of the fact that if $X$ and $Y$ are independent and have the same distribution then the variance of $X$ can be expressed in terms of $E(X - Y)^2$.
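The accept/reject procedure of problem 1(c) in this exam can be sketched in code. The target $f(x) = 6x(1-x)$ on $[0,1]$ (a Beta(2, 2) density), envelope $g = \mathrm{Uniform}[0,1]$, and $M = 1.5$ are illustrative assumptions; $M$ works because $\max_x f(x) = f(1/2) = 1.5$:

```python
import random

def f(x):
    """Illustrative target density: Beta(2, 2), i.e. 6x(1 - x) on [0, 1]."""
    return 6 * x * (1 - x)

M = 1.5  # envelope constant: f(x) <= M * g(x), with g the Uniform[0, 1] density

def accept_reject():
    """Repeat until acceptance: draw X ~ g and U ~ Uniform[0, 1],
    keep X when U <= f(X) / (M g(X))."""
    while True:
        x = random.random()      # X ~ g = Uniform[0, 1]
        u = random.random()      # independent U ~ Uniform[0, 1]
        if u <= f(x) / M:        # g(x) = 1 on [0, 1]
            return x

random.seed(2)
sample = [accept_reject() for _ in range(20000)]
mean = sum(sample) / len(sample)   # Beta(2, 2) has mean 1/2
```

The expected acceptance probability is $1/M = 2/3$, which is the usual efficiency measure of the method.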
Fall 2015 Math 541B Exam 1

1. Let $X_1, \ldots, X_n$ be a sample from distribution $F$, let $X_{(1)} \le \ldots \le X_{(n)}$ be the corresponding order statistics, and let $\theta$ and $\hat\theta$ be the population and sample median, respectively. Assume that the sample size is 3 ($n = 3$).

(a) Find the distribution of the ordered bootstrap sample $(X^*_{(1)}, X^*_{(2)}, X^*_{(3)})$, where the $X^*_i$ are randomly selected from the sample with replacement.

(b) Determine the bootstrap estimator $\hat\lambda_1$ of the bias of the sample median, $\lambda_1 = E(\hat\theta) - \theta$.

(c) Determine the bootstrap estimator $\hat\lambda_2$ of the variance of the sample median, $\lambda_2 = \mathrm{Var}(\hat\theta)$.

2. Denote $z \in \mathbb{R}^2$ by $z = (x, y)$, and let $Z_1, \ldots, Z_n$ be independent with distribution $\mathcal{N}(0, \Sigma)$ where
$$\Sigma = \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix} \quad \text{for } \rho \in (-1, 1), \text{ unknown.}$$

(a) Write down the $\mathcal{N}(0, \Sigma)$ density function, and the likelihood of the sample, $L(\rho) = f(z_1, \ldots, z_n; \rho)$.

(b) Determine the Neyman-Pearson procedure for testing $H_0: \rho = 0$ versus $H_1: \rho = \rho_0$ at level $\alpha \in (0, 1)$ for some $\rho_0 \ne 0$ in $(0, 1)$. (You do not need to explicitly write down any null distributions arising.)

(c) Determine whether the test in (b) is uniformly most powerful for testing $H_0: \rho = 0$ versus $H_1: \rho > 0$, and justify your conclusion.
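For $n = 3$ the bootstrap quantities in problem 1 can be computed by exhaustive enumeration: there are $3^3 = 27$ equally likely bootstrap samples. A sketch using the illustrative sample $\{1, 2, 3\}$:

```python
from itertools import product
from statistics import median

data = [1, 2, 3]             # illustrative sample; its median is 2
theta_hat = median(data)

# All 27 equally likely bootstrap samples and their medians.
boot_medians = [median(s) for s in product(data, repeat=3)]

boot_mean = sum(boot_medians) / 27
bias_hat = boot_mean - theta_hat                                   # bootstrap bias estimate
var_hat = sum((m - boot_mean) ** 2 for m in boot_medians) / 27     # bootstrap variance estimate
```

For three distinct values $a < b < c$ the bootstrap median equals $a$ or $c$ in 7 of the 27 samples each and $b$ in the remaining 13, so for $\{1, 2, 3\}$ the bias estimate is 0 and the variance estimate is $14/27$.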
Fall 2016 Math 541B Exam 1

1. Suppose that out of $n$ i.i.d. Bernoulli trials, each with probability $p$ of success, there are zero successes.

(a) Given $\alpha \in (0, 1)$, derive an exact upper $(1-\alpha)$-confidence bound for $p$ by either pivoting the c.d.f. of the binomial distribution or inverting the appropriate hypothesis test.

(b) There is a famous rule of thumb called the Rule of Threes which says that, when $n$ is large, $3/n$ is an approximate upper 95%-confidence bound for $p$ in the above situation. Justify the Rule of Threes by applying a large-$n$ first-order Taylor approximation to your answer from part (a), and use the fact that $\log(0.05) \approx -3$.

2. Let $w_1, \ldots, w_n$ be i.i.d. from the mixture distribution
$$f(w; \psi) = \sum_{i=1}^g \pi_i f_i(w),$$
where $\psi = (\pi_1, \ldots, \pi_g)$ is a vector of unknown probabilities summing to one, and $f_1, \ldots, f_g$ are known density functions.

(a) Write an equation one would solve to find the maximum likelihood estimate of $\psi$.

(b) To implement the EM algorithm, write down the full likelihood when, in addition to the sample $w_1, \ldots, w_n$, the missing data
$$Z_{ij} = \mathbf{1}(\text{the } j\text{th observation } w_j \text{ comes from the } i\text{th group } f_i)$$
is also observed.

(c) Write down the estimate of $\psi$ using the full-data likelihood in part (2b).

(d) Write down the E and M steps of the EM algorithm.
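For problem 1(a), the exact bound is $\bar{p} = 1 - \alpha^{1/n}$ (the largest $p$ for which $(1-p)^n \ge \alpha$), and the Rule of Threes replaces it by $-\log(0.05)/n \approx 3/n$. A quick numerical check of how close the two are (the choice $n = 100$ is illustrative):

```python
import math

def exact_upper_bound(n, alpha=0.05):
    """Exact upper (1 - alpha)-confidence bound for p given 0 successes
    in n trials: the largest p with (1 - p)^n >= alpha."""
    return 1 - alpha ** (1 / n)

def rule_of_three(n):
    """Large-n approximation -log(0.05)/n, rounded to the famous 3/n."""
    return 3 / n

n = 100
exact = exact_upper_bound(n)     # 1 - 0.05**(1/100), about 0.0295
approx = rule_of_three(n)        # 0.03
```

By construction $(1 - \bar{p})^n = \alpha$ exactly, while the approximation error is of smaller order than $1/n$.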
Spring 2015 Math 541b Exam

1. Let $X_1, X_2, \ldots, X_n$ be independent Cauchy random variables with density
$$f(x \mid \theta) = \frac{1}{\pi(1 + (x - \theta)^2)},$$
and let $\tilde{X}_n = $ median of $\{X_1, X_2, \ldots, X_n\}$.

(a) Prove that $\sqrt{n}(\tilde{X}_n - \theta)$ is asymptotically normal with mean 0 and variance $\pi^2/4$ by showing that, as $n$ tends to infinity,
$$P\left(\sqrt{n}(\tilde{X}_n - \theta) \le a\right) \to P(Z \le 2a/\pi),$$
where $Z$ is a standard normal random variable. Hint: If we define Bernoulli random variables $Y_i = \mathbf{1}\{X_i \le \theta + a/\sqrt{n}\}$, the event $\{\tilde{X}_n \le \theta + a/\sqrt{n}\}$ is equivalent to $\{\sum_i Y_i \ge (n+1)/2\}$ when $n$ is odd. Applying the CLT might also be needed.

(b) Using the result from part (a), find an approximate $\alpha$-level large-sample test of $H_0: \theta = \theta_0$ versus $H_1: \theta \ne \theta_0$.

2. We observe independent Bernoulli variables $X_1, X_2, \ldots, X_n$, which depend on unobservable variables $Z_1, \ldots, Z_n$ which, given $\theta_1, \ldots, \theta_n$, are distributed independently as $\mathcal{N}(\theta_i, 1)$, where
$$X_i = \begin{cases} 0 & \text{if } Z_i \le u \\ 1 & \text{if } Z_i > u. \end{cases}$$
The values $\theta_1, \theta_2, \ldots, \theta_n$ are distributed independently as $\mathcal{N}(\xi, \sigma^2)$. Assuming that $u$ and $\sigma^2$ are known, we are interested in the maximum likelihood estimate of $\xi$.

(a) Show that for any given values of $\xi$ and $\sigma^2$, and all $i = 1, \ldots, n$, the random variable $Z_i$ is normally distributed with mean $\xi$ and variance $\sigma^2 + 1$.

(b) Write down the likelihood function for the complete data $Z_1, \ldots, Z_n$ when these values are observed.
(c) Now assume that only $X_1, \ldots, X_n$ are observed, and show that the EM sequence for the estimation of the unknown $\xi$ is given by
$$\xi^{(t+1)} = \frac{1}{n} \sum_{i=1}^n E\left(Z_i \mid X_i, \xi^{(t)}, \sigma^2\right).$$
Start by computing the expected log-likelihood of the complete data.

(d) Show that
$$E\left(Z_i \mid X_i, \xi^{(t)}, \sigma^2\right) = \xi^{(t)} + \sqrt{\sigma^2 + 1}\; H_i\!\left(\frac{u - \xi^{(t)}}{\sqrt{\sigma^2 + 1}}\right),$$
where
$$H_i(t) = \begin{cases} \dfrac{\phi(t)}{1 - \Phi(t)} & \text{if } X_i = 1 \\[2ex] -\dfrac{\phi(t)}{\Phi(t)} & \text{if } X_i = 0, \end{cases}$$
and $\Phi(t)$ and $\phi(t)$ are the cumulative distribution and density functions of a standard normal variable, respectively.
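A sketch of the EM iteration in parts (c)-(d) on simulated data; the true $\xi$, the threshold $u$, the value of $\sigma^2$, and the sample size below are made-up choices for illustration. Each update averages the conditional expectations $E(Z_i \mid X_i, \xi^{(t)}, \sigma^2)$:

```python
import random
from statistics import NormalDist

random.seed(3)
u, sigma2, xi_true = 0.0, 1.0, 1.0
n = 2000
s = (sigma2 + 1) ** 0.5          # marginal standard deviation of Z_i
# Simulate: Z_i ~ N(xi, sigma^2 + 1) marginally; only X_i = 1{Z_i > u} is observed.
X = [1 if random.gauss(xi_true, s) > u else 0 for _ in range(n)]

std = NormalDist()
def H(t, x):
    """H_i(t): phi/(1 - Phi) when X_i = 1, -phi/Phi when X_i = 0."""
    return std.pdf(t) / (1 - std.cdf(t)) if x == 1 else -std.pdf(t) / std.cdf(t)

xi = 0.0                         # arbitrary starting value
for _ in range(200):
    t = (u - xi) / s
    xi = xi + s * sum(H(t, x) for x in X) / n   # EM update: mean of E(Z_i | X_i)
```

At the fixed point the average of the $H_i$ vanishes, which forces $\Phi((u - \hat\xi)/s) = 1 - \bar{X}$, i.e. the EM limit agrees with the closed-form MLE $\hat\xi = u + s\,\Phi^{-1}(\bar{X})$.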
Spring 2016 Math 541B Exam 1

1. Let $X_1, \ldots, X_n$ be i.i.d. from a normal distribution with unknown mean $\mu$ and variance 1. Suppose that negative values of $X_i$ are truncated at 0, so that instead of $X_i$ we actually observe $Y_i = \max(0, X_i)$, $i = 1, 2, \ldots, n$, from which we would like to estimate $\mu$. By reordering, assume that $Y_1, \ldots, Y_m > 0$ and $Y_{m+1} = \ldots = Y_n = 0$.

(a) Explain how to use the EM algorithm to estimate $\mu$ from $Y_1, \ldots, Y_n$. Specifically, give the details of the E-step and M-step. Show that a recursive formula for the successive EM estimates $\mu^{(k+1)}$ is
$$\mu^{(k+1)} = \frac{1}{n}\left[\sum_{i=1}^m Y_i + (n - m)\,\mu^{(k)} - (n - m)\,\frac{\phi(\mu^{(k)})}{\Phi(-\mu^{(k)})}\right],$$
where $\phi(x)$ is the probability density function and $\Phi(x)$ the cumulative distribution function of the standard normal distribution.

(b) Find the log-likelihood function $\log L(\mu)$ based only on the observed data, and use it to write down a (nonlinear) equation which the MLE $\hat\mu$ satisfies.

(c) Use the equation in part (b) to verify that $\hat\mu$ is indeed a fixed point of the recursion found in (a).

(d) Prove that $\mu^{(k)} \to \hat\mu$ for any starting point $\mu^{(0)}$, provided at least one of the observations is not truncated. To do this, prove that the difference between $\mu^{(k)}$ and $\hat\mu$ gets smaller as $k$ gets larger. Hint: The Mean Value Theorem and the following inequality, which you may use without proof, might be useful:
$$0 < \frac{\phi(x)\left[\phi(x) - x\Phi(-x)\right]}{\Phi^2(-x)} < 1, \quad \text{for all } x.$$
Note: The Mean Value Theorem says that if $f$ is continuous and differentiable on the interval $(a, b)$, then there is a number $c$ in $(a, b)$ such that $f(b) - f(a) = f'(c)(b - a)$.

2. Let $X_1, \ldots, X_n$ be i.i.d. Unif$(0, \theta)$, where $\theta > 0$ is unknown.

(a) Find the MLE $\hat\theta$, its c.d.f. $F_{\hat\theta}(u) = P_\theta(\hat\theta \le u)$, and its expected value $E_\theta(\hat\theta)$.

(b) Consider a confidence interval for $\theta$ of the form
$$[a\hat\theta,\ b\hat\theta], \qquad (1)$$
where $1 \le a \le b$ are constants. For given $0 < \alpha < 1$, characterize all $1 \le a \le b$ making $[a\hat\theta, b\hat\theta]$ a $(1-\alpha)$ confidence interval.
(c) Find values $1 \le a \le b$ minimizing the expected length $E_\theta(b\hat\theta - a\hat\theta)$ among all $(1-\alpha)$ confidence intervals of the form (1), uniformly in $\theta$.
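The EM recursion of problem 1(a) in this exam is easy to check numerically: its fixed point satisfies the observed-data score equation from part (b), $\sum_{i \le m}(Y_i - \mu) = (n - m)\,\phi(\mu)/\Phi(-\mu)$. A sketch with illustrative data (the $Y$ values below are made up):

```python
from statistics import NormalDist

std = NormalDist()
# Illustrative observed data: m = 4 positive values, n - m = 2 truncated zeros.
Y = [0.3, 1.1, 0.7, 2.0, 0.0, 0.0]
n = len(Y)
pos = [y for y in Y if y > 0]
m = len(pos)

mu = 0.0                              # arbitrary starting point
for _ in range(500):
    # EM update: each truncated observation is imputed by
    # E(X | X <= 0) = mu - phi(mu) / Phi(-mu) under the current mu.
    mu = (sum(pos) + (n - m) * (mu - std.pdf(mu) / std.cdf(-mu))) / n

# Observed-data score; it vanishes at the MLE.
score = sum(y - mu for y in pos) - (n - m) * std.pdf(mu) / std.cdf(-mu)
```

The hint's inequality bounds the derivative of $\mu \mapsto \mu - \phi(\mu)/\Phi(-\mu)$ between 0 and 1, so the iteration map is a contraction with factor at most $(n - m)/n$ and the score at the converged value is numerically zero.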
Spring 2017 Math 541B Exam 1

1. Let $X = (X_1, \ldots, X_n)$ be a vector of i.i.d. $\mathcal{N}(\mu, \sigma^2)$ random variables, where both $\mu$ and $\sigma$ are unknown.

(a) Given $\alpha_1 \in (0, 1)$, write down an exact $(1-\alpha_1)$ confidence interval for $\mu$.

(b) Given $\alpha_2 \in (0, 1)$, write down an exact $(1-\alpha_2)$ confidence interval for $\sigma^2$.

(c) Letting $I_{\alpha_1}(X)$ and $J_{\alpha_2}(X)$ denote the confidence intervals in parts 1a and 1b, respectively, for given $\alpha \in (0, 1)$ show how to choose $\alpha_1, \alpha_2$ so that the overall coverage probability satisfies
$$P_{\mu,\sigma^2}\left(\mu \in I_{\alpha_1}(X) \text{ and } \sigma^2 \in J_{\alpha_2}(X)\right) \ge 1 - \alpha \quad \text{for all } \mu, \sigma^2.$$
The inequality does not have to be sharp.

2. Let $P_0$ and $P_1$ be probability distributions on $\mathbb{R}$ with densities $p_0$ and $p_1$ with respect to Lebesgue measure, and let $X_1, \ldots, X_n$ be a sequence of i.i.d. random variables.

(a) Let $\beta$ denote the power of the most powerful test of size $\alpha$, $0 < \alpha < 1$, for testing the null hypothesis $H_0: X_1, \ldots, X_n \sim P_0$ against the alternative $H_a: X_1, \ldots, X_n \sim P_1$. Show that $\alpha < \beta$ unless $P_0 = P_1$.

(b) Let $P_0$ be the uniform distribution on the interval $[0, 1]$ and $P_1$ be the uniform distribution on $[1/3, 2/3]$. Find the Neyman-Pearson test of size $\alpha$ for testing $H_0$ against $H_a$ (consider all possible values of $0 < \alpha < 1$).
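For problem 2(b), the likelihood ratio is $3^n$ when every observation lies in $[1/3, 2/3]$ and 0 otherwise, so the size-$\alpha$ Neyman-Pearson test rejects on the event that all observations fall in $[1/3, 2/3]$ (which has probability $3^{-n}$ under $H_0$), randomizing as needed. Its power then has a closed form; a sketch:

```python
def np_test_power(n, alpha):
    """Power of the size-alpha Neyman-Pearson test of U[0,1] vs U[1/3,2/3].

    Under H0, P(all n observations in [1/3, 2/3]) = 3**-n; under H1 it is 1.
    - If alpha >= 3**-n: reject always on that event and randomize on its
      complement (where the likelihood ratio is 0) to reach size alpha;
      the complement has H1-probability 0, so the power is 1.
    - If alpha < 3**-n: reject with probability gamma = alpha * 3**n on that
      event, giving size alpha and power gamma.
    """
    p0 = 3.0 ** (-n)
    if alpha >= p0:
        return 1.0
    return alpha / p0

power_small = np_test_power(2, 0.05)   # alpha < 3**-2 = 1/9, so power = 0.05 * 9 = 0.45
power_large = np_test_power(3, 0.5)    # alpha > 3**-3, so power = 1
```

This makes the structure of part 2(a) concrete: the power strictly exceeds the size for every $n$ and $\alpha$ here, since $P_0 \ne P_1$.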
More information. Find E(V ) and var(v ).
Math 6382/6383: Probability Models and Mathematical Statistics Sample Preliminary Exam Questions 1. A person tosses a fair coin until she obtains 2 heads in a row. She then tosses a fair die the same number
More informationProbability reminders
CS246 Winter 204 Mining Massive Data Sets Probability reminders Sammy El Ghazzal selghazz@stanfordedu Disclaimer These notes may contain typos, mistakes or confusing points Please contact the author so
More informationEXAM. Exam #1. Math 3342 Summer II, July 21, 2000 ANSWERS
EXAM Exam # Math 3342 Summer II, 2 July 2, 2 ANSWERS i pts. Problem. Consider the following data: 7, 8, 9, 2,, 7, 2, 3. Find the first quartile, the median, and the third quartile. Make a box and whisker
More informationChapter 3 : Likelihood function and inference
Chapter 3 : Likelihood function and inference 4 Likelihood function and inference The likelihood Information and curvature Sufficiency and ancilarity Maximum likelihood estimation Non-regular models EM
More informationClass 8 Review Problems solutions, 18.05, Spring 2014
Class 8 Review Problems solutions, 8.5, Spring 4 Counting and Probability. (a) Create an arrangement in stages and count the number of possibilities at each stage: ( ) Stage : Choose three of the slots
More informationIIT JAM : MATHEMATICAL STATISTICS (MS) 2013
IIT JAM : MATHEMATICAL STATISTICS (MS 2013 Question Paper with Answer Keys Ctanujit Classes Of Mathematics, Statistics & Economics Visit our website for more: www.ctanujit.in IMPORTANT NOTE FOR CANDIDATES
More informationSome Assorted Formulae. Some confidence intervals: σ n. x ± z α/2. x ± t n 1;α/2 n. ˆp(1 ˆp) ˆp ± z α/2 n. χ 2 n 1;1 α/2. n 1;α/2
STA 248 H1S MIDTERM TEST February 26, 2008 SURNAME: SOLUTIONS GIVEN NAME: STUDENT NUMBER: INSTRUCTIONS: Time: 1 hour and 50 minutes Aids allowed: calculator Tables of the standard normal, t and chi-square
More informationUnbiased Estimation. Binomial problem shows general phenomenon. An estimator can be good for some values of θ and bad for others.
Unbiased Estimation Binomial problem shows general phenomenon. An estimator can be good for some values of θ and bad for others. To compare ˆθ and θ, two estimators of θ: Say ˆθ is better than θ if it
More informationt x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.
Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae
More informationRandom Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R
In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample
More informationStatistics 3858 : Maximum Likelihood Estimators
Statistics 3858 : Maximum Likelihood Estimators 1 Method of Maximum Likelihood In this method we construct the so called likelihood function, that is L(θ) = L(θ; X 1, X 2,..., X n ) = f n (X 1, X 2,...,
More informationSubmitted to the Brazilian Journal of Probability and Statistics
Submitted to the Brazilian Journal of Probability and Statistics Multivariate normal approximation of the maximum likelihood estimator via the delta method Andreas Anastasiou a and Robert E. Gaunt b a
More informationHypothesis Testing. A rule for making the required choice can be described in two ways: called the rejection or critical region of the test.
Hypothesis Testing Hypothesis testing is a statistical problem where you must choose, on the basis of data X, between two alternatives. We formalize this as the problem of choosing between two hypotheses:
More informationSTAT 512 sp 2018 Summary Sheet
STAT 5 sp 08 Summary Sheet Karl B. Gregory Spring 08. Transformations of a random variable Let X be a rv with support X and let g be a function mapping X to Y with inverse mapping g (A = {x X : g(x A}
More informationReview Quiz. 1. Prove that in a one-dimensional canonical exponential family, the complete and sufficient statistic achieves the
Review Quiz 1. Prove that in a one-dimensional canonical exponential family, the complete and sufficient statistic achieves the Cramér Rao lower bound (CRLB). That is, if where { } and are scalars, then
More informationChapter 2. Discrete Distributions
Chapter. Discrete Distributions Objectives ˆ Basic Concepts & Epectations ˆ Binomial, Poisson, Geometric, Negative Binomial, and Hypergeometric Distributions ˆ Introduction to the Maimum Likelihood Estimation
More informationExam C Solutions Spring 2005
Exam C Solutions Spring 005 Question # The CDF is F( x) = 4 ( + x) Observation (x) F(x) compare to: Maximum difference 0. 0.58 0, 0. 0.58 0.7 0.880 0., 0.4 0.680 0.9 0.93 0.4, 0.6 0.53. 0.949 0.6, 0.8
More informationOrder Statistics and Distributions
Order Statistics and Distributions 1 Some Preliminary Comments and Ideas In this section we consider a random sample X 1, X 2,..., X n common continuous distribution function F and probability density
More informationAMCS243/CS243/EE243 Probability and Statistics. Fall Final Exam: Sunday Dec. 8, 3:00pm- 5:50pm VERSION A
AMCS243/CS243/EE243 Probability and Statistics Fall 2013 Final Exam: Sunday Dec. 8, 3:00pm- 5:50pm VERSION A *********************************************************** ID: ***********************************************************
More informationStatistics Ph.D. Qualifying Exam: Part I October 18, 2003
Statistics Ph.D. Qualifying Exam: Part I October 18, 2003 Student Name: 1. Answer 8 out of 12 problems. Mark the problems you selected in the following table. 1 2 3 4 5 6 7 8 9 10 11 12 2. Write your answer
More informationSTA205 Probability: Week 8 R. Wolpert
INFINITE COIN-TOSS AND THE LAWS OF LARGE NUMBERS The traditional interpretation of the probability of an event E is its asymptotic frequency: the limit as n of the fraction of n repeated, similar, and
More informationUniversität Potsdam Institut für Informatik Lehrstuhl Maschinelles Lernen. Hypothesis testing. Anna Wegloop Niels Landwehr/Tobias Scheffer
Universität Potsdam Institut für Informatik Lehrstuhl Maschinelles Lernen Hypothesis testing Anna Wegloop iels Landwehr/Tobias Scheffer Why do a statistical test? input computer model output Outlook ull-hypothesis
More informationBrief Review on Estimation Theory
Brief Review on Estimation Theory K. Abed-Meraim ENST PARIS, Signal and Image Processing Dept. abed@tsi.enst.fr This presentation is essentially based on the course BASTA by E. Moulines Brief review on
More informationCh. 5 Hypothesis Testing
Ch. 5 Hypothesis Testing The current framework of hypothesis testing is largely due to the work of Neyman and Pearson in the late 1920s, early 30s, complementing Fisher s work on estimation. As in estimation,
More informationLimiting Distributions
We introduce the mode of convergence for a sequence of random variables, and discuss the convergence in probability and in distribution. The concept of convergence leads us to the two fundamental results
More informationLoglikelihood and Confidence Intervals
Stat 504, Lecture 2 1 Loglikelihood and Confidence Intervals The loglikelihood function is defined to be the natural logarithm of the likelihood function, l(θ ; x) = log L(θ ; x). For a variety of reasons,
More informationStat 5101 Notes: Brand Name Distributions
Stat 5101 Notes: Brand Name Distributions Charles J. Geyer September 5, 2012 Contents 1 Discrete Uniform Distribution 2 2 General Discrete Uniform Distribution 2 3 Uniform Distribution 3 4 General Uniform
More informationPrinciples of Statistics
Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 81 Paper 4, Section II 28K Let g : R R be an unknown function, twice continuously differentiable with g (x) M for
More informationDA Freedman Notes on the MLE Fall 2003
DA Freedman Notes on the MLE Fall 2003 The object here is to provide a sketch of the theory of the MLE. Rigorous presentations can be found in the references cited below. Calculus. Let f be a smooth, scalar
More informationMath 562 Homework 1 August 29, 2006 Dr. Ron Sahoo
Math 56 Homework August 9, 006 Dr. Ron Sahoo He who labors diligently need never despair; for all things are accomplished by diligence and labor. Menander of Athens Direction: This homework worths 60 points
More information