Solution-Midterm Examination - I STAT 421, Spring 2012
Prof. Prem K. Goel

1. [20 points] Let X_1, X_2, ..., X_n be a random sample of size n from a Gamma population with β = 2 and α = ν/2, where ν is the unknown parameter.

(a) [6 points] Find the Method of Moments estimator of ν, and show that it is unbiased for ν.

Note that the Gamma distribution with β = 2 and α = ν/2 is the same as the Chi-square distribution with ν degrees of freedom, for which E(X) = ν and Var(X) = 2ν (from Appendix C.3). Therefore the MOM estimator ν̂_MOM of ν is equal to the first sample moment X̄, and its expectation is E(X̄) = (1/n) Σ E(X_i) = E(X), i.e., X̄ is unbiased for the population mean, if it exists. Hence, ν̂_MOM is an unbiased estimator of ν.

(b) [4 points] Find the variance of the MoM estimator.

If Var(X) = σ², it is known that Var(X̄) = σ²/n. As mentioned in part (a) above, σ² = 2ν. Therefore, Var(X̄) = 2ν/n.

(c) [10 points] Find the Maximum Likelihood Estimator of ν.

Given ν, the joint density of (X_1, X_2, ..., X_n) is given by

f(x_1, x_2, ..., x_n | ν) = ∏_{i=1}^n [1 / (Γ(ν/2) 2^{ν/2})] x_i^{ν/2 − 1} e^{−x_i/2}.    (1)

Therefore, the log-likelihood function of ν is given by

ln L(ν) = −(nν/2) ln 2 − n ln Γ(ν/2) + (ν/2 − 1) Σ_{i=1}^n ln(X_i) − (1/2) Σ_{i=1}^n X_i.

On taking the derivative of this function, and setting it equal to zero, we get

(d/dν) ln L(ν) = −(n/2) ln 2 − (n/2) [dΓ(α)/dα] / Γ(α) + (1/2) Σ_{i=1}^n ln(X_i) = 0,

where α = ν/2. This expression can be simplified into the following equation:

[dΓ(α)/dα] / Γ(α) = (1/n) Σ_{i=1}^n ln(X_i/2).    (2)

The MLE ν̂ = 2α̂ is obtained by solving (2) numerically for α. For a complete solution, we would check that this solution is a maximum by evaluating the second
derivative of the log-likelihood function at ν̂, i.e., at the solution of the above equation. However, the second derivative of the log-likelihood function is equal to the negative of the second derivative of n ln Γ(ν/2) with respect to ν. If you look into advanced calculus books, you will find that the log-gamma function, ln Γ(α), is known to be convex, i.e., its second derivative is positive. Therefore, (d²/dν²) ln L(ν) < 0. For those who are anxious to go even deeper: some of these books give the second derivative of the log-gamma function as Σ_{i=0}^∞ 1/(α + i)², which is obviously positive.

(d) [BONUS QUESTION] Based on your understanding of the MLE and the MOM for this problem, is the MLE an unbiased estimator of ν? YES/NO. Any explanation!

The bonus question was supposed to be a bit of a challenge for some students to think outside the box. The answer is NO, the MLE is not unbiased. If your answer was NO, you got one bonus point. (Big DEAL!) For the explanation, a simple statement would be that the MOM estimator of α = ν/2 is equal to half the arithmetic mean of X_1, X_2, ..., X_n, which is unbiased. Whereas the MLE α̂ is a solution of (2), whose right-hand side is equal to the natural log of the geometric mean of the X_i/2. It is known that the geometric mean of any collection of non-negative numbers is less than or equal to their arithmetic mean. Of course, one still needs to find the solution of (2) for ν̂. At this point, recognizing this connection is plenty.

Those who answered YES mostly gave the explanation that the MOM is a one-to-one function of the MLE! Note that the arithmetic mean is NOT a one-to-one function of the geometric mean. For a fixed value of the arithmetic mean of n numbers, the geometric mean can take more than one value. Try it out for several sets of two numbers which have the same arithmetic mean. Their geometric means will not all be equal!

2. [10 points] Let X_1, X_2 be a random sample of two Bernoulli trials with parameter p. Show that X_1 + X_2 is a sufficient statistic for p.
Explain why the conditional distribution of X_1 − X_2 given X_1 + X_2 does not depend on p.

The first part of this problem is a simpler version of Example 10.10, for n = 2, as well as of an assigned homework problem, Exercise 10.43, with (n_1 = 1, n_2 = 1). The joint distribution of X_1, X_2 is given by

f(x_1, x_2; p) = f_1(x_1; p) f_2(x_2; p) = p^{x_1} (1 − p)^{1−x_1} p^{x_2} (1 − p)^{1−x_2} = p^{x_1+x_2} (1 − p)^{2−(x_1+x_2)}.
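As a quick numerical sanity check (a sketch, not part of the graded solution; all names are illustrative), the claim can be verified by enumerating the four outcomes and confirming that the conditional distribution of (X_1, X_2) given X_1 + X_2 is the same for every value of p:

```python
from itertools import product

def joint_pmf(x1, x2, p):
    # Joint pmf of two independent Bernoulli(p) trials:
    # p^(x1+x2) * (1-p)^(2-(x1+x2))
    return p ** (x1 + x2) * (1 - p) ** (2 - (x1 + x2))

def conditional_given_sum(p, s):
    # Conditional pmf of (X1, X2) given X1 + X2 = s
    outcomes = [(x1, x2) for x1, x2 in product([0, 1], repeat=2) if x1 + x2 == s]
    total = sum(joint_pmf(x1, x2, p) for x1, x2 in outcomes)
    return {xy: joint_pmf(*xy, p) / total for xy in outcomes}

# The conditional distribution is identical for any p in (0, 1):
print(conditional_given_sum(0.3, 1))  # {(0, 1): 0.5, (1, 0): 0.5}
print(conditional_given_sum(0.8, 1))  # {(0, 1): 0.5, (1, 0): 0.5}
```

Changing p changes the joint pmf but not the conditional pmf given the sum, which is exactly what sufficiency of X_1 + X_2 asserts.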
By the factorization theorem (Theorem 10.4), X_1 + X_2 is a sufficient statistic for p because g(x_1 + x_2; p) = p^{x_1+x_2} (1 − p)^{2−(x_1+x_2)} and h(x_1, x_2) = 1.

The second part simply follows from the definition of a sufficient statistic, which says that the conditional distribution of the data X_1, X_2, given the sufficient statistic X_1 + X_2, does not depend on the parameter p. Therefore, the conditional distribution of any function of the data, e.g., X_1 − X_2, given the sufficient statistic X_1 + X_2, does not depend on the parameter p. Instead, you could just say that the conditional distribution of the data given the sufficient statistic is proportional to f(x_1, x_2; p) / g(x_1 + x_2; p) = h(x_1, x_2) = 1, which does not depend on p. Therefore the conditional distribution of any function of the data, e.g., X_1 − X_2, given the sufficient statistic, does not depend on p.

If you wish, another way to solve this part of the problem is to obtain the conditional probability distribution of X_1 − X_2 given X_1 + X_2 directly, using simple probability calculations, as follows. Note that there are only four possible outcomes, with probability distribution f(1, 1) = p², f(1, 0) = f(0, 1) = p(1 − p), f(0, 0) = (1 − p)². Given that X_1 + X_2 = 2, only the outcome (1, 1) satisfies this condition, for which P(X_1 − X_2 = 0) = 1. Given that X_1 + X_2 = 0, only the outcome (0, 0) satisfies this condition, for which P(X_1 − X_2 = 0) = 1. Given that X_1 + X_2 = 1, two outcomes, (1, 0) and (0, 1), satisfy this condition, leading to X_1 − X_2 = 1 and X_1 − X_2 = −1, respectively. However, each of these two outcomes has equal probability p(1 − p). Therefore, conditional on X_1 + X_2 = 1, P(X_1 − X_2 = 1) = P(X_1 − X_2 = −1) = 1/2. Note that these conditional distributions of X_1 − X_2, given a value of the sufficient statistic X_1 + X_2, do not depend on p, but only on the value of the sufficient statistic X_1 + X_2 itself.

3. [15 points] Let Y_1, Y_2, ..., Y_n be a random sample of size n from a Poisson population with unknown parameter λ.

(a) [10 points] Show that the sample mean Ȳ is the minimum variance unbiased estimator of λ.
It is known from the method of moments that the sample mean Ȳ is an unbiased estimator of the population mean, with variance σ²/n. For the Poisson distribution, E(Y_i) = λ and σ² = λ [Appendix B.7]. Hence Ȳ is an unbiased estimator of λ, with variance equal to λ/n.
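These two facts are easy to confirm by simulation. The sketch below (illustrative only, using Knuth's classical multiplication method to draw Poisson variates) estimates the mean and variance of Ȳ over repeated samples:

```python
import math
import random

def simulate_ybar(lam, n, reps, seed=0):
    # Monte Carlo estimate of E(Ybar) and Var(Ybar);
    # theory says E(Ybar) = lam and Var(Ybar) = lam / n.
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplication method (adequate for small lam)
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    means = [sum(poisson(lam) for _ in range(n)) / n for _ in range(reps)]
    m = sum(means) / reps
    v = sum((x - m) ** 2 for x in means) / (reps - 1)
    return m, v

m, v = simulate_ybar(lam=4.0, n=25, reps=10000)
print(m, v)  # m near 4.0, v near 4.0/25 = 0.16
```

With λ = 4 and n = 25, the simulated mean of Ȳ settles near 4 and its simulated variance near 0.16, matching λ and λ/n.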
For showing the minimum variance property of Ȳ, the Fisher information can be derived as follows. From Appendix B.7, the probability distribution f(y) of Y is

f(y | λ) = (1/y!) λ^y e^{−λ},
ln f(Y) = −ln(Y!) + Y ln λ − λ,
(∂/∂λ) ln f(Y) = 0 + Y/λ − 1,
E[((∂/∂λ) ln f(Y))²] = E[((Y − λ)/λ)²] = E[(Y − λ)²]/λ² = λ/λ² = 1/λ.

From the Cramér–Rao inequality, the lowest possible value of the variance of any unbiased estimator of λ is

1 / (n × Fisher Information) = λ/n.    (3)

However, as shown earlier,

Var(Ȳ) = λ/n.    (4)

Since (3) and (4) are identical, Ȳ is an unbiased estimator of λ with variance equal to the minimum possible variance. Thus, it is a minimum variance unbiased estimator of λ.

(b) [5 points] Consider λ̂_1 = (Y_1 + 2Y_n)/3 as another unbiased estimator of λ. Find the efficiency of λ̂_1 relative to Ȳ.

In other versions of this problem, the alternate estimator was λ̂_1 = (Y_1 + Y_n)/2. So we will work with a general estimator of the form λ̂_1 = (a_1 Y_1 + a_2 Y_n)/(a_1 + a_2). Note that E(λ̂_1) = E[(a_1 Y_1 + a_2 Y_n)/(a_1 + a_2)] = (a_1 E(Y_1) + a_2 E(Y_n))/(a_1 + a_2). Since E(Y_i) = λ, the right-hand side of this expression is equal to λ. Hence λ̂_1 is unbiased for all values of the coefficients a_i. Now, since Y_1, Y_2, ..., Y_n are independent random variables, it follows from Corollary 4.3 that Var(λ̂_1) = (a_1² Var(Y_1) + a_2² Var(Y_n))/(a_1 + a_2)². However, since Var(Y_i) = λ, this expression simplifies to Var(λ̂_1) = λ(a_1² + a_2²)/(a_1 + a_2)². Therefore, the efficiency of λ̂_1 relative to Ȳ is given by Var(Ȳ)/Var(λ̂_1) = (λ/n) / [λ(a_1² + a_2²)/(a_1 + a_2)²], which simplifies to (a_1 + a_2)² / [n(a_1² + a_2²)]. For the special cases in the two versions of this problem, your solution would use the specific values of the two coefficients from the beginning, ending up with a_1 = 1, a_2 = 2 giving E = 9/(5n), and a_1 = 1, a_2 = 1 giving E = 2/n.
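The general efficiency formula above is easy to tabulate. A minimal sketch (function name is illustrative) that reproduces both exam versions:

```python
def relative_efficiency(a1, a2, n):
    # Efficiency of (a1*Y_1 + a2*Y_n)/(a1 + a2) relative to Ybar:
    # Var(Ybar) / Var(lambda1_hat) = (a1 + a2)^2 / (n * (a1^2 + a2^2))
    return (a1 + a2) ** 2 / (n * (a1 ** 2 + a2 ** 2))

# The two exam versions, for a sample of size n = 10:
print(relative_efficiency(1, 2, 10))  # 9/(5*10) = 0.18
print(relative_efficiency(1, 1, 10))  # 2/10 = 0.2
```

Both efficiencies fall well below 1, reflecting that an estimator built from only two observations wastes most of the sample's information.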
4. [15 points] Let X_1, X_2, ..., X_n be a random sample of size n from a Uniform distribution on the interval (0, θ), with unknown parameter θ. Let X_(n) = max(X_1, X_2, ..., X_n). It was shown in class that the MLE of θ is equal to X_(n). Using the methods of Chapter 8, it is known that the sampling distribution of U = X_(n)/θ is given by g_n(u) = n u^{n−1}, 0 < u < 1, and 0 otherwise. Show that the MLE is an asymptotically unbiased and consistent estimator of θ.

The simplest approach to solving the first problem is to recognize that the sampling density of the random variable U is a Beta distribution, with α = n, β = 1. It follows, from Appendix C.1, that

E(U) = α/(α + β) = n/(n + 1),    Var(U) = αβ / [(α + β)²(α + β + 1)] = n / [(n + 1)²(n + 2)].

If one doesn't recognize the connection with the Beta distribution, one must be able to perform simple integration to obtain

E(U) = ∫_0^1 n u^n du = n/(n + 1),    E(U²) = ∫_0^1 n u^{n+1} du = n/(n + 2).

Given these expected values, it is easy to find the mean and variance of the random variable U. Now, E(X_(n)) = θ E(U) = θ n/(n + 1). Therefore,

B(θ) = Bias(X_(n)) = E(X_(n) − θ) = −θ/(n + 1).

Clearly, B(θ) → 0 as n → ∞. Hence, the MLE X_(n) is asymptotically unbiased.

Given the distribution of U, the easiest approach to proving consistency of X_(n) is to directly evaluate the probability P(|X_(n) − θ| < ε) = P(|U − 1| < ε/θ), which does require being able to integrate u^{n−1}. Since U takes values in the interval (0, 1),

P(|U − 1| < ε/θ) = P(1 − ε/θ < U < 1) = ∫_{1−ε/θ}^1 n u^{n−1} du = 1 − (1 − ε/θ)^n.

Note that for a small value of ε, 0 < 1 − ε/θ < 1. Now, as n → ∞, (1 − ε/θ)^n → 0. Therefore, P(|X_(n) − θ| < ε) → 1. Hence, X_(n) is a consistent estimator of θ.

If one wants to avoid doing integration, one can use Chebyshev's inequality,

P(|X_(n) − θ| > ε) ≤ MSE(X_(n))/ε²,

where MSE(X_(n)) = E[(X_(n) − θ)²] is the Mean Squared Error of X_(n). Now, using the mean and variance of U obtained above,

MSE(X_(n)) = Var(X_(n)) + B²(θ) = [θ²/(n + 1)²] [n/(n + 2) + 1] = 2θ² / [(n + 1)(n + 2)].

As n → ∞, MSE(X_(n)) converges to 0.
Therefore, by Chebyshev's inequality, P(|X_(n) − θ| > ε) → 0 as n → ∞. Hence, X_(n) is a consistent estimator of θ.
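Both properties are easy to see in simulation. The sketch below (illustrative, not part of the solution) averages the MLE X_(n) over many Uniform(0, θ) samples; the averages track E(X_(n)) = θn/(n + 1), so the bias −θ/(n + 1) visibly shrinks as n grows:

```python
import random

def mean_of_max(theta, n, reps, seed=1):
    # Average the MLE X_(n) = max(X_1, ..., X_n) over many
    # Uniform(0, theta) samples; should be close to theta * n/(n+1)
    rng = random.Random(seed)
    ests = [max(rng.uniform(0, theta) for _ in range(n)) for _ in range(reps)]
    return sum(ests) / reps

theta = 5.0
for n in (5, 50, 500):
    print(n, round(mean_of_max(theta, n, reps=4000), 3))
```

With θ = 5, the printed averages climb toward 5 as n increases from 5 to 500, in line with the asymptotic unbiasedness shown above.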
5. [20 points] Chronic anterior compartment syndrome is a condition characterized by exercise-induced pain in the lower leg. Swelling and impaired nerve and muscle function also accompany the pain, which is relieved by rest. Susan Beckham and colleagues conducted an experiment involving ten healthy runners and ten healthy cyclists to determine if pressure measurements within the anterior muscle compartment differ between runners and cyclists under two conditions, namely resting and after 80% maximal O2 consumption. You can assume that the measurements are normally distributed and that, under each condition, the variances of the measurements are the same for the two groups. The summary of the data, compartment pressure (in millimeters of mercury), for the two groups under these two conditions is given in the following table:

                                  Runners          Cyclists
Condition                         Mean    S        Mean    S
Resting                           14.5    3.92     11.1    3.98
80% maximal O2 consumption        12.2    3.49     11.5    4.95 (3.95 in one exam version)

(a) [8 points] Construct a 95% confidence interval for the difference in the mean compartment pressures between Runners and Cyclists under the resting condition.

Note that in this problem, n_1 = 10 and n_2 = 10 are both < 30, and we can assume normality of the measurements from the two populations, as well as equal variances, so we can apply Theorem 11.5. First, we compute the pooled estimate of the variance, which can be simplified because n_1 = n_2 = 10:

s_p² = [(n_1 − 1)s_1² + (n_2 − 1)s_2²] / (n_1 + n_2 − 2) = (s_1² + s_2²)/2 = (3.92² + 3.98²)/2 = 15.60.    (5)

Hence s_p = √15.60 = 3.95, ν = n_1 + n_2 − 2 = 18. If you were asked to find the 95% confidence interval, you must use the cut-off point t_{0.025,18} = 2.101. However, if you were asked to find the 90% confidence interval, you must use the cut-off point t_{0.05,18} = 1.734. The 100(1 − α)% CI for the difference µ_1 − µ_2 is given by

(x̄_1 − x̄_2) ± t_{α/2,18} s_p √(1/n_1 + 1/n_2) = (14.5 − 11.1) ± t_{α/2,18} (3.95) √(2/10).

Therefore, the 95% confidence interval for the difference in the mean compartment pressures between Runners and Cyclists under the resting condition is 3.4 ± 3.71, i.e., (−0.31 mm, 7.11 mm).
Similarly, the 90% CI for this difference is 3.4 ± 3.06, i.e., (0.34 mm, 6.46 mm).
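The arithmetic in part (a) can be reproduced with a short helper (a sketch; the summary values and the table cut-offs t_{0.025,18} = 2.101 and t_{0.05,18} = 1.734 are those used above, and the function name is illustrative):

```python
import math

def pooled_t_ci(xbar1, s1, n1, xbar2, s2, n2, t_crit):
    # Pooled two-sample t confidence interval for mu1 - mu2
    # (equal-variance assumption, as in this problem)
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    half = t_crit * math.sqrt(sp2) * math.sqrt(1 / n1 + 1 / n2)
    diff = xbar1 - xbar2
    return diff - half, diff + half

# Resting condition, 95% CI with t_{0.025,18} = 2.101 from the t table
lo, hi = pooled_t_ci(14.5, 3.92, 10, 11.1, 3.98, 10, t_crit=2.101)
print(round(lo, 2), round(hi, 2))  # about -0.31 and 7.11

# Resting condition, 90% CI with t_{0.05,18} = 1.734
lo90, hi90 = pooled_t_ci(14.5, 3.92, 10, 11.1, 3.98, 10, t_crit=1.734)
print(round(lo90, 2), round(hi90, 2))  # about 0.34 and 6.46
```

The same helper handles part (b) by substituting the 80% maximal O2 summary values and the appropriate t cut-off.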
(b) [8 points] Construct a 90% confidence interval for the difference in the mean compartment pressures between Runners and Cyclists under the 80% maximal O2 condition.

Under the 80% maximal O2 condition, you were asked to calculate either a 90% CI or a 95% CI. In addition, the value of S_2 was either 3.95 or 4.95, so the pooled estimate of variance is slightly different in the two versions. Furthermore, as in part (a), the simplified formula for the pooled variance can be used. In the first version, s_p² = (3.49² + 3.95²)/2 = 13.89, i.e., s_p = √13.89 = 3.73. In the second version, s_p² = (3.49² + 4.95²)/2 = 18.34, i.e., s_p = √18.34 = 4.28.

Note: The pooled estimate s_p is a weighted average of the two sample standard deviations, so it must lie between these two numbers. If your calculations produce a value of s_p that is either less than the smaller SD or greater than the larger SD, you should not use that value; check your calculations.

Now, under this condition, x̄_1 − x̄_2 = 12.2 − 11.5 = 0.7. On substituting the value of s_p for each version of the data, and the appropriate cut-off point, in the 100(1 − α)% CI for the difference in the mean compartment pressures between Runners and Cyclists under the 80% maximal O2 consumption condition,

(x̄_1 − x̄_2) ± t_{α/2,18} s_p √(2/10),

the following confidence intervals are obtained:

Version 1, 90% CI: 0.7 ± (1.734)(3.73)√(2/10), i.e., (−2.19, 3.59).
Version 1, 95% CI: 0.7 ± (2.101)(3.73)√(2/10), i.e., (−2.80, 4.20).
Version 2, 90% CI: 0.7 ± (1.734)(4.28)√(2/10), i.e., (−2.62, 4.02).
Version 2, 95% CI: 0.7 ± (2.101)(4.28)√(2/10), i.e., (−3.32, 4.72).

(c) [4 points] Consider the intervals constructed in parts (a) and (b) above. How would you interpret the results that you obtained?

For the CIs under the Resting condition, at each level of confidence the lower limit of the CI is close to zero (for the 90% CI, it is just above zero, and for the 95% CI, it is just below zero). For this condition, I am x% confident that the true value of the difference in mean compartment pressure may be almost the same (or slightly higher) for runners than for cyclists.
Since all four confidence intervals for the difference of mean compartment pressure under the 80% maximal O2 consumption condition contain 0, at each level of confidence the two groups have similar mean compartment pressures, and the two versions of the data do not affect the conclusion. The joint interpretation of both confidence intervals on your version of the exam will depend slightly on which combination you have out of the 4 possible combinations of confidence levels.
More informationParameter Estimation
Parameter Estimation Consider a sample of observations on a random variable Y. his generates random variables: (y 1, y 2,, y ). A random sample is a sample (y 1, y 2,, y ) where the random variables y
More informationHT Introduction. P(X i = x i ) = e λ λ x i
MODS STATISTICS Introduction. HT 2012 Simon Myers, Department of Statistics (and The Wellcome Trust Centre for Human Genetics) myers@stats.ox.ac.uk We will be concerned with the mathematical framework
More informationProbability and Distributions
Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated
More informationSTAT 302 Introduction to Probability Learning Outcomes. Textbook: A First Course in Probability by Sheldon Ross, 8 th ed.
STAT 302 Introduction to Probability Learning Outcomes Textbook: A First Course in Probability by Sheldon Ross, 8 th ed. Chapter 1: Combinatorial Analysis Demonstrate the ability to solve combinatorial
More informationLIST OF FORMULAS FOR STK1100 AND STK1110
LIST OF FORMULAS FOR STK1100 AND STK1110 (Version of 11. November 2015) 1. Probability Let A, B, A 1, A 2,..., B 1, B 2,... be events, that is, subsets of a sample space Ω. a) Axioms: A probability function
More informationSTAT 285: Fall Semester Final Examination Solutions
Name: Student Number: STAT 285: Fall Semester 2014 Final Examination Solutions 5 December 2014 Instructor: Richard Lockhart Instructions: This is an open book test. As such you may use formulas such as
More informationStat 135, Fall 2006 A. Adhikari HOMEWORK 6 SOLUTIONS
Stat 135, Fall 2006 A. Adhikari HOMEWORK 6 SOLUTIONS 1a. Under the null hypothesis X has the binomial (100,.5) distribution with E(X) = 50 and SE(X) = 5. So P ( X 50 > 10) is (approximately) two tails
More informationActuarial Science Exam 1/P
Actuarial Science Exam /P Ville A. Satopää December 5, 2009 Contents Review of Algebra and Calculus 2 2 Basic Probability Concepts 3 3 Conditional Probability and Independence 4 4 Combinatorial Principles,
More informationTesting Hypothesis. Maura Mezzetti. Department of Economics and Finance Università Tor Vergata
Maura Department of Economics and Finance Università Tor Vergata Hypothesis Testing Outline It is a mistake to confound strangeness with mystery Sherlock Holmes A Study in Scarlet Outline 1 The Power Function
More informationGEOMETRIC -discrete A discrete random variable R counts number of times needed before an event occurs
STATISTICS 4 Summary Notes. Geometric and Exponential Distributions GEOMETRIC -discrete A discrete random variable R counts number of times needed before an event occurs P(X = x) = ( p) x p x =,, 3,...
More informationCourse: ESO-209 Home Work: 1 Instructor: Debasis Kundu
Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear
More informationLecture 2: Streaming Algorithms
CS369G: Algorithmic Techniques for Big Data Spring 2015-2016 Lecture 2: Streaming Algorithms Prof. Moses Chariar Scribes: Stephen Mussmann 1 Overview In this lecture, we first derive a concentration inequality
More informationQualifying Exam in Probability and Statistics. https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf
Part 1: Sample Problems for the Elementary Section of Qualifying Exam in Probability and Statistics https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf Part 2: Sample Problems for the Advanced Section
More informationBTRY 4090: Spring 2009 Theory of Statistics
BTRY 4090: Spring 2009 Theory of Statistics Guozhang Wang September 25, 2010 1 Review of Probability We begin with a real example of using probability to solve computationally intensive (or infeasible)
More informationMaster s Written Examination
Master s Written Examination Option: Statistics and Probability Spring 016 Full points may be obtained for correct answers to eight questions. Each numbered question which may have several parts is worth
More informationSTAT 515 MIDTERM 2 EXAM November 14, 2018
STAT 55 MIDTERM 2 EXAM November 4, 28 NAME: Section Number: Instructor: In problems that require reasoning, algebraic calculation, or the use of your graphing calculator, it is not sufficient just to write
More informationFirst Year Examination Department of Statistics, University of Florida
First Year Examination Department of Statistics, University of Florida May 6, 2011, 8:00 am - 12:00 noon Instructions: 1. You have four hours to answer questions in this examination. 2. You must show your
More informationCorrelation and Regression
Correlation and Regression October 25, 2017 STAT 151 Class 9 Slide 1 Outline of Topics 1 Associations 2 Scatter plot 3 Correlation 4 Regression 5 Testing and estimation 6 Goodness-of-fit STAT 151 Class
More informationMathematical statistics
October 1 st, 2018 Lecture 11: Sufficient statistic Where are we? Week 1 Week 2 Week 4 Week 7 Week 10 Week 14 Probability reviews Chapter 6: Statistics and Sampling Distributions Chapter 7: Point Estimation
More informationQuiz 1. Name: Instructions: Closed book, notes, and no electronic devices.
Quiz 1. Name: Instructions: Closed book, notes, and no electronic devices. 1. What is the difference between a deterministic model and a probabilistic model? (Two or three sentences only). 2. What is the
More information7 Random samples and sampling distributions
7 Random samples and sampling distributions 7.1 Introduction - random samples We will use the term experiment in a very general way to refer to some process, procedure or natural phenomena that produces
More informationStatistics 135 Fall 2007 Midterm Exam
Name: Student ID Number: Statistics 135 Fall 007 Midterm Exam Ignore the finite population correction in all relevant problems. The exam is closed book, but some possibly useful facts about probability
More informationStatement: With my signature I confirm that the solutions are the product of my own work. Name: Signature:.
MATHEMATICAL STATISTICS Homework assignment Instructions Please turn in the homework with this cover page. You do not need to edit the solutions. Just make sure the handwriting is legible. You may discuss
More information[Chapter 6. Functions of Random Variables]
[Chapter 6. Functions of Random Variables] 6.1 Introduction 6.2 Finding the probability distribution of a function of random variables 6.3 The method of distribution functions 6.5 The method of Moment-generating
More informationMidterm 1 and 2 results
Midterm 1 and 2 results Midterm 1 Midterm 2 ------------------------------ Min. :40.00 Min. : 20.0 1st Qu.:60.00 1st Qu.:60.00 Median :75.00 Median :70.0 Mean :71.97 Mean :69.77 3rd Qu.:85.00 3rd Qu.:85.0
More information5.2 Fisher information and the Cramer-Rao bound
Stat 200: Introduction to Statistical Inference Autumn 208/9 Lecture 5: Maximum likelihood theory Lecturer: Art B. Owen October 9 Disclaimer: These notes have not been subjected to the usual scrutiny reserved
More informationTest Problems for Probability Theory ,
1 Test Problems for Probability Theory 01-06-16, 010-1-14 1. Write down the following probability density functions and compute their moment generating functions. (a) Binomial distribution with mean 30
More informationDA Freedman Notes on the MLE Fall 2003
DA Freedman Notes on the MLE Fall 2003 The object here is to provide a sketch of the theory of the MLE. Rigorous presentations can be found in the references cited below. Calculus. Let f be a smooth, scalar
More information1. (Regular) Exponential Family
1. (Regular) Exponential Family The density function of a regular exponential family is: [ ] Example. Poisson(θ) [ ] Example. Normal. (both unknown). ) [ ] [ ] [ ] [ ] 2. Theorem (Exponential family &
More informationMultivariate Analysis and Likelihood Inference
Multivariate Analysis and Likelihood Inference Outline 1 Joint Distribution of Random Variables 2 Principal Component Analysis (PCA) 3 Multivariate Normal Distribution 4 Likelihood Inference Joint density
More informationFIRST YEAR EXAM Monday May 10, 2010; 9:00 12:00am
FIRST YEAR EXAM Monday May 10, 2010; 9:00 12:00am NOTES: PLEASE READ CAREFULLY BEFORE BEGINNING EXAM! 1. Do not write solutions on the exam; please write your solutions on the paper provided. 2. Put the
More information