First Year Examination Department of Statistics, University of Florida


First Year Examination, Department of Statistics, University of Florida
August 19, 2010, 8:00 am - 12:00 noon

Instructions:
1. You have four hours to answer questions in this examination.
2. You must show your work to receive credit.
3. Questions 1 through 5 are the theory questions and questions 6 through 10 are the applied questions. You must do exactly four of the theory questions and exactly four of the applied questions.
4. Write your answers to the theory questions on the blank paper provided. Write only on one side of the paper, and start each question on a new page.
5. Write your answers to the applied questions on the exam itself. If you use extra sheets, write only on one side of the paper, and start each question on a new page.
6. While the 10 questions are equally weighted, some questions are more difficult than others.
7. The parts within a given question are not necessarily equally weighted.
8. You are allowed to use a calculator.

The following abbreviations and terminology are used throughout:

cdf = cumulative distribution function
iid = independent and identically distributed
mgf = moment generating function
ML = maximum likelihood
MSE = mean squared error
pdf = probability density function
pmf = probability mass function
R+ = (0, ∞)

You may use the following facts/formulas without proof:

Iterated Expectation Formula: E(X) = E[E(X | Y)].
Iterated Variance Formula: Var(X) = E[Var(X | Y)] + Var[E(X | Y)].
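As a quick illustration of the Iterated Variance Formula above (not part of the exam), the following sketch checks it by Monte Carlo on a simple two-stage model of my own choosing: Y ~ Bernoulli(1/2) and X | Y = y ~ Normal(y, 1), so E[Var(X|Y)] = 1, Var[E(X|Y)] = Var(Y) = 1/4, and Var(X) should be about 1.25.

```python
import random

random.seed(1)

# Monte Carlo check of Var(X) = E[Var(X|Y)] + Var[E(X|Y)] on a
# two-stage model: Y ~ Bernoulli(1/2), X | Y = y ~ Normal(y, 1).
N = 200_000
xs = [random.gauss(random.randint(0, 1), 1.0) for _ in range(N)]

mean_x = sum(xs) / N
var_x = sum((x - mean_x) ** 2 for x in xs) / N

# E[Var(X|Y)] = 1 and Var[E(X|Y)] = 1/4, so Var(X) = 1.25.
print(abs(var_x - 1.25) < 0.02)
```

The sample variance lands within Monte Carlo error of the theoretical 1.25.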

1. Three balls are distributed randomly into three urns. Define the random variable X to be the number of empty urns, and define the random variable Y to be the number of balls in the third urn.
(a) Find the joint pmf of X and Y.
(b) Are X and Y independent? (A yes/no answer is not sufficient here.)
(c) Find the conditional pmf of Y given X = 0.
(d) Find the conditional expectation and variance of X given Y = 0.

2. Let (X, Y) be a continuous bivariate random vector with joint pdf given by

f(x, y) = (15/16)(x² + y³) for −1 < x < y < 1, and 0 elsewhere.

(a) Find P(X + Y < 0).
(b) Find the marginal pdf of X.
(c) Find the conditional pdf of X given that Y = y.
(d) Find P(Y > 1/4 | X > 1/2). (Just set up the integral(s) - do not compute them.)
(e) Find another joint pdf that has the same marginals as f(x, y).

3. Suppose that Y ~ N(θ, 1), and that the prior density for θ, call it π(θ), is N(µ, τ²).
(a) Find the posterior density of θ given y, call it π(θ | y).
(b) Find the Bayes estimator of θ, call it θ̃(Y), and show that it can be written as a weighted average of the ML estimator of θ, which is θ̂(Y) = Y, and the prior mean of θ.
(c) Find the MSE of θ̃(Y) and the MSE of θ̂(Y) (as estimators of θ).
(d) Does either of these estimators dominate the other in terms of MSE? In other words, is one of the MSE functions uniformly below the other?
Now consider an alternative prior that is a mixture of k normal densities; that is,

π*(θ) = Σ_{i=1}^{k} [p_i / (√(2π) τ_i)] exp{ −(θ − µ_i)² / (2τ_i²) },

where each p_i ∈ (0, 1), Σ_{i=1}^{k} p_i = 1, each µ_i ∈ R and each τ_i² ∈ R+. Note that π*(θ) = Σ_{i=1}^{k} p_i π_i(θ), where π_i(θ) is a N(µ_i, τ_i²) density.
(e) Find the exact posterior density of θ given y, call it π*(θ | y). (Hint: You may use the fact that the marginal density of y under π(θ) is N(µ, τ² + 1).)
(f) Is π*(θ) a conjugate prior for the normal mean? (A yes/no answer is not sufficient here.)
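Question 1 is small enough to check by brute force. The sketch below (not part of the exam) enumerates all 3³ = 27 equally likely assignments of balls to urns and tabulates the joint pmf; for instance, X = 0 forces one ball per urn, so P(X = 0, Y = 1) = 3!/27 = 6/27.

```python
from itertools import product
from collections import Counter

# Enumerate all 27 equally likely assignments of 3 balls to 3 urns.
joint = Counter()
for assignment in product(range(3), repeat=3):   # urn index of each ball
    counts = [assignment.count(u) for u in range(3)]
    x = counts.count(0)          # X = number of empty urns
    y = counts[2]                # Y = number of balls in the third urn
    joint[(x, y)] += 1

pmf = {xy: n / 27 for xy, n in joint.items()}
# No empty urn means one ball per urn, so (X=0, Y=1) occurs in 3! ways.
print(pmf[(0, 1)])
```

Comparing pmf[(0, 1)] = 6/27 with P(X = 0) · P(Y = 1) already settles part (b) in the negative.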

4. Suppose that n ≥ 2 and that X₁, ..., X_n are iid Bernoulli(p).
(a) Derive the mgf of X₁ and use it to calculate the expectation and variance of X₁.
(b) Find the ML estimator of p. Is it unbiased?
(c) Find the best unbiased estimator of p.
(d) Find the ML estimator of p(1 − p). Is it unbiased?
(e) Find the Cramér-Rao Lower Bound for the variance of an unbiased estimator of p(1 − p).
(f) The sample variance, (1/(n − 1)) Σ_{i=1}^{n} (X_i − X̄)², is an unbiased estimator of p(1 − p). Is it best unbiased?

5. Let X₁, X₂ be iid Uniform(θ, θ + 1). In this question, we consider testing H₀: θ = 0 against H₁: θ > 0 using the following two tests:

Test I: Reject H₀ if X₁ > .95
Test II: Reject H₀ if X₁ + X₂ > C

where C > 0.
(a) Show that X₁ − θ is Uniform(0, 1).
(b) Derive the power function of Test I; that is, find β₁(θ) = P_θ(X₁ > .95) for θ ≥ 0.
(c) Let U₁, U₂ be iid Uniform(0, 1). Find the cdf of U₁ + U₂.
(d) Derive the power function of Test II; that is, find β₂(θ) = P_θ(X₁ + X₂ > C) for θ ≥ 0.
(e) What value of C should we use if we want the two tests to have the same size? With this value of C, is Test II more powerful than Test I?
(f) Explain how we could increase the power of Test II without affecting its size.
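A simulation sanity check for Question 5(e) (not part of the exam): under H₀, the X's are Uniform(0, 1), Test I has size 0.05 exactly, and for 1 ≤ C ≤ 2 the triangular-sum tail gives P(X₁ + X₂ > C) = (2 − C)²/2, so matching size 0.05 yields C = 2 − √0.1.

```python
import random

random.seed(0)

# Under H0 (theta = 0), X1, X2 ~ Uniform(0, 1).
# Size of Test II is (2 - C)^2 / 2 for 1 <= C <= 2; set it to 0.05.
C = 2 - 0.1 ** 0.5

N = 100_000
rej1 = sum(random.random() > 0.95 for _ in range(N)) / N
rej2 = sum(random.random() + random.random() > C for _ in range(N)) / N
print(abs(rej1 - 0.05) < 0.01, abs(rej2 - 0.05) < 0.01)
```

Both empirical rejection rates come out near 0.05, confirming the choice of C ≈ 1.684.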

6. An experiment is conducted to compare the effects of 3 brands of basketball shoes on jumping ability among high school basketball players (these are the only 3 brands of interest to the experimenter). A random sample of 18 high school basketball players is selected from public high schools in a large city. Each of the players jumps in each of the brands (in an order such that each brand is measured 1st, 2nd, and 3rd on 6 jumpers). The researcher does not believe there will be an order effect, so she does not include that in the model. The model is made up of the following components:

Y_ij = Jump height (inches) when player j wears brand i (i = 1, 2, 3; j = 1, ..., 18)
µ = Overall mean among all high school players among these 3 brands
α_i = Effect of brand i (fixed)
b_j = Effect of player j (random)
ε_ij = Random Error Term

Model: Y_ij = µ + α_i + b_j + ε_ij, where:

Σ_{i=1}^{3} α_i = 0,  b_j ~ NID(0, σ_b²),  ε_ij ~ NID(0, σ²),  COV(b_j, ε_ij') = 0 for all i, j, j'.

Derive the following terms:
a) V(Y_ij)
b) V(Y_i·) where Y_i· = Σ_{j=1}^{18} Y_ij
c) V(Ȳ_i·) where Ȳ_i· = (1/18) Σ_{j=1}^{18} Y_ij
d) COV(Ȳ_i·, Ȳ_i'·) for i ≠ i'
e) V(Ȳ_i· − Ȳ_i'·) for i ≠ i'
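The key fact behind parts (d) and (e) is that the shared player effect b_j induces COV(Ȳ_i·, Ȳ_i'·) = σ_b²/18 and then cancels from the difference of brand means, leaving V(Ȳ_i· − Ȳ_i'·) = 2σ²/18. The simulation below (not part of the exam; arbitrary values σ_b² = 4, σ² = 1) checks that cancellation:

```python
import random

random.seed(42)

# Mixed model Y_ij = mu + alpha_i + b_j + e_ij with b_j ~ N(0, sb2),
# e_ij ~ N(0, s2). The shared b_j cancels in Ybar_1. - Ybar_2., so
# V(Ybar_i. - Ybar_i'.) = 2 * s2 / 18 regardless of sb2.
sb2, s2, n_players = 4.0, 1.0, 18

diffs = []
for _ in range(20_000):
    b = [random.gauss(0, sb2 ** 0.5) for _ in range(n_players)]
    y1 = [b[j] + random.gauss(0, s2 ** 0.5) for j in range(n_players)]
    y2 = [b[j] + random.gauss(0, s2 ** 0.5) for j in range(n_players)]
    diffs.append(sum(y1) / n_players - sum(y2) / n_players)

m = sum(diffs) / len(diffs)
v = sum((d - m) ** 2 for d in diffs) / len(diffs)
print(abs(v - 2 * s2 / n_players) < 0.01)
```

The empirical variance of the difference matches 2σ²/18 ≈ 0.111 even though σ_b² is much larger.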

7. An experiment is conducted to measure the effects of 3 types of paint and 2 types of wood on the brightness of the paint (measured by a brightness meter). The experimenter conducts the experiment in 2 replicates of each combination of paint and wood, and runs the analysis as a multiple regression with two dummy variables for paint type, 1 dummy variable for wood type, and an intercept term. We fit the following sequence of models (assume the Total sum of squares = 1000):

X1 = 1 if Paint Type 1, 0 otherwise
X2 = 1 if Paint Type 2, 0 otherwise
X3 = 1 if Wood Type 1, 0 otherwise

Model 1: Y = β0 + ε, ε ~ NID(0, σ²)   R1² = 0.00
Model 2: Y = β0 + β1 X1 + β2 X2 + ε, ε ~ NID(0, σ²)   R2² = 0.65
Model 3: Y = β0 + β1 X1 + β2 X2 + β3 X3 + ε, ε ~ NID(0, σ²)   R3² = 0.72
Model 4: Y = β0 + β1 X1 + β2 X2 + β3 X3 + β4 X1X3 + β5 X2X3 + ε, ε ~ NID(0, σ²)   R4² = 0.75

Complete the following parts (for tests give null and alternative hypotheses in terms of model parameters, the test statistic, and rejection region based on an α = 0.05 significance level):
a) For each model, give the matrix X'X.
b) Test whether there is a main effect for paint type (ignoring wood type and interaction).
c) Test whether there is a main effect for wood type (controlling for paint type but ignoring interaction).
d) Test whether there is an interaction between paint and wood types (after controlling for main effects).
e) Give the analysis of variance table with degrees of freedom and sums of squares for paint, wood, interaction, and error (using the sequential sums of squares).
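Parts (b)-(d) are nested-model F tests that can be computed directly from the R² values. The sketch below (not part of the exam) assumes n = 12 observations (2 replicates × 3 paints × 2 woods), so the error degrees of freedom are n minus the number of parameters in the full model for each test:

```python
# General-linear-test F statistics from R^2 values of nested models.
def partial_f(r2_full, r2_red, df_num, df_err):
    """F = [(R2_full - R2_red)/df_num] / [(1 - R2_full)/df_err]."""
    return ((r2_full - r2_red) / df_num) / ((1 - r2_full) / df_err)

n = 12
# (b) paint main effect: Model 2 vs Model 1, 2 num df, n - 3 error df
f_paint = partial_f(0.65, 0.00, 2, n - 3)
# (c) wood main effect: Model 3 vs Model 2, 1 num df, n - 4 error df
f_wood = partial_f(0.72, 0.65, 1, n - 4)
# (d) interaction: Model 4 vs Model 3, 2 num df, n - 6 error df
f_int = partial_f(0.75, 0.72, 2, n - 6)
print(round(f_paint, 2), round(f_wood, 2), round(f_int, 2))
```

Each F statistic is then compared with the α = 0.05 critical value for its (numerator, denominator) degrees of freedom.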

8. An experiment is conducted to compare the average breaking strengths of 4 types of steel bars. The following table gives the means, standard deviations and sample sizes (number of replicates) for measurements made on each of the 4 steel types (these are the only types of steel of interest).

Type  Mean  SD  # reps
1     26    3   5
2     24    6   3
3     30    4   5
4     23    4   6

Complete the following parts:
a) Compute the between types (treatment) sum of squares and degrees of freedom.
b) Compute the within types (error) sum of squares and degrees of freedom.
c) Give the Analysis of Variance table.
d) Test whether the population means differ among the 4 steel types (α = 0.05 significance level).
e) Compute the Bonferroni minimum significant difference for each of the pairs of means with an experimentwise error rate of 0.05.
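The one-way ANOVA for Question 8 can be built entirely from the summary statistics: SS(treatment) = Σ nᵢ(ȳᵢ − ȳ)² and SS(error) = Σ (nᵢ − 1)sᵢ². The sketch below (not part of the exam) applies those formulas; the specific means/SDs/replicates are taken from the table above, which was reconstructed from a garbled scan, so treat the numbers as illustrative.

```python
# One-way ANOVA from group summary statistics.
def anova_from_summaries(means, sds, ns):
    n_tot = sum(ns)
    grand = sum(n * m for n, m in zip(ns, means)) / n_tot
    ss_trt = sum(n * (m - grand) ** 2 for n, m in zip(ns, means))
    ss_err = sum((n - 1) * s ** 2 for n, s in zip(ns, sds))
    df_trt, df_err = len(means) - 1, n_tot - len(means)
    f = (ss_trt / df_trt) / (ss_err / df_err)
    return ss_trt, df_trt, ss_err, df_err, f

ss_trt, df_trt, ss_err, df_err, f = anova_from_summaries(
    means=[26, 24, 30, 23], sds=[3, 6, 4, 4], ns=[5, 3, 5, 6]
)
print(df_trt, df_err, round(f, 2))
```

The F statistic is then compared with the α = 0.05 critical value on (df_trt, df_err) degrees of freedom for part (d).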

9. Use the following output to obtain the quantities given below:

Data (columns of X, then Y):

1  0  4    14
1  4  4    16
1  8  4    21
1  0  8    17
1  4  8    20
1  8  8    24

(X'X)^-1:

 1.9167  -0.0625  -0.2500
-0.0625   0.0156   0.0000
-0.2500   0.0000   0.0417

X'Y:       Beta-hat:
112        10.1667
504         0.8750
692         0.8333

P:

 0.5833   0.3333   0.0833   0.2500   0.0000  -0.2500
 0.3333   0.3333   0.3333   0.0000   0.0000   0.0000
 0.0833   0.3333   0.5833  -0.2500   0.0000   0.2500
 0.2500   0.0000  -0.2500   0.5833   0.3333   0.0833
 0.0000   0.0000   0.0000   0.3333   0.3333   0.3333
-0.2500   0.0000   0.2500   0.0833   0.3333   0.5833

Y'Y = 2158.00   Y'PY = 2156.33   Y'(I-P)Y = 1.67
Y'(J/n)Y = 2090.67   Y'(P-J/n)Y = 65.67

Total Corrected: Sum of Squares, Degrees of Freedom
Regression: Sum of Squares, Degrees of Freedom
Residual: Sum of Squares, Degrees of Freedom
s², s{β̂1}
Testing H0: β1 = β2 = 0: F-stat, Num df, Den df, Rejection Region (α = 0.05)
Predicted value for Y1, based on: (SHOW CALCULATIONS for each method) (i) x1'β̂  (ii) (PY)1
s²{Ŷ1}, s²{e1}
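The printed output above is internally consistent, which can be checked without any matrix inversion: β̂ must satisfy the normal equations X'X β̂ = X'Y. The sketch below (not part of the exam; the data rows are as transcribed above, with digits restored where the scan dropped them) computes X'X and X'Y from the raw rows and plugs in the printed β̂:

```python
# Rows of (X1, X2, Y) from the printed data, intercept added below.
rows = [(0, 4, 14), (4, 4, 16), (8, 4, 21), (0, 8, 17), (4, 8, 20), (8, 8, 24)]
X = [(1, x1, x2) for x1, x2, _ in rows]
Y = [y for _, _, y in rows]

# Build X'X and X'Y directly from the data.
xtx = [[sum(a[i] * a[j] for a in X) for j in range(3)] for i in range(3)]
xty = [sum(a[i] * y for a, y in zip(X, Y)) for i in range(3)]

b = (10.1667, 0.8750, 0.8333)          # beta-hat as printed
lhs = [sum(xtx[i][j] * b[j] for j in range(3)) for i in range(3)]
print(xty, [round(v, 1) for v in lhs])
```

X'Y reproduces (112, 504, 692), and X'X β̂ matches it to rounding error, confirming the printed estimates.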

10. Consider the simple linear regression model in scalar form:

Y_i = β0 + β1 X_i + ε_i,  ε_i iid with E(ε_i) = 0, V(ε_i) = σ², i = 1, ..., n.

Complete the following parts (show all work):
a) Derive the least squares estimators of β0 and β1.
b) Derive the mean and variance of the least squares estimator of β1.
c) Derive the covariance of the least squares estimators of β0 and β1.
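The closed-form answers to part (a) are β̂1 = Sxy/Sxx and β̂0 = Ȳ − β̂1 X̄, where Sxx = Σ(Xᵢ − X̄)² and Sxy = Σ(Xᵢ − X̄)(Yᵢ − Ȳ). A minimal sketch (not part of the exam) applying them to data generated exactly from y = 1 + 2x, so the estimators must recover the line:

```python
# Closed-form least squares estimators for simple linear regression.
def ols(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b1 = sxy / sxx                 # slope estimator
    return ybar - b1 * xbar, b1    # (intercept, slope)

x = [0, 1, 2, 3, 4]
y = [1, 3, 5, 7, 9]                # exactly y = 1 + 2x, no noise
b0, b1 = ols(x, y)
print(b0, b1)
# prints 1.0 2.0
```

For parts (b) and (c), writing β̂1 as a linear combination of the Yᵢ with weights (Xᵢ − X̄)/Sxx gives E(β̂1) = β1, V(β̂1) = σ²/Sxx, and Cov(β̂0, β̂1) = −σ² X̄/Sxx.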