Ph.D. Qualifying Exam, Monday–Tuesday, January 4–5, 2016


Put your solution to each problem on a separate sheet of paper.

Problem 1. (5106) Find the maximum likelihood estimate of $\theta$, where $\theta$ is a parameter in the multinomial distribution
$$(x_1, x_2, x_3, x_4) \sim M\big(n;\ 0.4(1+\theta),\ 0.3\theta,\ 0.6(1-2\theta),\ 0.5\theta\big).$$

(a) Choose a variable for the missing data and use the EM algorithm to estimate $\theta$ iteratively. Let $\theta^{(m)}$ be the current value of the unknown. Derive the formula that updates $\theta^{(m)}$ to $\theta^{(m+1)}$.

(b) Let $L(\theta)$ denote the likelihood at the parameter value $\theta$. Prove that $L(\theta^{(m+1)}) \ge L(\theta^{(m)})$.

Problem 2. (5106) Let $x = (x_1, \dots, x_n)$ be a given binary, Markovian sequence. In particular,
$$P(x_1 = 0) = 0.5, \quad P(x_1 = 1) = 0.5,$$
and
$$P(x_i = x_{i-1}) = p, \quad P(x_i = 1 - x_{i-1}) = 1 - p, \quad i = 2, \dots, n.$$
Let $y = (y_1, \dots, y_n)$ be a noisy observation of $x$; that is, $y_i = x_i + e_i$ with $e_i \sim N(0, \sigma^2)$, $i = 1, \dots, n$. Assuming $p$ and $\sigma^2$ are known, we can use the Maximum A Posteriori method to reconstruct $x$ from $y$ in the following form:
$$\hat{x} = \arg\max_{\{x_i\}} \sum_{i=1}^n \left( -\frac{(y_i - x_i)^2}{2\sigma^2} \right) + \sum_{i=2}^n \log\left( 1_{x_i = x_{i-1}}\, p + 1_{x_i \ne x_{i-1}}\, (1 - p) \right).$$
Write out pseudocode for a dynamic programming procedure that computes $\hat{x}$ in $O(n)$ time.
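As a concrete illustration of the dynamic program requested in Problem 2, here is a minimal Python sketch of a Viterbi-style forward recursion over the two states $\{0, 1\}$. It is one possible solution outline under the stated model, not an official answer, and the function and variable names are ours.

```python
import numpy as np

def map_reconstruct(y, p, sigma2):
    """Viterbi-style DP: O(n) time, since only two states are tracked per step."""
    n = len(y)
    log_trans = np.log(np.array([[p, 1 - p],
                                 [1 - p, p]]))   # log P(x_i = col | x_{i-1} = row)
    # score[s]: best log-posterior over sequences ending in state s at step i.
    # The uniform prior P(x_1 = 0) = P(x_1 = 1) = 0.5 is a constant and is dropped.
    score = np.array([-(y[0] - s) ** 2 / (2 * sigma2) for s in (0, 1)])
    back = np.zeros((n, 2), dtype=int)           # backpointers for the traceback
    for i in range(1, n):
        cand = score[:, None] + log_trans        # cand[t, s]: come from t, go to s
        back[i] = np.argmax(cand, axis=0)
        score = cand[back[i], (0, 1)] - (y[i] - np.array([0.0, 1.0])) ** 2 / (2 * sigma2)
    x_hat = np.zeros(n, dtype=int)               # trace the optimal path backwards
    x_hat[-1] = np.argmax(score)
    for i in range(n - 1, 0, -1):
        x_hat[i - 1] = back[i, x_hat[i]]
    return x_hat
```

Because each step only compares the two candidate predecessor states, the total work is linear in $n$, matching the required computational order.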

Problem 3. (5166) Let $\{X_1, X_2, \dots, X_n\}$ be a sample generated from the model
$$X_i = \epsilon_i + \theta \epsilon_{i-1},$$
where the $\epsilon_i$'s are independent and identically distributed random variables with mean zero and variance $\sigma^2$. Let $\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i$.

(a) Find the variance of $\bar{X}$.

(b) Let $s^2 = \frac{1}{n-1}\sum_{i=1}^n (X_i - \bar{X})^2$. Find $E(s^2)$.

(c) Compare $\mathrm{Var}(\bar{X})$ with $E(s^2/n)$ for $\theta = 0$, $\theta > 0$, and $\theta < 0$.

Problem 4. (5166) Consider a $2^{5-2}$ fractional factorial design with generators $4 = 12$ and $5 = 13$. After analyzing the results from this design, the experimenter decided to perform a second $2^{5-2}$ design exactly the same as the first one but with the signs of column 1 switched from $+$ to $-$ (a fold-over design).

(a) What are the advantages and disadvantages of a fractional factorial design compared with a full factorial design?

(b) What are the defining relation and design resolution of the first design?

(c) List the confounding patterns of all the two-factor interactions in the first design.

(d) What are the defining relation and design resolution of the combined design?

(e) List the confounding patterns of all the two-factor interactions in the combined design.
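The comparison in Problem 3(c) lends itself to a quick Monte Carlo check. The following Python sketch uses illustrative choices ($n = 50$, $\theta = 0.5$, $\sigma = 1$, Gaussian errors); the function names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def ma1_sample(n, theta, sigma=1.0):
    """One sample path of X_i = eps_i + theta * eps_{i-1}, i = 1, ..., n."""
    eps = rng.normal(0.0, sigma, size=n + 1)
    return eps[1:] + theta * eps[:-1]

def compare(n=50, theta=0.5, reps=20_000):
    stats = np.array([(x.mean(), x.var(ddof=1))
                      for x in (ma1_sample(n, theta) for _ in range(reps))])
    print("Var(Xbar) ~", stats[:, 0].var(), "  E(s^2/n) ~", stats[:, 1].mean() / n)

compare()  # with theta > 0, the positive lag-1 autocovariance inflates Var(Xbar)
```

Running it with $\theta > 0$, $\theta = 0$, and $\theta < 0$ suggests which way the inequality between $\mathrm{Var}(\bar{X})$ and $E(s^2/n)$ goes in each case.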

Problem 5. (5167) In a one-way layout random-effects experiment, consider the following linear model:
$$Y_{ti} = \mu + \tau_t + \epsilon_{ti}, \quad t = 1, \dots, k;\ i = 1, \dots, n_t,$$
where $\mu$ is the overall mean, $\{\tau_t,\ t = 1, \dots, k\}$ are iid $N(0, \sigma_\tau^2)$, $\{\epsilon_{ti},\ t = 1, \dots, k;\ i = 1, \dots, n_t\}$ are iid $N(0, \sigma_\epsilon^2)$, and the $\tau_t$'s and $\epsilon_{ti}$'s are independent. Let $N = \sum_{t=1}^k n_t$, $\bar{Y}_t = \frac{1}{n_t}\sum_{i=1}^{n_t} Y_{ti}$, and $\bar{Y} = \frac{1}{N}\sum_{t=1}^k \sum_{i=1}^{n_t} Y_{ti}$.

(a) Calculate the expected values and variances of $\bar{Y}_t$ and $\bar{Y}$.

(b) Find $\mathrm{Cov}(\bar{Y}_t, \bar{Y})$.

(c) Let $S_T = \sum_{t=1}^k n_t (\bar{Y}_t - \bar{Y})^2$, $S_R = \sum_{t=1}^k \sum_{i=1}^{n_t} (Y_{ti} - \bar{Y}_t)^2$, and $S_D = \sum_{t=1}^k \sum_{i=1}^{n_t} (Y_{ti} - \bar{Y})^2$. Show that $S_D = S_T + S_R$.

(d) Find the expected values of $S_T$ and $S_R$.

Problem 6. (5167) Consider the linear regression model with $p$ predictors:
$$Y = X\beta + e,$$
where $Y = (y_1, \dots, y_n)'$, $e = (e_1, \dots, e_n)'$ with $e_1, e_2, \dots, e_n$ iid $N(0, \sigma^2)$, $\beta = (\beta_0, \beta_1, \dots, \beta_p)'$, and $X$ is an $n \times (p+1)$ full-rank matrix. Let $\hat{\beta}$ be the least squares estimate of $\beta$. Define $\hat{Y} = X\hat{\beta}$ and $\hat{e} = Y - \hat{Y}$.

(a) Find the distribution of $\hat{e}$.

(b) Let $\mathbf{1} = (1, 1, \dots, 1)'$, a column of ones with length $n$. Define $SS_{reg} = (\hat{Y} - \bar{Y}\mathbf{1})'(\hat{Y} - \bar{Y}\mathbf{1})$. Find the distribution of $SS_{reg}$ under $H_0: \beta = 0$.

(c) Show that the statistic
$$F = \frac{SS_{reg}/p}{\hat{e}'\hat{e}/(n - p - 1)}$$
has an $F$-distribution under $H_0: \beta = 0$.
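To make the statistic in Problem 6(c) concrete, here is a small Python simulation under $H_0$: each run computes the $F$ statistic from simulated noise, and its value should behave like a draw from $F(p,\, n-p-1)$. The dimensions and $\sigma = 1$ are illustrative assumptions, not part of the exam.

```python
import numpy as np

rng = np.random.default_rng(1)

n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])  # n x (p+1), full rank
y = rng.normal(size=n)                # under H0: beta = 0, Y is pure N(0, 1) noise

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)            # least squares estimate
y_hat = X @ beta_hat
e_hat = y - y_hat
ss_reg = (y_hat - y.mean()) @ (y_hat - y.mean())  # (Yhat - Ybar 1)'(Yhat - Ybar 1)
F = (ss_reg / p) / (e_hat @ e_hat / (n - p - 1))  # compare against F(p, n-p-1)
print(F)
```

Repeating this many times and comparing the empirical quantiles of `F` against the $F(p,\, n-p-1)$ distribution is a quick sanity check on the claimed null distribution.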

Put your solution to each problem on a separate sheet of paper.

Problem 7. (5326) An urn contains $R$ red balls and $G$ green balls. Ed draws out $K$ balls one by one, randomly and without replacement. Whenever Ed draws two red balls in a row he wins a dollar. To be precise, Ed receives a dollar after the $i$th draw if he draws red balls on draws $i - 1$ and $i$. Note that under these rules, if Ed draws 3 red balls in a row he receives 2 dollars; if he draws 4 red balls in a row he receives 3 dollars; etc. Let $X$ be the amount of Ed's total winnings.

(a) Find $EX$.

(b) Find $\mathrm{Var}(X)$.

Problem 8. (5326) Consider the Gambler's Ruin problem: a gambler is betting on the tosses of a fair coin, winning \$1 when it comes up heads and losing \$1 when it comes up tails. The gambler has an initial fortune of $z$ dollars and a goal of $g$ dollars; that is, starting with $z$ dollars, he continues playing until he reaches $g$ dollars or goes broke (loses all his money). Let $\psi(z)$ denote the probability of reaching the goal $g$ as a function of the initial fortune $z$.

(a) Use the Law of Total Probability to find an equation that $\psi(z)$ must satisfy. Then find $\psi(z)$ by obtaining a solution of this equation which satisfies $\psi(0) = 0$ and $\psi(g) = 1$.

(b) Given that the gambler reached the goal, what is the probability that his first six tosses had a total of four heads and two tails? (Use your answer to part (a). To avoid complications, assume $6 \le z \le g - 6$, so that the gambler makes at least six tosses.)

Problem 9. (5327) Let $X_i$ be independently distributed as Poisson$(\beta N_i)$ for $i = 1, 2, \dots, n$, where $\beta$ is an unknown positive number and $N_1, N_2, \dots, N_n$ are known positive constants.

(a) Find the maximum likelihood estimate (MLE) of $\beta$. Call it $T_1$. Show that $T_1$ is unbiased for $\beta$.

(b) Observe that $T_2 = \frac{1}{n}\sum_{i=1}^n \frac{X_i}{N_i}$ is also an unbiased estimator of $\beta$. Which one of $T_1$ and $T_2$ has the smaller variance? Prove your claim.
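For Problem 8, the classical closed form for a fair coin is $\psi(z) = z/g$, and it is easy to check by simulation. The Python sketch below uses arbitrary parameters for illustration; the function name is ours.

```python
import random

rng = random.Random(2)

def ruin_prob(z, g, reps=100_000):
    """Monte Carlo estimate of psi(z): P(reach g before 0) with a fair coin."""
    wins = 0
    for _ in range(reps):
        fortune = z
        while 0 < fortune < g:
            fortune += 1 if rng.random() < 0.5 else -1   # heads +$1, tails -$1
        wins += (fortune == g)
    return wins / reps

print(ruin_prob(3, 10))  # the classical closed form gives psi(z) = z/g = 0.3
```

The same simulation, restricted to winning runs, can also be used to spot-check the conditional probability asked for in part (b).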

Problem 10. (5327) Let $X_1, X_2, \dots, X_n$ be i.i.d. from a distribution with pdf given by
$$f(x \mid \theta) = \begin{cases} \theta^{-c}\, c\, x^{c-1}\, e^{-(x/\theta)^c}, & \text{if } x > 0, \\ 0, & \text{otherwise}, \end{cases}$$
where $c > 0$ is a known constant and $\theta > 0$ is the unknown parameter.

(a) Find a complete sufficient statistic for $\theta$.

(b) Find the minimum variance unbiased estimator (MVUE) for $\theta$. (Hint: find an unbiased statistic by taking a suitable power of the complete sufficient statistic.)

(c) Find the most powerful (MP) test of size $\alpha$ for testing $H_0: \theta = \theta_0$ vs. $H_1: \theta = \theta_1$, where $0 < \theta_0 < \theta_1$ are known constants.

Problem 11. (6346) Suppose $(\Omega, \mathcal{F}_0, \mu)$ is a probability space, $\mathcal{F}$ is a $\sigma$-field with $\mathcal{F} \subset \mathcal{F}_0$, $X : \Omega \to \mathbb{R}$ is measurable with respect to $\mathcal{F}_0$, and $E|X| < \infty$.

(a) Define the conditional expectation $E(X \mid \mathcal{F})$.

(b) Let $Y = I_A$ and $X = I_B$ for some $A, B \in \mathcal{F}_0$ with $0 < \mu(A) < 1$, and let $\mathcal{G} = \sigma(Y)$. Find $E(X \mid \mathcal{G})$. What happens if $A$ and $B$ are the same set?

Problem 12. (6346) Let $B(t)$ be standard Brownian motion.

(a) Show that the joint density of $(B(t_1), B(t_2), \dots, B(t_n))$, where $0 < t_1 < \cdots < t_n$, is a product of univariate normal densities.

(b) Find $E[B(t)B(s)]$, $0 \le s \le t$.

(c) Define $X(t) = B(1/t)$ for $t > 0$, and $X(0) = 0$. Describe the process $X(t)$ in terms of its mean and covariance. Is $X(t)$ Brownian motion? Is it a Gaussian process?
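The covariance asked for in Problem 12(b) can be checked empirically by simulating Brownian motion as a cumulative sum of independent Gaussian increments; the standard fact is $E[B(s)B(t)] = \min(s, t)$. The times and replication count below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)

def brownian_at(times, reps=200_000):
    """Sample B at the given sorted times by summing N(0, dt) increments."""
    dt = np.diff(np.concatenate(([0.0], times)))
    increments = rng.normal(0.0, np.sqrt(dt), size=(reps, len(times)))
    return np.cumsum(increments, axis=1)

B = brownian_at(np.array([0.5, 2.0]))
print(np.mean(B[:, 0] * B[:, 1]))  # ~ 0.5 = min(0.5, 2.0), the standard covariance
```

The same sampler, evaluated at $1/t$ and $1/s$, gives an empirical handle on the mean and covariance of the time-inverted process $X(t)$ in part (c).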