Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu


Home Work: 1

1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) $n$ times, (d) an infinite number of times.

2. A coin is tossed until the same result appears twice in succession for the first time. To an outcome requiring $n$ tosses assign the probability $2^{-n}$. Describe the sample space. Evaluate the probability of the following events:
(a) $A$ = the experiment ends before the sixth toss;
(b) $B$ = an even number of tosses is required;
(c) $A \cap B$, $A \cup B$, $A \cap \bar{B}$, $\bar{A} \cap B$, $\bar{A} \cap \bar{B}$.

3. Consider the sample space $S = \{0, 1, 2, 3, \ldots\}$ and the sigma field of all subsets of $S$. To the elementary event $\{j\}$ assign the probability
$$P(\{j\}) = c\,\frac{2^j}{j!}, \quad j = 0, 1, \ldots$$
(a) Determine the constant $c$.
(b) Define the events $A$, $B$ and $C$ by
$$A = \{j : 2 \le j \le 4\}, \quad B = \{j : j \ge 3\}, \quad C = \{j : j \text{ is an odd integer}\}.$$
Evaluate $P(A)$, $P(B)$, $P(C)$, $P(A \cap B)$, $P(A \cap C)$, $P(B \cap C)$, $P(A \cap B \cap C)$ and verify the inclusion-exclusion formula for $P(A \cup B \cup C)$.

4. Let $A$ and $B$ be two arbitrary subsets of a sample space $S$. Find the smallest $\sigma$-field generated by the class $\{A, B\}$. Let $S = \{1, 2, 3, 4, 5, 6\}$ represent the sample space of tossing a die once and let $A = \{1, 3, 5\}$, $B = \{1, 4\}$. Specify the $\sigma$-field $\mathcal{A}$ generated by $C = \{A, B\}$. For $P(A) = 0.47$, $P(B) = 0.33$, $P(A \cap B) = 0.17$, assign probabilities to all events in $\mathcal{A}$.

5. Three tickets are drawn randomly without replacement from a set of tickets numbered 1 to 100. Show that the probability that the numbers on the selected tickets are in (i) arithmetic progression is $\frac{1}{66}$ and (ii) geometric progression is $105/\binom{100}{3}$.

6. Three players A, B and C play a series of games, none of which can be drawn, and their probabilities of winning any game are equal. The winner of each game scores 1 point and the series is won by the player who first scores 4 points. Out of the first three games, A wins 2 games and B wins 1 game. What is the probability that C will win the series?

7. Thirteen cards are selected randomly without replacement from a deck of 52 cards. Find the probability that there are at least 3 aces given that there are at least 2 aces among the selected cards.

8. Urn I contains 3 black and 5 red balls and urn II contains 4 black and 3 red balls. One urn is chosen randomly and a ball is drawn randomly from it; the ball is red. Find the probability that urn I was chosen.

9. A point P is placed randomly in a square with side 1 cm. Find the probability that the distance from P to the nearest side does not exceed $x$ cm.

10. If six dice are rolled, find the probability that at least two faces are equal.
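The arithmetic-progression answer in problem 5 can be sanity-checked by direct enumeration (an illustrative sketch, not part of the assignment): for each common difference $d$ there are $100 - 2d$ admissible starting points.

```python
from math import comb

# Count 3-element subsets of {1, ..., 100} whose sorted values form an
# arithmetic progression: for common difference d (1 <= d <= 49) the first
# term can be 1, ..., 100 - 2d.
ap_triples = sum(100 - 2 * d for d in range(1, 50))
total = comb(100, 3)

prob = ap_triples / total
assert abs(prob - 1 / 66) < 1e-12  # matches the stated answer 1/66
print(ap_triples, total, prob)
```

The count is 2450 out of $\binom{100}{3} = 161700$ subsets, which is exactly $\frac{1}{66}$.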

Home Work: 2

1. Find the probability $p_r$ that in a sample of $r$ random digits no two are equal.

2. What is the probability that among $k$ random digits (a) 0 does not appear, (b) 1 does not appear, (c) neither 0 nor 1 appears, (d) at least one of the two digits 0 and 1 does not appear? Let $A$ and $B$ represent the events in (a) and (b). Express the other events in terms of $A$ and $B$.

3. Find the value of $N \ge n$ such that the following probability is maximum (assume $n \ge k$):
$$P(N) = \frac{\binom{k}{i}\binom{N-k}{n-i}}{\binom{N}{n}}.$$

4. Suppose that in answering a question on a multiple choice test an examinee either knows the answer or guesses. Let $p$ be the probability that he/she knows the answer and $1 - p$ the probability that he/she guesses. Assume that the probability of answering a question correctly is 1 for an examinee who knows the answer and $\frac{1}{m}$ for an examinee who guesses, where $m$ is the number of multiple-choice alternatives. Find the conditional probability that an examinee knew the answer to a question, given that he/she has answered it correctly.

5. Thirteen cards are selected randomly without replacement from a deck of 52 cards. Find the probability that there are at least 3 aces given that there are at least 2 aces among the selected cards.

6. Consider an urn in which 4 balls have been placed by the following scheme. A fair coin is tossed four times; each time the coin falls heads, a white ball is placed in the urn, and each time the coin falls tails, a red ball is placed in the urn. (a) What is the probability that the urn will contain exactly 3 white balls? (b) What is the probability that the urn will contain exactly 3 white balls given that the first ball placed in the urn was white?

7. Let $\mathcal{A} = \{A_i : i \in \Lambda\}$ be a class of events. These events are said to be pairwise independent if $P(A_{i_1} \cap A_{i_2}) = P(A_{i_1})P(A_{i_2})$ for any two distinct indices $i_1, i_2 \in \Lambda$. Recall that two events $A$ and $B$ are called independent if $P(A \cap B) = P(A)P(B)$. Does pairwise independence imply (mutual) independence?

8. Let $A$ and $B$ be two events with $P(A) > 0$ and $P(B) > 0$. Prove that (i) if $A$ and $B$ are mutually exclusive ($A \cap B = \emptyset$) then $A$ and $B$ are not independent; (ii) if $A$ and $B$ are independent then $A$ and $B$ are not mutually exclusive.

9. Let $A$, $B$ and $C$ be independent events. In terms of $P(A)$, $P(B)$ and $P(C)$, express, for $k = 0, 1, 2, 3$: (i) $P$(exactly $k$ of the events $A$, $B$, $C$ will occur), (ii) $P$(at least $k$ of the events $A$, $B$, $C$ will occur), (iii) $P$(at most $k$ of the events $A$, $B$, $C$ will occur).

10. Let $A$ and $B$ be two independent events such that $P(A \cap B) = \frac{1}{6}$. (i) If $P$(neither of $A$ and $B$ occurs) $= \frac{1}{3}$, find $P(A)$ and $P(B)$. (ii) If $P$($A$ occurs and $B$ does not occur) $= \frac{1}{3}$, find $P(A)$ and $P(B)$. For each of parts (i) and (ii), are $P(A)$ and $P(B)$ uniquely determined?
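For problem 1 the answer is $p_r = \frac{10 \cdot 9 \cdots (10 - r + 1)}{10^r}$. A small sketch (illustrative, not part of the assignment) compares this formula against brute-force enumeration over all digit strings, which is feasible for small $r$:

```python
from itertools import product
from math import perm

def p_distinct(r: int) -> float:
    """P(no two of r random decimal digits are equal) = (10 P r) / 10^r."""
    return perm(10, r) / 10 ** r

# Brute-force check for small r: enumerate all 10^r equally likely strings.
for r in range(1, 5):
    favorable = sum(1 for s in product(range(10), repeat=r) if len(set(s)) == r)
    assert favorable / 10 ** r == p_distinct(r)

print(p_distinct(4))  # 0.504
```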

Home Work: 3

1. Show that if $P(A \cap B \cap C \cap D) > 0$, then
$$P(A \cap B \cap C \cap D) = P(A)\,P(B \mid A)\,P(C \mid A \cap B)\,P(D \mid A \cap B \cap C).$$
In how many different ways can you express this probability? Generalize to $n$ events.

2. If $A$ and $B$ are independent events, show that $\bar{A}$ and $\bar{B}$ are also independent events.

3. Two fair dice labeled I and II are thrown simultaneously and the outcomes of the top faces are observed. Let
$A$ = event that die I shows an even number,
$B$ = event that die II shows an odd number,
$C$ = event that the sum of the two faces is odd.
Are $A$, $B$ and $C$ independent events?

4. Let $S = \{HH, HT, TH, TT\}$ and $\mathcal{F}$ = the class of all subsets of $S$. Define $X$ by $X(\omega)$ = number of H's in $\omega$. Show that $X$ is a random variable.

5. Let $X$ be a random variable. Which of the following are random variables: (a) $X^2$, (b) $1/X$, given that $\{X = 0\} = \emptyset$, (c) $|X|$, (d) $\sqrt{X}$, given that $\{X < 0\} = \emptyset$?

6. Let $S = [0, 1]$ and $\mathcal{F}$ be the Borel $\sigma$-field of subsets of $S$. Define $X$ on $S$ as follows: $X(\omega) = \omega$ if $0 \le \omega \le \frac{1}{2}$ and $X(\omega) = \omega - \frac{1}{2}$ if $\frac{1}{2} < \omega \le 1$. Is $X$ a random variable?
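Problem 3 can be settled by enumerating the 36 equally likely outcomes; the sketch below (illustrative, not part of the assignment) checks each pairwise product and the triple product separately, since the two notions need not agree.

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely rolls
A = {(i, j) for i, j in outcomes if i % 2 == 0}        # die I even
B = {(i, j) for i, j in outcomes if j % 2 == 1}        # die II odd
C = {(i, j) for i, j in outcomes if (i + j) % 2 == 1}  # sum odd

def p(event):
    return Fraction(len(event), 36)

# Pairwise independence holds ...
assert p(A & B) == p(A) * p(B)
assert p(A & C) == p(A) * p(C)
assert p(B & C) == p(B) * p(C)
# ... but mutual independence fails: die I even and die II odd force an odd sum.
assert p(A & B & C) != p(A) * p(B) * p(C)
print(p(A & B & C), p(A) * p(B) * p(C))  # 1/4 vs 1/8
```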

Home Work: 4

1. Verify whether or not the following functions can serve as a p.m.f.:
(a) $f(x) = (x - 2)^2$ for $x = 1, 2, 3, 4$;
(b) $f(x) = \frac{e^{-\lambda}\lambda^x}{x!}$ for $x = 1, 2, 3, \ldots$, with (i) $\lambda > 0$, (ii) $\lambda < 0$.

2. A battery cell is labeled good if it works for at least 300 days in a clock; otherwise it is labeled bad. Three manufacturers, A, B and C, make cells, with probabilities 0.95, 0.90 and 0.80 respectively of making a good cell. Three identical clocks are selected and cells made by A, B and C are used in clocks 1, 2 and 3 respectively. Let $X$ be the total number of clocks working after 300 days. Find the probability mass function (p.m.f.) of $X$ and also plot the distribution function (d.f.) of $X$.

3. A fair die is rolled independently three times. Define
$$X_i = \begin{cases} 1 & \text{if the } i\text{-th roll yields a perfect square} \\ 0 & \text{otherwise.} \end{cases}$$
Find the p.m.f. of $X_i$. Suppose $Y = X_1 + X_2 + X_3$. Find the p.m.f. of $Y$ and also its d.f. Find the mean and variance of $Y$. Verify Chebyshev's inequality in this case.

4. For what values of $k$ can $f_X(x) = (1 - k)k^x$, $x = 0, 1, 2, 3, \ldots$ serve as a p.m.f. of a random variable $X$? Find the mean and variance of $X$.

5. Let
$$F(x) = \begin{cases} 0 & x \le 0 \\ 1 - \frac{2}{3}e^{-x/3} - \frac{1}{3}e^{-[x/3]} & x > 0 \end{cases}$$
where $[a]$ means the largest integer $\le a$. Show that $F(x)$ is a d.f. Determine (i) $P(X > 6)$, (ii) $P(X = 5)$, (iii) $P(5 \le X \le 8)$.

6. For the d.f.
$$F_X(x) = \begin{cases} 0 & x < -1 \\ \frac{x+2}{4} & -1 \le x < 1 \\ 1 & x \ge 1 \end{cases}$$
sketch the graph of $F_X(x)$. Obtain the decomposition $F_X(x) = c_1 F_c(x) + c_2 F_d(x)$, where $F_c(x)$ is purely continuous and $F_d(x)$ is purely discrete. Find the mean and variance of $F_X(x)$, $F_c(x)$ and $F_d(x)$.

7. The daily water consumption $X$ (in millions of liters) is a random variable with p.d.f. $f_X(x) = \frac{x}{9}e^{-x/3}$, $x > 0$. (a) Find the d.f., $E(X)$ and $V(X)$. (b) Find the probability that on a given day the water consumption is not more than 6 million liters.
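A sketch for problem 3 (illustrative, not part of the assignment): the perfect squares on a die face are 1 and 4, so each $X_i$ is Bernoulli$(\frac{1}{3})$ and $Y$ is Binomial$(3, \frac{1}{3})$; the enumeration below confirms the p.m.f., mean and variance exactly.

```python
from fractions import Fraction
from itertools import product
from math import comb

# Enumerate all 6^3 equally likely triples of rolls.
squares = {1, 4}  # perfect squares among the die faces 1..6
pmf = {y: Fraction(0) for y in range(4)}
for rolls in product(range(1, 7), repeat=3):
    y = sum(r in squares for r in rolls)
    pmf[y] += Fraction(1, 6 ** 3)

# Y ~ Binomial(3, 1/3)
for y in range(4):
    assert pmf[y] == comb(3, y) * Fraction(1, 3) ** y * Fraction(2, 3) ** (3 - y)

mean = sum(y * p for y, p in pmf.items())                  # = 1
var = sum(y * y * p for y, p in pmf.items()) - mean ** 2   # = 2/3
print(mean, var)
```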

Home Work: 5

1. A mode of a random variable $X$ of the continuous or discrete type is a value that maximizes the probability density function (p.d.f.) or the probability mass function (p.m.f.) $f(x)$. If there is only one such $x$, it is called the mode of the distribution. Find the mode of each of the following distributions:
(a) $f(x) = \left(\frac{1}{2}\right)^x$, $x = 1, 2, 3, \ldots$;
(b) $f(x) = 12x^2(1 - x)$, $0 < x < 1$, and zero elsewhere;
(c) $f(x) = \frac{1}{2}x^2 e^{-x}$, $0 < x < \infty$, and zero elsewhere.

2. A median of a distribution of a random variable $X$ of the discrete or continuous type is a value $x$ such that $P(X < x) \le \frac{1}{2}$ and $P(X \le x) \ge \frac{1}{2}$. If there is only one such $x$, it is called the median of the distribution. Find the median of each of the following distributions:
(a) $f(x) = \binom{4}{x}\left(\frac{1}{4}\right)^x\left(\frac{3}{4}\right)^{4-x}$, $x = 0, 1, 2, 3, 4$, and zero elsewhere;
(b) $f(x) = 3x^2$, $0 < x < 1$, and zero elsewhere;
(c) $f(x) = \frac{1}{\pi(1 + x^2)}$, $-\infty < x < \infty$.

3. Let $f(x) = 1$ for $0 < x < 1$ and zero elsewhere, be the p.d.f. of $X$. Find the distribution function and the p.d.f. of $Y = \sqrt{X}$.

4. Let $f(x) = \frac{x}{6}$ for $x = 1, 2, 3$ and zero elsewhere, be the p.m.f. of $X$. Find the p.m.f. of $Y = X^2$ and of $Z = \frac{X}{1+X}$.

5. Let $f(x) = \frac{4 - x}{16}$ for $-2 < x < 2$ and zero elsewhere, be the p.d.f. of $X$. Sketch the distribution function and p.d.f. of $X$. If $Y = |X|$, compute $P(Y < 1)$. If $Z = X^2$, compute $P(Z < \frac{1}{4})$.

6. What is the value of $\int_0^\infty x^n e^{-x}\,dx$, where $n$ is a non-negative integer?

7. Let $f_X(x) = \lambda e^{-\lambda x}$ for $x > 0$ and zero elsewhere. Define $Y = [X]$, the greatest integer in $X$. Find the p.m.f. of $Y$. Evaluate the mean and variance of $X$ and $Y$.
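For problem 7, integrating the exponential density over $[k, k+1)$ gives $P(Y = k) = e^{-\lambda k}(1 - e^{-\lambda})$, a geometric law. A quick seeded Monte Carlo sketch (the value $\lambda = 0.5$ is chosen arbitrarily for illustration; it is not specified in the problem):

```python
import math
import random

random.seed(0)
lam = 0.5          # illustrative rate, not part of the problem statement
n = 200_000

# Y = [X] for X ~ Exp(lam); theory: P(Y = k) = exp(-lam*k) * (1 - exp(-lam))
ys = [int(random.expovariate(lam)) for _ in range(n)]

p_hat0 = ys.count(0) / n
p_theory0 = 1 - math.exp(-lam)                        # P(Y = 0)
mean_theory = math.exp(-lam) / (1 - math.exp(-lam))   # E(Y), geometric mean

assert abs(p_hat0 - p_theory0) < 0.01
assert abs(sum(ys) / n - mean_theory) < 0.05
print(p_hat0, sum(ys) / n)
```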

Home Work: 6

1. If $X_1$ and $X_2$ are random variables of the discrete type having the joint probability mass function
$$f(x_1, x_2) = \frac{x_1 + 2x_2}{18} \quad \text{for } (x_1, x_2) = (1, 1), (1, 2), (2, 1), (2, 2),$$
and zero elsewhere, find the marginal p.m.f.s of $X_1$ and $X_2$. Also find the conditional mean and variance of $X_2$ given $X_1 = 2$.

2. Three balls are placed randomly in 3 boxes $B_1$, $B_2$ and $B_3$. Let $N$ be the total number of occupied boxes and let $X_i$ be the total number of balls in box $B_i$. Find the p.m.f. of $(N, X_1)$ and of $(X_1, X_2)$. Find the marginal p.m.f. of $X_1$ in both cases. Can it be different?

3. Let $F(x, y)$ be a function of two variables:
$$F(x, y) = \begin{cases} 0 & \text{if } y < 0 \text{ or } x < 0 \text{ or } x + y < 1 \\ 1 & \text{otherwise.} \end{cases}$$
Can this be the distribution function of some random vector $(X, Y)$? Justify your answer.

4. Let $X_1$ and $X_2$ have joint p.d.f. $f(x_1, x_2) = x_1 + x_2$ for $0 < x_1 < 1$, $0 < x_2 < 1$, and zero elsewhere. Find the conditional mean and variance of $X_2$ given $X_1 = x_1$, for $0 < x_1 < 1$.

5. If $f(x_1, x_2) = e^{-(x_1 + x_2)}$ for $0 < x_1, x_2 < \infty$ and zero elsewhere, is the joint p.d.f. of $(X_1, X_2)$, show that $X_1$ and $X_2$ are stochastically independent and also that $E\left(e^{t(X_1 + X_2)}\right) = (1 - t)^{-2}$, for $0 < t < 1$.
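A sketch verifying the marginals and the conditional moments in problem 1 by direct summation over the four support points (illustrative, not part of the assignment):

```python
from fractions import Fraction

# Joint p.m.f. f(x1, x2) = (x1 + 2*x2)/18 on {1,2} x {1,2}
support = [(1, 1), (1, 2), (2, 1), (2, 2)]
f = {(x1, x2): Fraction(x1 + 2 * x2, 18) for x1, x2 in support}
assert sum(f.values()) == 1  # valid p.m.f.

# Marginal p.m.f.s
f1 = {x1: sum(p for (a, _), p in f.items() if a == x1) for x1 in (1, 2)}
f2 = {x2: sum(p for (_, b), p in f.items() if b == x2) for x2 in (1, 2)}
assert f1 == {1: Fraction(8, 18), 2: Fraction(10, 18)}

# Conditional distribution of X2 given X1 = 2
cond = {x2: f[(2, x2)] / f1[2] for x2 in (1, 2)}
cmean = sum(x2 * p for x2, p in cond.items())
cvar = sum(x2 ** 2 * p for x2, p in cond.items()) - cmean ** 2
print(cmean, cvar)  # 8/5 and 6/25
```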

Home Work: 7

1. Let $X$ and $Y$ have the joint p.d.f. $f(x, y) = 1$ for $-x < y < x$, $0 < x < 1$, and zero elsewhere. Find the graph of $E(Y \mid X = x)$ as a function of $x$ and also the graph of $E(X \mid Y = y)$ as a function of $y$.

2. Let $f_{X_1 \mid X_2 = x_2}(x_1) = \frac{c_1 x_1}{x_2^2}$ for $0 < x_1 < x_2$, $0 < x_2 < 1$, and zero elsewhere. Also let $f_{X_2}(x_2) = c_2 x_2^4$ for $0 < x_2 < 1$ and zero elsewhere. Find the constants $c_1$ and $c_2$. Determine the joint p.d.f. of $X_1$ and $X_2$. Compute $P(0.25 < X_1 < 0.5)$. Also find $P(0.25 < X_1 < 0.5 \mid X_2 = 5/8)$.

3. The random vector $(X, Y)$ is said to have a bivariate normal distribution if the joint p.d.f. of $(X, Y)$ is
$$f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left\{-\frac{1}{2(1-\rho^2)}\left[\left(\frac{x-\mu_X}{\sigma_X}\right)^2 - 2\rho\left(\frac{x-\mu_X}{\sigma_X}\right)\left(\frac{y-\mu_Y}{\sigma_Y}\right) + \left(\frac{y-\mu_Y}{\sigma_Y}\right)^2\right]\right\}$$
for $-\infty < x, y < \infty$, where $-1 < \rho < 1$ and $\sigma_X, \sigma_Y > 0$.
(a) Show that $f_{X,Y}(x, y)$ is a proper bivariate density function.
(b) Find the joint moment generating function of $(X, Y)$.
(c) Find the marginal probability density functions of $X$ and $Y$.
(d) Find the conditional p.d.f. of $X \mid Y = y$.
(e) Find $E(X)$, $E(Y)$, $V(X)$, $V(Y)$, $\mathrm{Cov}(X, Y)$ and $\mathrm{Corr}(X, Y)$.
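A seeded simulation sketch related to part (e) of problem 3: generating $(X, Y)$ with the standard construction $X = \mu_X + \sigma_X Z_1$, $Y = \mu_Y + \sigma_Y(\rho Z_1 + \sqrt{1-\rho^2}\,Z_2)$ and checking that the sample correlation is close to $\rho$. All parameter values below are arbitrary illustrative choices.

```python
import math
import random

random.seed(1)
mu_x, mu_y, s_x, s_y, rho = 1.0, -2.0, 2.0, 3.0, 0.6  # illustrative values
n = 100_000

xs, ys = [], []
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    xs.append(mu_x + s_x * z1)
    ys.append(mu_y + s_y * (rho * z1 + math.sqrt(1 - rho ** 2) * z2))

mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
vx = sum((x - mx) ** 2 for x in xs) / n
vy = sum((y - my) ** 2 for y in ys) / n
corr = cov / math.sqrt(vx * vy)

assert abs(corr - rho) < 0.02                # Corr(X, Y) = rho
assert abs(cov - rho * s_x * s_y) < 0.1      # Cov(X, Y) = rho * sigma_X * sigma_Y
print(corr, cov)
```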

Home Work: 8

1. For a random vector $(X, Y)$, show that
$$E(Y) = E_X(E(Y \mid X)), \qquad V(Y) = E_X(V(Y \mid X)) + V_X(E(Y \mid X)).$$

2. Suppose that $(X, Y)$ has a bivariate normal distribution $N_2(3, 1, 16, 25, 0.6)$. Find the following probabilities: (a) $P(3 < Y < 8)$, (b) $P(3 < Y < 8 \mid X = 7)$, (c) $P(-3 < X < 3)$ and $P(-3 < X < 3 \mid Y = 4)$.

3. Let $(X, Y)$ have a bivariate normal distribution $N_2(20, 10, 1, 25, \rho)$, where $\rho > 0$. If $P(4 < Y < 16 \mid X = 20) = 0.954$, find $\rho$.

4. Let $X_1, \ldots, X_{20}$ be i.i.d. random variables with mean 2 and variance 3. Let
$$Y = \sum_{i=1}^{15} X_i, \qquad Z = \sum_{i=11}^{20} X_i.$$
Find $E(Y)$, $V(Y)$, $E(Z)$, $V(Z)$ and $\rho(Y, Z)$.

5. Let $(X, Y)$ be such that $E(X) = 15$, $E(Y) = 20$, $V(X) = 25$, $V(Y) = 100$, $\rho(X, Y) = -0.6$. If $U = X - Y$ and $V = 2X - 3Y$, find the correlation between $U$ and $V$.

6. Suppose that the lifetime of light bulbs of a certain kind follows the exponential distribution with mean life 50 hours. Find the probability that among 8 such light bulbs, two will last less than 40 hours, three will last anywhere from 40 to 60 hours, two will last anywhere from 60 to 80 hours, and one will last more than 80 hours.
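For problem 4, the overlap $X_{11}, \ldots, X_{15}$ gives $\mathrm{Cov}(Y, Z) = 5\sigma^2 = 15$, so $\rho(Y, Z) = 15/\sqrt{45 \cdot 30} = 1/\sqrt{6}$. A seeded Monte Carlo sketch, taking the $X_i$ normal for concreteness (the moments do not depend on this distributional choice):

```python
import math
import random

random.seed(2)
n_rep = 50_000
sigma = math.sqrt(3)  # X_i: mean 2, variance 3 (normal chosen for illustration)

ys, zs = [], []
for _ in range(n_rep):
    x = [random.gauss(2, sigma) for _ in range(20)]
    ys.append(sum(x[:15]))   # Y = X_1 + ... + X_15
    zs.append(sum(x[10:]))   # Z = X_11 + ... + X_20 (5 terms shared with Y)

my, mz = sum(ys) / n_rep, sum(zs) / n_rep
vy = sum((y - my) ** 2 for y in ys) / n_rep
vz = sum((z - mz) ** 2 for z in zs) / n_rep
cov = sum((y - my) * (z - mz) for y, z in zip(ys, zs)) / n_rep
rho = cov / math.sqrt(vy * vz)

assert abs(my - 30) < 0.2 and abs(mz - 20) < 0.2
assert abs(vy - 45) < 1.5 and abs(vz - 30) < 1.0
assert abs(rho - 1 / math.sqrt(6)) < 0.02
print(my, mz, rho)
```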

Home Work: 9

1. Suppose $X_1$ and $X_2$ are i.i.d. random variables uniformly distributed over the interval $(0, 1)$. Consider $Y_1 = X_1 + X_2$ and $Y_2 = X_2 - X_1$. Find the joint p.d.f. of $Y_1$ and $Y_2$ and also the marginal p.d.f.s of $Y_1$ and $Y_2$.

2. Let $X_1$ and $X_2$ be two independent standard normal random variables. Let $Y_1 = X_1 + X_2$ and $Y_2 = X_1/X_2$. Find the joint p.d.f. of $Y_1$ and $Y_2$ and also the marginal p.d.f.s of $Y_1$ and $Y_2$.

3. Do problem 2 again, assuming that $X_i$ follows gamma$(n_i, \lambda)$.

4. Let $X_1$, $X_2$ and $X_3$ be i.i.d. $N(0, 1)$ random variables. Consider the random variables $Y_1$, $Y_2$ and $Y_3$ defined as follows:
$$X_1 = Y_1 \cos Y_2 \sin Y_3, \quad X_2 = Y_1 \sin Y_2 \sin Y_3, \quad X_3 = Y_1 \cos Y_3,$$
where $0 \le Y_1 < \infty$, $0 \le Y_2 < 2\pi$, $0 \le Y_3 < \pi$. Find the p.d.f.s of $Y_1$, $Y_2$ and $Y_3$.

5. Let $X_1$, $X_2$ and $X_3$ be i.i.d. with p.d.f. $f(x) = e^{-x}$, $0 < x < \infty$, zero elsewhere. Find the p.d.f.s of $Y_1$, $Y_2$ and $Y_3$, where
$$Y_1 = \frac{X_1}{X_1 + X_2}, \qquad Y_2 = \frac{X_1 + X_2}{X_1 + X_2 + X_3}, \qquad Y_3 = X_1 + X_2 + X_3.$$

6. Determine the mean and variance of the sample mean $\bar{X}$ of a random sample of size 9 from a distribution having p.d.f. $f(x) = 4x^3$, $0 < x < 1$, zero elsewhere.
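For problem 6, $E(X) = \int_0^1 4x^4\,dx = \frac{4}{5}$ and $V(X) = \frac{4}{6} - \frac{16}{25} = \frac{2}{75}$, so $E(\bar{X}) = \frac{4}{5}$ and $V(\bar{X}) = \frac{2}{675}$. A seeded sketch using inverse-transform sampling (since $F(x) = x^4$, one can take $X = U^{1/4}$):

```python
import random

random.seed(3)
n_rep = 100_000

# Sample X with pdf 4x^3 on (0,1) via inverse transform: F(x) = x^4 => X = U^(1/4)
def sample_mean_of_9() -> float:
    return sum(random.random() ** 0.25 for _ in range(9)) / 9

means = [sample_mean_of_9() for _ in range(n_rep)]
m = sum(means) / n_rep
v = sum((x - m) ** 2 for x in means) / n_rep

assert abs(m - 4 / 5) < 0.002       # E(X-bar) = 4/5
assert abs(v - 2 / 675) < 0.0005    # V(X-bar) = (2/75)/9 = 2/675
print(m, v)
```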

Home Work: 10

1. Let $X_1$, $X_2$ and $X_3$ be three mutually independent chi-square random variables with $r_1$, $r_2$ and $r_3$ degrees of freedom, respectively.
(a) Show that $Y_1 = X_1/X_2$ and $Y_2 = X_1 + X_2$ are independent and that $Y_2$ is a chi-square random variable with $r_1 + r_2$ degrees of freedom.
(b) Find the density functions of
$$\frac{X_1/r_1}{X_2/r_2} \qquad \text{and} \qquad \frac{X_3/r_3}{(X_1 + X_2)/(r_1 + r_2)}.$$

2. Suppose $X$ follows $N(0, 1)$ and $Y$ follows $\chi^2_n$, and they are independent. Find the density function of the Student-$t$ statistic with $n$ degrees of freedom,
$$T = \frac{X}{\sqrt{Y/n}}.$$

3. If $T$ has a Student-$t$ distribution with 14 degrees of freedom, find $c$ from the tables such that $P(|T| > c) = 0.05$.

4. Let $f(x) = \frac{1}{x^2}$, $1 < x < \infty$, zero elsewhere, be the p.d.f. of a random variable $X$. Consider a random sample of size 72 from the distribution having this p.d.f. Compute approximately the probability that more than 50 of the observations in the random sample are less than 3.

5. Let $\{X_n\}$ be a sequence of i.i.d. random variables with p.d.f. $f(x) = e^{-x}$, $x > 0$. Let $Y_n = \max\{X_1, \ldots, X_n\}$ and $Z_n = Y_n - \ln n$. Show that
$$\lim_{n \to \infty} F_{Z_n}(z) = F_Z(z) = e^{-e^{-z}}, \quad -\infty < z < \infty.$$
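A deterministic sketch for problem 5 (illustrative, not part of the assignment): for Exp(1) samples the exact finite-$n$ CDF of $Z_n$ is $F_{Z_n}(z) = (1 - e^{-z}/n)^n$, and the check below watches it converge to the Gumbel limit $e^{-e^{-z}}$ as $n$ grows.

```python
import math

# Exact finite-n CDF of Z_n = max(X_1..X_n) - ln n for Exp(1) samples:
# F_n(z) = (1 - exp(-z)/n)^n, valid when exp(-z) <= n.
def F_n(z: float, n: int) -> float:
    return (1 - math.exp(-z) / n) ** n

def gumbel(z: float) -> float:
    return math.exp(-math.exp(-z))

# The approximation error shrinks monotonically with n at each fixed z.
for z in (-1.0, 0.0, 1.0, 2.0):
    errs = [abs(F_n(z, n) - gumbel(z)) for n in (10, 100, 1000, 10_000)]
    assert errs == sorted(errs, reverse=True)  # error decreases with n
    assert errs[-1] < 1e-3                     # already close at n = 10_000

print([round(F_n(0.0, n), 4) for n in (10, 100, 10_000)], round(gumbel(0.0), 4))
```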

Home Work: 11

1. Let $X_1, \ldots, X_n$ represent a random sample from each of the distributions having the following density or mass functions. In each case find the maximum likelihood estimator (MLE) of $\theta$.
(1) $f(x; \theta) = \frac{\theta^x e^{-\theta}}{x!}$, for $x = 0, 1, \ldots$, $0 < \theta < \infty$, and zero elsewhere, with the convention $f(0; 0) = 1$.
(2) $f(x; \theta) = \theta x^{\theta - 1}$, for $0 < x < 1$, and zero elsewhere.
(3) $f(x; \theta) = e^{-(x - \theta)}$, for $\theta < x < \infty$, $-\infty < \theta < \infty$, and zero elsewhere.
(4) $f(x; \theta) = \frac{1}{2} e^{-|x - \theta|}$, for $-\infty < x < \infty$.

2. Let $X_1, \ldots, X_n$ be a random sample from the density function
$$f(x; \mu, \sigma) = \frac{1}{2\sqrt{3}\,\sigma} \quad \text{for } \mu - \sqrt{3}\sigma < x < \mu + \sqrt{3}\sigma,$$
and zero otherwise. Find the maximum likelihood estimators of $\mu$ and $\sigma$.

3. Let $X_1, \ldots, X_n$ be a random sample from the distribution with p.d.f.
$$f(x; \theta_1, \theta_2) = \frac{1}{\theta_2}\, e^{-\frac{x - \theta_1}{\theta_2}}, \quad \theta_1 \le x < \infty, \ -\infty < \theta_1 < \infty, \ 0 < \theta_2 < \infty,$$
and zero elsewhere. Find the MLEs of $\theta_1$ and $\theta_2$.

4. Let $X_1, \ldots, X_n$ be a random sample from Gamma$(\alpha, \lambda)$. Find the method of moments estimators of $\alpha$ and $\lambda$ and also the MLEs of $\alpha$ and $\lambda$.

5. In question no. 1, find the method of moments estimators of the unknown parameters.
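A sketch for problem 2: the likelihood is $(2\sqrt{3}\sigma)^{-n}$ as long as every observation lies in $(\mu - \sqrt{3}\sigma, \mu + \sqrt{3}\sigma)$, so maximizing it shrinks the interval onto the sample range, giving $\hat{\mu} = (X_{(1)} + X_{(n)})/2$ and $\hat{\sigma} = (X_{(n)} - X_{(1)})/(2\sqrt{3})$. The seeded check below uses arbitrary illustrative true values.

```python
import math
import random

random.seed(4)
mu, sigma = 5.0, 2.0                 # true values (illustrative)
half = math.sqrt(3) * sigma          # support is (mu - half, mu + half)
n = 100_000

x = [random.uniform(mu - half, mu + half) for _ in range(n)]
lo, hi = min(x), max(x)

# Closed-form MLEs: smallest interval of the given shape containing the data
mu_hat = (lo + hi) / 2
sigma_hat = (hi - lo) / (2 * math.sqrt(3))

# The fitted support sits inside the true support, and both estimates are
# close to the truth for large n.
assert mu - half <= mu_hat - math.sqrt(3) * sigma_hat
assert abs(mu_hat - mu) < 0.01 and abs(sigma_hat - sigma) < 0.01
print(mu_hat, sigma_hat)
```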

Home Work: 12

1. Let $Y_1, \ldots, Y_n$ be the order statistics of a random sample from a distribution with p.d.f. $f(x; \theta) = 1$, for $\theta - \frac{1}{2} \le x \le \theta + \frac{1}{2}$, $-\infty < \theta < \infty$, and zero elsewhere. Show that every statistic $u(X_1, \ldots, X_n)$ such that
$$Y_n - \frac{1}{2} \le u(X_1, \ldots, X_n) \le Y_1 + \frac{1}{2}$$
is a maximum likelihood estimator of $\theta$. In particular, $(4Y_1 + 2Y_n + 1)/6$ and $(Y_1 + Y_n)/2$ are maximum likelihood estimators. Thus uniqueness is not, in general, a property of maximum likelihood estimators.

2. Let $X_1, X_2, X_3$ have the multinomial distribution with $n = 25$ and $k = 4$, with unknown parameters $\theta_1$, $\theta_2$ and $\theta_3$; i.e. the probability mass function of $(X_1, X_2, X_3)$ is
$$P(X_1 = x_1, X_2 = x_2, X_3 = x_3) = \frac{25!}{x_1!\,x_2!\,x_3!\,x_4!}\, \theta_1^{x_1} \theta_2^{x_2} \theta_3^{x_3} \theta_4^{x_4},$$
where $x_4 = 25 - x_1 - x_2 - x_3$ and $\theta_4 = 1 - \theta_1 - \theta_2 - \theta_3$. If the observed values of the random variables are $x_1 = 4$, $x_2 = 11$ and $x_3 = 7$, find the maximum likelihood estimates of $\theta_1$, $\theta_2$ and $\theta_3$.

3. Let $X_1, \ldots, X_n$ be i.i.d. $N(\mu, \sigma^2)$. Show that the sample mean and the sample variance are unbiased estimators of the population mean and population variance, respectively.

4. Let $X_1, \ldots, X_n$ be a random sample from $U(0, \theta)$. Find an unbiased estimator of $\theta$ based on $X_{(n)}$, the largest order statistic. Show that $X_{(n)}$ is a consistent estimator of $\theta$.

5. Let $X_n \sim \chi^2_n$. Find the limiting distribution of $(X_n - n)/\sqrt{2n}$.
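A seeded sketch for problem 5: using the fact that $\chi^2_n$ is Gamma(shape $n/2$, scale 2), the standardized variable $(X_n - n)/\sqrt{2n}$ should look standard normal for large $n$ (the CLT answer); the check compares the empirical CDF at a few points with $\Phi$.

```python
import math
import random

random.seed(5)
n, reps = 2_000, 50_000

# chi-square(n) is Gamma(shape = n/2, scale = 2); standardize each draw.
zs = [(random.gammavariate(n / 2, 2) - n) / math.sqrt(2 * n) for _ in range(reps)]

def phi(t: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

# Empirical CDF of the standardized statistic approaches the N(0,1) CDF.
for t in (-1.0, 0.0, 1.0):
    emp = sum(z <= t for z in zs) / reps
    assert abs(emp - phi(t)) < 0.015
print(round(sum(z <= 0 for z in zs) / reps, 3))
```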