GROUND RULES: This exam contains 6 questions. The questions are of equal weight. Print your name at the top of this page in the upper right hand corner. This exam is closed book and closed notes. Show all of your work and explain all of your reasoning! Translation: no work, no credit; insufficient explanation, no credit. If you are unsure whether you should explain a result or step in your derivation/proof, you probably should explain it. Do not talk with anyone else about this exam. You must work by yourself. No communication of any type with others.

1. Suppose the continuous random variable X has probability density function

f_X(x \mid \mu, \sigma) = \frac{1}{2\sigma} \exp\left( -\frac{|x - \mu|}{\sigma} \right), \quad x \in \mathbb{R},

where -\infty < \mu < \infty and \sigma > 0.

(a) Show that \{ f_X(x \mid \mu, \sigma);\ -\infty < \mu < \infty,\ \sigma > 0 \} is a location-scale family and identify the standard density.

(b) Does X have a pdf in the exponential family? Prove any claims you make.

(c) For r > 0, calculate P\left( \frac{|X - \mu|}{\sigma} > r \right) exactly. Compare this calculation with Markov's upper bound for this probability.
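A quick numerical companion to part (c), not part of the required derivation: for this double exponential density, |X - \mu|/\sigma is Exponential(1), so the exact tail probability is e^{-r}, while Markov's inequality applied to the mean-1 variable |X - \mu|/\sigma gives the bound 1/r. The helper names below are ours, not the exam's.

```python
import math

# For f(x) = exp(-|x - mu|/sigma) / (2*sigma), the standardized variable
# |X - mu|/sigma is Exponential(1), so its tail probability is exp(-r).
def exact_tail(r):
    return math.exp(-r)

# Markov's inequality on |X - mu|/sigma (which has mean 1) gives the bound 1/r.
def markov_bound(r):
    return 1.0 / r

# The exact tail decays exponentially; Markov's bound only decays like 1/r.
for r in [0.5, 1.0, 2.0, 5.0]:
    assert exact_tail(r) <= markov_bound(r)
```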

2. Suppose X has a geometric distribution; i.e., the probability mass function (pmf) of X is

f_X(x \mid p) = \begin{cases} (1 - p)^{x-1}\, p, & x = 1, 2, 3, \ldots, \\ 0, & \text{otherwise,} \end{cases}

where 0 < p < 1.

(a) Show that \{ f_X(x \mid p),\ 0 < p < 1 \} is a one-parameter exponential family.

(b) Derive the moment-generating function of X.

(c) For this part only, suppose that, conditional on P = p, the random variable X follows a geometric distribution. In other words, the first level of the corresponding hierarchical model is X \mid P = p \sim f_{X|P}(x \mid p), where the conditional pmf, now written f_{X|P}(x \mid p), is given above. The second level of the hierarchical model is P \sim \text{beta}(a, b), where a > 1 is known and b = 1. Derive the marginal pmf of X and find E(X) under this hierarchical model.

(d) Describe a real application where the hierarchical model in part (c) might be useful.
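As a numerical sanity check for part (b): the standard mgf of this geometric pmf is M(t) = p e^t / (1 - (1 - p)e^t) for t < -\log(1 - p), and it can be compared against a direct truncated series sum. The function names below are illustrative, not from the exam.

```python
import math

# Truncated series: M(t) = sum over x >= 1 of exp(t*x) * (1-p)**(x-1) * p.
def mgf_series(p, t, terms=5000):
    return sum(math.exp(t * x) * (1 - p) ** (x - 1) * p
               for x in range(1, terms))

# Closed form, valid when (1 - p) * exp(t) < 1, i.e. t < -log(1 - p).
def mgf_closed(p, t):
    q = (1 - p) * math.exp(t)
    return p * math.exp(t) / (1 - q)
```

For p = 0.3 and t = 0.1 (well inside the convergence region, since -\log(0.7) \approx 0.357), the two agree to high precision.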

3. Consider the random vector (X, Y) with joint probability density function

f_{X,Y}(x, y) = \begin{cases} c(1 + xy), & -1 < x < 1,\ -1 < y < 1, \\ 0, & \text{otherwise.} \end{cases}

(a) Show that c = 1/4.

(b) Calculate P(Y > X^2).

(c) Derive the marginal probability density functions of X and Y. Show that X and Y are not independent.

(d) Even though X and Y are not independent, it turns out that X^2 and Y^2 are independent! Prove this. Hint: Work with P(X^2 \le u, Y^2 \le v), the joint cdf of (X^2, Y^2).
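A numerical check of part (a): with c = 1/4, the density c(1 + xy) should integrate to 1 over the square (-1, 1) x (-1, 1). A midpoint Riemann sum is exact here up to rounding, since the integrand is bilinear. The helper below is ours, not the exam's.

```python
# Midpoint Riemann sum of c*(1 + x*y) over (-1, 1) x (-1, 1).
def integrate_density(c, n=400):
    h = 2.0 / n
    total = 0.0
    for i in range(n):
        x = -1 + (i + 0.5) * h
        for j in range(n):
            y = -1 + (j + 0.5) * h
            total += c * (1 + x * y) * h * h
    return total
```

Note the cross term xy integrates to zero by symmetry, so the total mass is 4c, forcing c = 1/4.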

4. Consider the random vector (X, Y) with joint probability density function

f_{X,Y}(x, y) = \begin{cases} x e^{-x(1+y)}, & x > 0,\ y > 0, \\ 0, & \text{otherwise.} \end{cases}

(a) Show that E(Y) does not exist but that E(Y \mid X = x) = 1/x.

(b) Find the variance of E(X \mid Y).
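A numerical companion to part (a): integrating the joint density over y gives the marginal f_X(x) = e^{-x}, so the conditional density of Y given X = x is x e^{-xy}, an Exponential with rate x, and E(Y | X = x) = 1/x. The sketch below checks this by numerically integrating y times the conditional density (function names are ours).

```python
import math

# Midpoint-rule approximation of E(Y | X = x) = integral of y * x*exp(-x*y) dy,
# truncated at ymax (the exponential tail beyond ymax is negligible for x >= 1).
def cond_mean(x, n=200000, ymax=200.0):
    h = ymax / n
    total = 0.0
    for i in range(n):
        y = (i + 0.5) * h
        total += y * x * math.exp(-x * y) * h
    return total
```

For x = 2 the approximation is close to 1/2, and for x = 1 it is close to 1, consistent with E(Y | X = x) = 1/x.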

5. Suppose X_1, X_2, and X_3 are random variables with joint probability density function (pdf)

f_{X_1, X_2, X_3}(x_1, x_2, x_3) = 6 e^{-x_1 - x_2 - x_3}\, I(0 < x_1 < x_2 < x_3 < \infty).

Define the random variables

U_1 = X_1, \quad U_2 = X_2 - X_1, \quad \text{and} \quad U_3 = X_3 - X_2.

(a) Derive the joint pdf of U = (U_1, U_2, U_3). Identify the marginal distributions of U_1, U_2, and U_3 separately.

(b) Find the mean and variance of W = 2U_1 + 3U_2 - U_3.
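A Monte Carlo sanity check (not a derivation): the joint pdf 6 e^{-x_1-x_2-x_3} on the ordered region 0 < x_1 < x_2 < x_3 is exactly the joint density of the order statistics of three iid Exponential(1) draws (the factor 6 = 3! accounts for the ordering), so (X_1, X_2, X_3) can be simulated by sorting three such draws and then forming the spacings U_1, U_2, U_3.

```python
import random

# Simulate (X1, X2, X3) as sorted iid Exponential(1) draws and estimate the
# means of the spacings U1 = X1, U2 = X2 - X1, U3 = X3 - X2.
random.seed(7)
n = 200000
sums = [0.0, 0.0, 0.0]
for _ in range(n):
    x1, x2, x3 = sorted(random.expovariate(1.0) for _ in range(3))
    for k, u in enumerate((x1, x2 - x1, x3 - x2)):
        sums[k] += u
means = [s / n for s in sums]
```

The estimated means come out near 1/3, 1/2, and 1, which is what Renyi's representation of exponential spacings predicts.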

6. Suppose X_1, X_2, \ldots, X_n are mutually independent b(m_i, p_i) random variables; i.e., the probability mass function of X_i is given by

f_{X_i}(x_i \mid p_i) = \binom{m_i}{x_i} p_i^{x_i} (1 - p_i)^{m_i - x_i}, \quad x_i = 0, 1, 2, \ldots, m_i,

for i = 1, 2, \ldots, n. The success probabilities 0 < p_i < 1 are unknown parameters. The numbers of trials m_i are fixed and known.

(a) If p_1 = p_2 = \cdots = p_n = p, derive the distribution of Z = \sum_{i=1}^{n} X_i. What can you say about the distribution of Z if the p_i's are different?

(b) Suppose the success probability associated with X_i is

p_i = \frac{\exp(a + b w_i)}{1 + \exp(a + b w_i)}, \quad i = 1, 2, \ldots, n,

where the w_i's are fixed constants and a \in \mathbb{R} and b \in \mathbb{R} are unknown parameters. Show that the pmf of X_i is a two-parameter exponential family.

(c) For this part only, suppose n = 2 and p_1 = p_2 = p. Derive the (conditional) probability mass function of X_1 given Z = X_1 + X_2 = r, where r > 0.
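A numerical check related to part (a): when two independent binomials share a common p, the convolution of their pmfs matches the b(m_1 + m_2, p) pmf. The sketch below verifies this directly for one choice of (m_1, m_2, p); the function names are ours, not the exam's.

```python
from math import comb

# Binomial pmf: P(X = x) for X ~ b(m, p).
def binom_pmf(m, p, x):
    return comb(m, x) * p**x * (1 - p) ** (m - x)

# Convolution: P(X1 + X2 = z) for independent X1 ~ b(m1, p), X2 ~ b(m2, p).
def conv_pmf(m1, m2, p, z):
    return sum(binom_pmf(m1, p, x) * binom_pmf(m2, p, z - x)
               for x in range(max(0, z - m2), min(m1, z) + 1))

# The convolution agrees term-by-term with the b(m1 + m2, p) pmf.
m1, m2, p = 4, 6, 0.3
for z in range(m1 + m2 + 1):
    assert abs(conv_pmf(m1, m2, p, z) - binom_pmf(m1 + m2, p, z)) < 1e-12
```

With unequal p_i's this identity fails, which is the point of the second half of part (a).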