Homework 9 for BST 631: Statistical Theory I (Problems, 11/02/2006)


Due Time: 5:00PM Thursday, 11/09/2006

Problem 1 (8 points) Book problem 4.15
Let U = X + Y and V = X; then the joint pmf of (U, V) is
  f(u, v) = (θ^v e^(−θ) / v!) (λ^(u−v) e^(−λ) / (u−v)!)   (v = 0, 1, 2, …; u = v, v+1, …).
Since U = X + Y ~ Poisson(θ + λ), the marginal pmf of U is f(u) = (θ+λ)^u e^(−(θ+λ)) / u!. Then
  f(x | x+y) = f(u, v) / f(u) = C(x+y, x) (θ/(θ+λ))^x (λ/(θ+λ))^y,
which is Binomial(x+y, θ/(θ+λ)). Similarly, you can get
  f(y | x+y) = C(x+y, y) (λ/(θ+λ))^y (θ/(θ+λ))^x,
which is Binomial(x+y, λ/(θ+λ)).

Problem 2 (15 points) Book problem 4.16
(a) The support set of (U, V) is {(u, v) : u = 1, 2, …; v = 0, ±1, ±2, …}.
For (u, v) with v > 0,
  f(u, v) = P(U = u, V = v) = P(Y = u, X = u + v) = p(1−p)^(u+v−1) p(1−p)^(u−1) = p²(1−p)^(2u+v−2).
For (u, v) with v = 0,
  f(u, 0) = P(X = u, Y = u) = p²(1−p)^(2u−2).
For (u, v) with v < 0,
  f(u, v) = P(X = u, Y = u − v) = p²(1−p)^(2u−v−2).
Thus
  f(u, v) = p²(1−p)^(2(u−1)) (1−p)^|v|   (u = 1, 2, …; v = 0, ±1, ±2, …),
which factors into a function of u alone times a function of v alone. Therefore, U and V are independent.
(b) Let Z = X/(X + Y) and V = X; then X = V and Y = V/Z − V. The joint pmf of (Z, V) is
  f_{Z,V}(z, v) = P(Z = z, V = v) = P(X = v, Y = v/z − v) = p(1−p)^(v−1) p(1−p)^(v/z−v−1) = p²(1−p)^(v/z−2)
for v = 1, 2, … and z taken from all irreducible proper fractions. Thus the marginal pmf of Z is

  f_Z(z) = Σ_v p²(1−p)^(v/z−2) = p²(1−p)^(1/z−2) / (1 − (1−p)^(1/z)),
where the support set of Z is all irreducible proper fractions.
(c) Let U = X, V = X + Y; then the joint pmf of (U, V) is
  f_{U,V}(u, v) = P(X = u, Y = v − u) = p(1−p)^(u−1) p(1−p)^(v−u−1) = p²(1−p)^(v−2) I_{1,2,…}(u) I_{u+1,u+2,…}(v);
the support set of (U, V) is u = 1, 2, …; v = u+1, u+2, ….

Problem 3 (10 points) Book problem 4.20
(a) Let A0 = {(x1, x2) : x2 = 0}, A1 = {(x1, x2) : x2 > 0}, A2 = {(x1, x2) : x2 < 0}; then P(A0) = 0, and on A1 and A2 the transformation is one-to-one.
On A1: x1 = y2 √y1, x2 = √(y1(1 − y2²)), J = −1/(2√(1 − y2²));
On A2: x1 = y2 √y1, x2 = −√(y1(1 − y2²)), J = 1/(2√(1 − y2²)).
Therefore, the joint pdf of (Y1, Y2) is
  f(y1, y2) = (1/(2πσ²)) (1 − y2²)^(−1/2) exp(−y1/(2σ²))   (0 < y1 < ∞, −1 < y2 < 1).
(b) Y1 and Y2 are functions of the distance and the angle, respectively; since the joint pdf above factors into a function of y1 alone times a function of y2 alone, Y1 and Y2 are independent, which means the distance and the angle are independent.

Problem 4 (10 points) Book problem 4.24
The transformation is one-to-one from (0, ∞) × (0, ∞) to (0, ∞) × (0, 1), with x = z1 z2, y = z1(1 − z2) and |J| = z1. The joint pdf of Z1 and Z2 is
  f_{Z1,Z2}(z1, z2) = [(z1 z2)^(r−1) e^(−z1 z2) / Γ(r)] [(z1(1 − z2))^(s−1) e^(−z1(1−z2)) / Γ(s)] z1
          = [z1^(r+s−1) e^(−z1) / (Γ(r)Γ(s))] z2^(r−1) (1 − z2)^(s−1).
Thus, Z1 and Z2 are independent.
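The Problem 4 factorization (Z1 = X + Y and Z2 = X/(X + Y) independent, with Z1 ~ gamma(r+s, 1) and Z2 ~ beta(r, s)) is easy to sanity-check by simulation. This is a minimal sketch, not part of the book solution; NumPy, the seed, and the choices r = 2, s = 3 are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
r, s, n = 2.0, 3.0, 200_000

x = rng.gamma(r, 1.0, size=n)  # X ~ gamma(r, 1)
y = rng.gamma(s, 1.0, size=n)  # Y ~ gamma(s, 1)
z1, z2 = x + y, x / (x + y)    # the transformation of Problem 4

# Z1 ~ gamma(r+s, 1): mean and variance both r + s = 5.
print(z1.mean(), z1.var())
# Z2 ~ beta(r, s): mean r/(r+s) = 0.4.
print(z2.mean())
# Independence implies zero correlation (a necessary check only).
print(np.corrcoef(z1, z2)[0, 1])
```

The correlation check is of course only necessary, not sufficient, for independence; it simply makes a gross error in the Jacobian computation easy to spot.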

Problem 5 (12 points) Book problem 4.26
For z > 0 (here W = 0 when Z = Y and W = 1 when Z = X),
  P(Z ≤ z, W = 0) = P(Z = min(X, Y) ≤ z, Z = Y) = P(Y ≤ z, Y ≤ X)
    = ∫₀^z ∫_y^∞ (1/(λμ)) e^(−x/λ) e^(−y/μ) dx dy = (λ/(λ+μ)) (1 − e^(−(1/λ+1/μ)z)).
Similarly, we have
  P(Z ≤ z, W = 1) = (μ/(λ+μ)) (1 − e^(−(1/λ+1/μ)z)).
In addition,
  P(W = 0) = P(Y ≤ X) = λ/(λ+μ) = 1 − P(W = 1)
and
  P(Z ≤ z) = P(Z ≤ z, W = 0) + P(Z ≤ z, W = 1) = 1 − e^(−(1/λ+1/μ)z).
Thus
  P(Z ≤ z | W = 0) = P(Z ≤ z, W = 0) / P(W = 0) = P(Z ≤ z),
  P(Z ≤ z | W = 1) = P(Z ≤ z, W = 1) / P(W = 1) = P(Z ≤ z),
so Z and W are independent.

Problem 6 (15 points) Book problem 4.31
(a) E(Y) = E(E(Y | X)) = E(nX) = n/2.
  Var(Y) = E(Var(Y | X)) + Var(E(Y | X)) = E(nX(1 − X)) + Var(nX)
    = nE(X) − nE(X²) + n²Var(X) = n/2 − n/3 + n²/12 = n/6 + n²/12.
(b) f_{Y,X}(y, x) = P(Y = y | X = x) f_X(x) = C(n, y) x^y (1 − x)^(n−y)   (y = 0, 1, …, n; 0 < x < 1).
(c) f_Y(y) = ∫₀¹ C(n, y) x^y (1 − x)^(n−y) dx = C(n, y) Γ(y+1) Γ(n−y+1) / Γ(n+2) = 1/(n+1)   (y = 0, 1, …, n).

Problem 7 (15 points) (2000 Qualifying Exam problem)
(1) Let X ~ U(0, 1) and let Y = −a ln(X). Find the pdf for the random variable Y.
(2) Let the random variables X_i ~ f_{X_i}(x_i) = (1/λ_i) e^(−x_i/λ_i) for 0 ≤ x_i < ∞ (i = 1, 2) be independent.
  (i) Find the moment generating function for each random variable.

  (ii) Find the distribution function for Y = X1 + X2.
(3) Let X ~ f_X(x | n, p) = C(n, x) p^x (1 − p)^(n−x) (x = 0, 1, …, n) and let
  p ~ g(p | α, β) = p^(α−1) (1 − p)^(β−1) / B(α, β), 0 ≤ p ≤ 1, α > 0, β > 0,
where B(α, β) = Γ(α) Γ(β) / Γ(α + β). Find E(X).

Solutions:
(1) The pdf of Y is f_Y(y) = (1/a) exp(−y/a) I_(0,∞)(y).
(2) (i) M_{X_i}(t) = 1/(1 − λ_i t)   (t < 1/λ_i).
  (ii) Let Y = X1 + X2, Z = X2; then
    f_{Y,Z}(y, z) = (1/(λ1 λ2)) exp(−(y − z)/λ1) exp(−z/λ2)   (0 < z < y < ∞).
  Thus,
    f_Y(y) = ∫₀^y f_{Y,Z}(y, z) dz = (1/(λ1 − λ2)) [exp(−y/λ1) − exp(−y/λ2)]   (λ1 ≠ λ2).
  For λ1 = λ2 = λ, f_Y(y) = (y/λ²) exp(−y/λ), which is also equal to
    lim_{λ2 → λ1} (1/(λ1 − λ2)) [exp(−y/λ1) − exp(−y/λ2)].
(3) E(X) = E(E(X | P)) = E(nP) = nα/(α + β).

Problem 8 (15 points) (October Qualifying Exam Problem 3)
Suppose that X is the concentration (in parts per million) of a certain airborne pollutant, and suppose that the random variable Y = ln X has a distribution that can be adequately modeled by the density function
  f_Y(y) = e^(−|y−β|/α) / (2α),   −∞ < y < ∞, −∞ < β < ∞, 0 < α < ∞.
(i) Find an explicit expression for F_Y(y), the cumulative distribution function (CDF) associated with the density function f_Y(y). If α = 1 and β = 2, use this CDF to find the exact numerical value of P(X > 4 | X > 2).
(ii) For the density function f_Y(y) given above, derive an explicit expression for a generating

function ψ(t) that can be used to generate the absolute central moments μ_r = E(|Y − E(Y)|^r) for r a non-negative integer, and then use ψ(t) directly to find Var(Y), the variance of Y.
(iii) Let U_i be independently and identically distributed as Uniform(0, 1) for i = 1, 2. Let X be defined as X = U1 + U2. Show that the probability density function for the random variable X is the following function:
  f(x) = x, 0 ≤ x < 1;   f(x) = 2 − x, 1 ≤ x ≤ 2.

Solutions:
(i) F_Y(y) = (1/2) exp((y − β)/α) for −∞ < y < β; F_Y(y) = 1 − (1/2) exp(−(y − β)/α) for y ≥ β.
  P(X > 4 | X > 2) = P(X > 4) / P(X > 2) = P(Y > ln(4)) / P(Y > ln(2))
    = (1 − F_Y(ln(4))) / (1 − F_Y(ln(2))) = (1 − 2e^(−2)) / (1 − e^(−2)) ≈ 0.84.
(ii) E(Y) = β, and consider
  ψ(t) = E(exp(|Y − E(Y)| t)) = E(exp(|Y − β| t)) = 1/(1 − αt)   (t < 1/α).
Then
  Var(Y) = E(|Y − β|²) = (d²/dt²) [1/(1 − αt)] |_{t=0} = 2α².
(iii) Let Y = U1; then the joint pdf of (X, Y) is f(x, y) = 1 (0 < y < 1, y < x < y + 1). From this, it is easy to get the marginal pdf of X: for 0 ≤ x < 1, f(x) = ∫₀^x dy = x; for 1 ≤ x ≤ 2, f(x) = ∫_{x−1}¹ dy = 2 − x.
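The triangular density for X = U1 + U2 derived in the last part of Problem 8 can be cross-checked by simulation: integrating f(x) gives the CDF F(x) = x²/2 for 0 ≤ x < 1 and F(x) = 1 − (2 − x)²/2 for 1 ≤ x ≤ 2, which should match the empirical CDF of simulated sums. A quick sketch, not part of the exam solution; NumPy and the evaluation points are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.uniform(size=n) + rng.uniform(size=n)  # X = U1 + U2

def F(t):
    # CDF from integrating f(x) = x on [0, 1) and f(x) = 2 - x on [1, 2]
    t = np.asarray(t, dtype=float)
    return np.where(t < 1.0, 0.5 * t**2, 1.0 - 0.5 * (2.0 - t)**2)

for t in (0.5, 1.0, 1.5):
    print(t, (x <= t).mean(), float(F(t)))  # empirical CDF vs. theoretical CDF
```

At t = 0.5, 1.0, 1.5 the theoretical values are 0.125, 0.5, 0.875, and the empirical frequencies should agree to within Monte Carlo error.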