Chapter 3 Questions


Question 3.1. Based on the nature of the values that each random variable can take, we have the following classifications:

X: discrete, since X is essentially count data
Y: continuous, since Y is measured data
M: continuous, since M is measured data
N: discrete, since N is count data
P: discrete, since P is count data
Q: continuous, since Q is measured data

Question 3.3. The results can be summarized in the table below:

Sample point   HHH     HHT     HTH     HTT      THH     THT      TTH      TTT
w              3-0=3   2-1=1   2-1=1   1-2=-1   2-1=1   1-2=-1   1-2=-1   0-3=-3

Question 3.5. The principal rule for solving this question is that the probabilities of all sample points within the sample space must add up to 1.

Part (a): We require the sum over x = 0, 1, 2, 3 of c(x^2 + 4) to equal 1. So:
1 = c × [(0 + 4) + (1 + 4) + (4 + 4) + (9 + 4)] = 30c, and therefore c = 1/30.

Part (b): In a similar manner:
c × [C(2,0)C(3,3) + C(2,1)C(3,2) + C(2,2)C(3,1)] = c × (1 + 6 + 3) = 10c = 1,
so c = 1/10.
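
The two constants in Question 3.5 can be double-checked numerically. The short Python sketch below (assuming Python 3.8+ for math.comb) reproduces both sums with exact fractions:

from fractions import Fraction
from math import comb

# Question 3.5(a): f(x) = c*(x**2 + 4) for x = 0, 1, 2, 3 must sum to 1.
total_a = sum(x**2 + 4 for x in range(4))   # 4 + 5 + 8 + 13 = 30
print(Fraction(1, total_a))                 # 1/30

# Question 3.5(b): f(x) = c*C(2, x)*C(3, 3 - x) for x = 0, 1, 2 must sum to 1.
total_b = sum(comb(2, x) * comb(3, 3 - x) for x in range(3))   # 1 + 6 + 3 = 10
print(Fraction(1, total_b))                 # 1/10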

Question 3.8. Here P(H) = 2/3 and P(T) = 1/3. We first compute the probability associated with each sample point, then add up the corresponding sample points to obtain the probability associated with each value of W.

P(W = -3) = P(TTT) = 1/3 × 1/3 × 1/3 = 1/27
P(W = -1) = P(HTT) + P(THT) + P(TTH) = 3 × (2/3 × 1/3 × 1/3) = 2/9
P(W = 1) = P(HHT) + P(HTH) + P(THH) = 3 × (2/3 × 2/3 × 1/3) = 4/9
P(W = 3) = P(HHH) = 2/3 × 2/3 × 2/3 = 8/27

Verify that they sum to one: 1/27 + 2/9 + 4/9 + 8/27 = 1. Thus the above is a valid probability distribution for the discrete random variable W.

Question 3.9.

Part (a): Integrate the density function over this range of X to obtain the area under it:
P(0 < X < 1) = ∫_0^1 2(x + 2)/5 dx = [(x^2 + 4x)/5]_0^1 = (1 + 4)/5 = 1

Part (b): Again integrate the density function to compute the probability:
P(1/4 < X < 1/2) = ∫_{1/4}^{1/2} 2(x + 2)/5 dx = [(x^2 + 4x)/5]_{1/4}^{1/2} = (1/4 + 2)/5 - (1/16 + 1)/5 = 19/80
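
The distribution of W in Question 3.8 can also be cross-checked by enumerating the eight outcomes directly; a minimal Python sketch, assuming P(H) = 2/3 and P(T) = 1/3 as above:

from fractions import Fraction
from itertools import product

# Biased coin: heads twice as likely as tails.
p = {'H': Fraction(2, 3), 'T': Fraction(1, 3)}

dist = {}
for outcome in product('HT', repeat=3):
    w = outcome.count('H') - outcome.count('T')   # heads minus tails
    prob = Fraction(1)
    for side in outcome:
        prob *= p[side]
    dist[w] = dist.get(w, 0) + prob

# W takes values -3, -1, 1, 3 with probabilities 1/27, 2/9, 4/9, 8/27.
for w in sorted(dist):
    print(w, dist[w])
print(sum(dist.values()))   # 1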

Question. First compute probability distribution of discrete random variable X: Note that the values X could take on are:,,. P(X=) = (5 ) P(X=) = (5 )( ) P(X=) = (5 )( ) ( 7 = /5 = /7 ) ( 7 ) = (X)/5 = 4/7 ( 7 ) = (5X)/5 = /7 Note that they sum up to, thus the above distribution is a valid probability distribution. The probability histogram is as following: Question. To obtain the cumulative distribution function, we just need to performing a cumulative running sum of the probability mass function on X. Thus, the cumulative distribution function is shown as following: X x < x < x < x < x < 4 x 4 F(x).4.4+.7.78+.6=.94.94+.5=.99.99+. = =.78

Question 3.14.

Part (a): We just need to plug the value of x into F(x). With 12 minutes = 0.2 hours, we have:
P(X < 0.2) = F(0.2) = 1 - e^(-8 × 0.2) = 1 - e^(-1.6) = 0.7981

Part (b): First compute the probability density function:
f(x) = F'(x) = 8e^(-8x)
Then compute the probability by integrating the density function:
P = ∫_0^{0.2} f(x) dx = ∫_0^{0.2} 8e^(-8x) dx = [-e^(-8x)]_0^{0.2} = 1 - e^(-1.6) = 0.7981

Question 3.15. First, find the cumulative distribution function F(x). From Question 3.11:
P(X = 0) = 2/7, P(X = 1) = 4/7, P(X = 2) = 1/7
The cumulative distribution function is therefore:

F(x) = 0                  for x < 0
F(x) = 2/7                for 0 <= x < 1
F(x) = 2/7 + 4/7 = 6/7    for 1 <= x < 2
F(x) = 6/7 + 1/7 = 1      for x >= 2

Then use F(x) for the asked quantities.

Part (a): P(X = 1) = F(1) - F(0) = 6/7 - 2/7 = 4/7 (since there is no probability mass within (0, 1))

Part (b): P(0 < X <= 2) = P(X <= 2) - P(X <= 0) = F(2) - F(0) = 1 - 2/7 = 5/7
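
A short SymPy sketch (assuming SymPy is available) confirming that differentiating F(x) gives the density used in part (b) and that both routes in Question 3.14 yield the same probability:

import sympy as sp

x = sp.symbols('x', positive=True)
F = 1 - sp.exp(-8 * x)            # cumulative distribution function for x >= 0
f = sp.diff(F, x)                 # density: 8*exp(-8*x)

p_cdf = F.subs(x, sp.Rational(1, 5))                    # F(0.2)
p_pdf = sp.integrate(f, (x, 0, sp.Rational(1, 5)))      # same probability via the density
print(sp.simplify(p_cdf - p_pdf))                       # 0, so the two routes agree
print(float(p_cdf))                                     # approximately 0.7981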

Question 3.16. The graph of the cumulative distribution function F(x) from Question 3.15 is a step function: it equals 0 for x < 0 and jumps by 2/7, 4/7, and 1/7 at x = 0, 1, and 2, reaching 1 for x >= 2.

Question 3.29.

Part (a): For this density function to be valid, it has to satisfy two conditions:
1. f(x) >= 0 for all x
2. ∫_{-∞}^{+∞} f(x) dx = 1
Condition 1 is easily verified, since f(x) = 3x^(-4) is never negative on x > 1 and f(x) = 0 elsewhere. For condition 2:
∫_1^∞ 3x^(-4) dx = [-x^(-3)]_1^∞ = 1
Thus this density function is indeed valid.

Part (b): F(x) is computed by integrating f(x). For x < 1, f(x) = 0, so F(x) = 0. For x >= 1:
F(x) = ∫_1^x f(a) da = ∫_1^x 3a^(-4) da = [-a^(-3)]_1^x = 1 - x^(-3)

Part (c): The probability can be computed using the complement:
P(X > 4) = 1 - F(4) = (1/4)^3 = 1/64
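
The three parts of Question 3.29 can be checked symbolically; a small SymPy sketch:

import sympy as sp

x, t = sp.symbols('x t', positive=True)
f = 3 * t**-4                            # density for t > 1, zero elsewhere

print(sp.integrate(f, (t, 1, sp.oo)))    # 1, so f is a valid density
F = sp.integrate(f, (t, 1, x))           # CDF for x >= 1
print(sp.simplify(F))                    # 1 - 1/x**3
print(1 - F.subs(x, 4))                  # 1/64, i.e. P(X > 4)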

Question 3.36. Since the questions ask about probabilities over certain ranges, compute the cumulative distribution function first. The density is f(x) = 2(1 - x) for 0 < x < 1 (and 0 elsewhere), so for 0 <= x <= 1:

F(x) = ∫_0^x f(a) da = ∫_0^x 2(1 - a) da = [2a - a^2]_0^x = 2x - x^2

If x < 0, trivially F(x) = 0; if x > 1, F(x) = 1.

Part (a): Use the cumulative distribution function:
P(X <= 1/3) = F(1/3) = 2(1/3) - (1/3)^2 = 5/9

Part (b): Use the complement together with the cumulative distribution function:
P(X > 0.5) = 1 - P(X <= 0.5) = 1 - F(0.5) = 1 - (2(0.5) - 0.5^2) = 1 - 0.75 = 0.25

Part (c):
P(X < 0.75 | X >= 0.5) = P(0.5 <= X < 0.75) / P(X >= 0.5) = (F(0.75) - F(0.5)) / (1 - F(0.5))
= (0.9375 - 0.75) / 0.25 = 0.1875 / 0.25 = 0.75
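
A SymPy sketch that reproduces parts (a) through (c) of Question 3.36 from this cumulative distribution function:

import sympy as sp

x, a = sp.symbols('x a')
f = 2 * (1 - a)                           # density on (0, 1)
F = sp.integrate(f, (a, 0, x))            # CDF on [0, 1]: 2*x - x**2

p_a = F.subs(x, sp.Rational(1, 3))                                 # P(X <= 1/3)
p_b = 1 - F.subs(x, sp.Rational(1, 2))                             # P(X > 1/2)
p_c = (F.subs(x, sp.Rational(3, 4)) - F.subs(x, sp.Rational(1, 2))) / p_b
print(p_a, p_b, p_c)                                               # 5/9 1/4 3/4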

STA86 Problem Set Solutions

Problem 3.38. The joint probability distribution is f(x, y) = (x + y)/30 for x = 0, 1, 2, 3 and y = 0, 1, 2.

Part (a):
P(X <= 2, Y = 1) = P(X = 0, Y = 1) + P(X = 1, Y = 1) + P(X = 2, Y = 1)
= f(0, 1) + f(1, 1) + f(2, 1) = [(0 + 1) + (1 + 1) + (2 + 1)]/30 = 6/30 = 1/5

Part (b):
P(X > 2, Y <= 1) = P(X = 3, Y = 0) + P(X = 3, Y = 1)
= f(3, 0) + f(3, 1) = [(3 + 0) + (3 + 1)]/30 = 7/30

Part (c):
P(X > Y) = f(1, 0) + f(2, 0) + f(3, 0) + f(2, 1) + f(3, 1) + f(3, 2)
= [(1 + 0) + (2 + 0) + (3 + 0) + (2 + 1) + (3 + 1) + (3 + 2)]/30 = 18/30 = 3/5

Part (d):
P(X + Y = 4) = f(2, 2) + f(3, 1) = [(2 + 2) + (3 + 1)]/30 = 8/30 = 4/15
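
These four probabilities can be cross-checked by summing the joint probabilities directly; the Python sketch below uses the joint distribution f(x, y) = (x + y)/30 stated above:

from fractions import Fraction

f = {(x, y): Fraction(x + y, 30) for x in range(4) for y in range(3)}

print(sum(f.values()))                                          # 1
print(sum(p for (x, y), p in f.items() if x <= 2 and y == 1))   # 1/5
print(sum(p for (x, y), p in f.items() if x > 2 and y <= 1))    # 7/30
print(sum(p for (x, y), p in f.items() if x > y))               # 3/5
print(sum(p for (x, y), p in f.items() if x + y == 4))          # 4/15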

Problem.4 Integrate the joint density function over all values of Y to get marginal density of X: g(x) = f(x, y)dy = (x + y)dy = [xy + y ] = x + Integrate the joint density function over all values of X to get marginal density of Y: h(y) = f(x, y)dy = (x + y)dx = [x + xy] Part (c) = 4y + This event corresponds to the condition on random variable: X.5. As only the drive-through facility is in the picture of the problem, we only need to consider the marginal density function g(x). Integrate over the density function to get the desired event probability:.5 p(x.5) = g(x)dx.5 = x + dx

Problem 3.56. The joint density is f(x, y) = 6x for 0 < x < 1, 0 < y < 1 - x, and 0 elsewhere.

Part (a): Note that the range of Y is defined in terms of x, which leads to the following observation for the fixed value y = 0.5:
Case 1: x = 0.3. Then y < 1 - x, so f(0.3, 0.5) = 6 × 0.3 = 1.8. (Since this is a density, a value greater than 1 is possible.)
Case 2: x = 0.7. Then y > 1 - x, so f(0.7, 0.5) = 0.
Thus f(0.3, 0.5) ≠ f(0.7, 0.5): whether the density at y = 0.5 is zero or not depends on x, so f(x, y) cannot factor into a function of x times a function of y. Therefore X and Y are not independent.

Part (b): For y = 0.5 the joint density is zero when x >= 0.5 (the condition y < 1 - x fails), so only 0 < x < 0.5 contributes. The conditional density of X given Y = 0.5 is
f(x | y = 0.5) = f(x, 0.5)/h(0.5), where h(y) = ∫_0^{1-y} 6x dx = 3(1 - y)^2 is the marginal density of Y,
so h(0.5) = 3/4 and f(x | y = 0.5) = 6x/(3/4) = 8x for 0 < x < 0.5. Therefore
P(X > 0.3 | Y = 0.5) = ∫_{0.3}^{0.5} 8x dx = [4x^2]_{0.3}^{0.5} = 1 - 0.36 = 0.64
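
A SymPy sketch of the conditional density and the resulting probability:

import sympy as sp

x, y = sp.symbols('x y')
f = 6 * x                                        # joint density on 0 < x < 1, 0 < y < 1 - x

h = sp.integrate(f, (x, 0, 1 - y))               # marginal of Y: 3*(1 - y)**2
f_cond = (f / h).subs(y, sp.Rational(1, 2))      # conditional density of X given Y = 1/2
print(sp.simplify(f_cond))                       # 8*x on (0, 1/2)
print(sp.integrate(f_cond, (x, sp.Rational(3, 10), sp.Rational(1, 2))))   # 16/25, i.e. 0.64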

Problem.57 The requirement for probability density function to be valid is that it would be integrated to over all domains. f(x, y, z)dxdydz = kxy zdxdydz = ky z dydz = kz 6 dz Thus, k =. = k = (by definition) Integrate the corresponding region with respect to each of the variable, we have: p(x < 4, Y >, < Z < ) 4 = f(x, y, z)dxdydz 4 = kxy zdxdydz = ky z dydz = 7kz 768 dz = [ 7kz 56 ]

= [7kz^2/1536]_1^2 = 21k/1536 = 63/1536 = 21/512 (using k = 3)
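
Both the normalizing constant and the final probability can be verified with a symbolic triple integral (a SymPy sketch):

import sympy as sp

x, y, z, k = sp.symbols('x y z k', positive=True)
f = k * x * y**2 * z                       # joint density on 0 < x, y < 1 and 0 < z < 2

total = sp.integrate(f, (x, 0, 1), (y, 0, 1), (z, 0, 2))
k_val = sp.solve(sp.Eq(total, 1), k)[0]
print(k_val)                               # 3

p = sp.integrate(f.subs(k, k_val),
                 (x, 0, sp.Rational(1, 4)),
                 (y, sp.Rational(1, 2), 1),
                 (z, 1, 2))
print(p)                                   # 21/512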