Conditional Probability

Conditional Probability

The Law of Total Probability. Let $A_1, A_2, \ldots, A_k$ be mutually exclusive and exhaustive events. Then for any other event $B$,
$$P(B) = P(B \mid A_1)P(A_1) + P(B \mid A_2)P(A_2) + \cdots + P(B \mid A_k)P(A_k) = \sum_{i=1}^{k} P(B \mid A_i)\,P(A_i),$$
where exhaustive means $A_1 \cup A_2 \cup \cdots \cup A_k = S$.

Bayes' Theorem. Let $A_1, A_2, \ldots, A_k$ be a collection of $k$ mutually exclusive and exhaustive events with prior probabilities $P(A_i)$, $i = 1, 2, \ldots, k$. Then for any other event $B$ with $P(B) > 0$, the posterior probability of $A_j$ given that $B$ has occurred is
$$P(A_j \mid B) = \frac{P(A_j \cap B)}{P(B)} = \frac{P(B \mid A_j)\,P(A_j)}{\sum_{i=1}^{k} P(B \mid A_i)\,P(A_i)}, \qquad j = 1, 2, \ldots, k.$$
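To see how the Law of Total Probability and Bayes' theorem fit together numerically, here is a minimal Python sketch (not from the slides); the prior and likelihood values are made-up illustrative numbers for a three-event partition.

```python
# Minimal sketch (not from the slides): Law of Total Probability and Bayes' theorem
# for a partition A_1, A_2, A_3; the numbers below are assumed purely for illustration.
prior = [0.5, 0.3, 0.2]          # P(A_i); must sum to 1
likelihood = [0.10, 0.05, 0.20]  # P(B | A_i)

# Law of Total Probability: P(B) = sum_i P(B | A_i) P(A_i)
p_b = sum(l * p for l, p in zip(likelihood, prior))

# Bayes' theorem: P(A_j | B) = P(B | A_j) P(A_j) / P(B)
posterior = [l * p / p_b for l, p in zip(likelihood, prior)]

print(p_b)        # 0.105
print(posterior)  # posterior probabilities; they sum to 1
```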

Independence

Definition. Two events $A$ and $B$ are independent if $P(A \mid B) = P(A)$, and are dependent otherwise.

The Multiplication Rule for Independent Events. Proposition: Events $A$ and $B$ are independent if and only if
$$P(A \cap B) = P(A) \cdot P(B).$$
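As a small illustration (not from the slides), the multiplication rule can be checked by direct enumeration on a fair six-sided die; the events A = "even" and B = "at most 4" are assumed here purely for illustration.

```python
# Minimal sketch (not from the slides): checking P(A ∩ B) = P(A)P(B) by enumeration
# on a fair six-sided die, with assumed events A = "even" and B = "at most 4".
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}
B = {1, 2, 3, 4}

def prob(event):
    # Equally likely outcomes: P(E) = #E / #S
    return Fraction(len(event), len(S))

print(prob(A & B))        # 1/3
print(prob(A) * prob(B))  # 1/3, so these particular A and B are independent
```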

Independence of More Than Two Events. Definition: Events $A_1, A_2, \ldots, A_n$ are mutually independent if for every $k$ ($k = 2, 3, \ldots, n$) and every subset of indices $i_1, i_2, \ldots, i_k$,
$$P(A_{i_1} \cap A_{i_2} \cap \cdots \cap A_{i_k}) = P(A_{i_1}) \cdot P(A_{i_2}) \cdots P(A_{i_k}).$$

Random Variables

Definition. For a given sample space $S$ of some experiment, a random variable (rv) is any rule that associates a number with each outcome in $S$. In mathematical language, a random variable is a function whose domain is the sample space and whose range is the set of real numbers.

We use uppercase letters, such as $X$ and $Y$, to denote random variables and lowercase letters, such as $x$ and $y$, to denote particular values of the corresponding random variables. For example, $X(s) = x$ means that the value $x$ is associated with the outcome $s$ by the rv $X$.

Examples:

1. Assume we toss a coin. Then $S = \{H, T\}$. We can define a rv $X$ by $X(H) = 1$ and $X(T) = 0$.

2. A technician is going to check the quality of 10 products. For each product the outcome is either successful (S) or defective (D). Then we can define a rv $Y$ by
$$Y = \begin{cases} 1, & \text{successful} \\ 0, & \text{defective} \end{cases}$$

Definition. Any random variable whose only possible values are 0 and 1 is called a Bernoulli random variable.
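A Bernoulli random variable is also easy to simulate; the following sketch (not from the slides) assumes a success probability p = 0.7 purely for illustration.

```python
# Minimal sketch (not from the slides): simulating a Bernoulli random variable
# with an assumed success probability p = 0.7.
import random

def bernoulli(p):
    # Returns 1 ("success") with probability p, else 0 ("failure").
    return 1 if random.random() < p else 0

draws = [bernoulli(0.7) for _ in range(10_000)]
print(sum(draws) / len(draws))  # empirical frequency, close to 0.7 for many draws
```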

More examples:

3. (Example 3.3) We are investigating two gas stations. Each has six gas pumps. Consider the experiment in which the number of pumps in use at a particular time of day is determined for each of the stations. Define rv's $X$, $Y$ and $U$ by
X = the total number of pumps in use at the two stations
Y = the difference between the number of pumps in use at station 1 and the number in use at station 2
U = the maximum of the numbers of pumps in use at the two stations

If this experiment is performed and $s = (3, 4)$ results, then $X((3, 4)) = 3 + 4 = 7$, so we say that the observed value of $X$ was $x = 7$. Similarly, the observed value of $Y$ would be $y = 3 - 4 = -1$, and the observed value of $U$ would be $u = \max(3, 4) = 4$.

More examples:

4. Assume we toss a coin until we get a Head. Then the sample space would be $S = \{H, TH, TTH, TTTH, \ldots\}$. If we define a rv $X$ by
X = the total number of tosses
then $X(\{H\}) = 1$, $X(\{TH\}) = 2$, $X(\{TTH\}) = 3$, and so on. In this case, the random variable $X$ can be any positive integer, so its set of possible values is infinite.

5. Assume we are going to measure the length of 100 desks. Define the rv $Y$ by
Y = the length of a particular desk
$Y$ can also assume infinitely many possible values.

Definition. A discrete random variable is an rv whose possible values either constitute a finite set or else can be listed in an infinite sequence in which there is a first element, a second element, and so on (a countably infinite set).

A random variable is continuous if both of the following apply:

1. Its set of possible values consists either of all numbers in a single interval on the number line (possibly infinite in extent, e.g., $(-\infty, \infty)$) or all numbers in a disjoint union of such intervals (e.g., $[0, 10] \cup [20, 30]$).

2. No possible value of the variable has positive probability, that is, $P(X = c) = 0$ for any possible value $c$.

An example: Assume we toss a coin 3 times and record the outcomes. Let $X_i$ be a random variable defined by
$$X_i = \begin{cases} 1, & \text{if the } i\text{th outcome is Head} \\ 0, & \text{if the } i\text{th outcome is Tail} \end{cases}$$
Let $X$ be the random variable $X = X_1 + X_2 + X_3$; then $X$ represents the total number of Heads we get from the experiment.

If the probability of getting a Head on each toss is 0.7, then the probabilities for all the outcomes are tabulated as follows:

s:    HHH   HHT   HTH   HTT   THH   THT   TTH   TTT
x:      3     2     2     1     2     1     1     0
p(x): 0.343 0.147 0.147 0.063 0.147 0.063 0.063 0.027

Example continued:

s:    HHH   HHT   HTH   HTT   THH   THT   TTH   TTT
x:      3     2     2     1     2     1     1     0
p(x): 0.343 0.147 0.147 0.063 0.147 0.063 0.063 0.027

We can re-tabulate it for the $x$ values only:

x:    0     1     2     3
p(x): 0.027 0.189 0.441 0.343

Now we can answer various questions. The probability that there are at most 2 Heads is
$$P(X \le 2) = P(X = 0 \text{ or } 1 \text{ or } 2) = p(0) + p(1) + p(2) = 0.657.$$
The probability that the number of Heads is strictly between 1 and 3 is
$$P(1 < X < 3) = P(X = 2) = p(2) = 0.441.$$
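These numbers can be reproduced by brute-force enumeration of the eight outcomes; the following Python sketch (not from the slides) assumes P(Head) = 0.7 as in the example.

```python
# Minimal sketch (not from the slides): enumerating the 8 outcomes of 3 tosses
# with P(Head) = 0.7 and tabulating p(x) for X = number of Heads.
from itertools import product

p_head = 0.7
pmf = {x: 0.0 for x in range(4)}
for outcome in product("HT", repeat=3):  # HHH, HHT, ..., TTT
    prob = 1.0
    for c in outcome:
        prob *= p_head if c == "H" else 1 - p_head
    pmf[outcome.count("H")] += prob

print(pmf)                       # {0: 0.027, 1: 0.189, 2: 0.441, 3: 0.343} (up to rounding)
print(pmf[0] + pmf[1] + pmf[2])  # P(X <= 2) = 0.657
print(pmf[2])                    # P(1 < X < 3) = 0.441
```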

Definition. The probability distribution or probability mass function (pmf) of a discrete rv is defined for every number $x$ by
$$p(x) = P(X = x) = P(\text{all } s \in S : X(s) = x).$$
In words, for every possible value $x$ of the random variable, the pmf specifies the probability of observing that value when the experiment is performed. (The conditions $p(x) \ge 0$ and $\sum_{\text{all possible } x} p(x) = 1$ are required for any pmf.)

Example 3.8: Six lots of components are ready to be shipped by a certain supplier. The number of defective components in each lot is as follows:

Lot:                  1 2 3 4 5 6
Number of defectives: 0 2 0 1 2 0

One of these lots is to be randomly selected for shipment to a particular customer. Let $X$ be the number of defectives in the selected lot. The three possible $X$ values are 0, 1 and 2. The pmf of $X$ is
p(0) = P(X = 0) = P(lot 1, 3 or 6 is selected) = 3/6 = 0.500
p(1) = P(X = 1) = P(lot 4 is selected) = 1/6 ≈ 0.167
p(2) = P(X = 2) = P(lot 2 or 5 is selected) = 2/6 ≈ 0.333
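The same pmf can be obtained by simple counting; the sketch below (not from the slides) tallies the listed defect counts over the six equally likely lots.

```python
# Minimal sketch (not from the slides): pmf of X in Example 3.8 computed by
# counting how many of the six equally likely lots have each defect count.
from collections import Counter
from fractions import Fraction

defects = [0, 2, 0, 1, 2, 0]  # defectives in lots 1..6
counts = Counter(defects)
pmf = {x: Fraction(n, len(defects)) for x, n in sorted(counts.items())}

print(pmf)  # p(0) = 1/2, p(1) = 1/6, p(2) = 1/3
```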

Example 3.10: Consider a group of five potential blood donors a, b, c, d, and e, of whom only a and b have type O+ blood. Five blood samples, one from each individual, will be typed in random order until an O+ individual is identified. Let the rv Y = the number of typings necessary to identify an O+ individual. What is the pmf of Y?

Example: Consider whether the next customer coming to a certain gas station buys gasoline or diesel. Let
$$X = \begin{cases} 1, & \text{if the customer purchases gasoline} \\ 0, & \text{if the customer purchases diesel} \end{cases}$$
If 30% of all customers in one month purchase diesel, then the pmf of $X$ is
p(0) = P(X = 0) = P(next customer buys diesel) = 0.3
p(1) = P(X = 1) = P(next customer buys gasoline) = 0.7
p(x) = P(X = x) = 0 for x ≠ 0 or 1

Example: Consider whether the next customer coming to a certain gas station buys gasoline or diesel. Let
$$X = \begin{cases} 1, & \text{if the customer purchases gasoline} \\ 0, & \text{if the customer purchases diesel} \end{cases}$$
If 100α% of all customers in one month purchase diesel, then the pmf of $X$ is
p(0) = P(X = 0) = P(next customer buys diesel) = α
p(1) = P(X = 1) = P(next customer buys gasoline) = 1 − α
p(x) = P(X = x) = 0 for x ≠ 0 or 1
where α is between 0 and 1.

Definition. Suppose $p(x)$ depends on a quantity that can be assigned any one of a number of possible values, with each different value determining a different probability distribution. Such a quantity is called a parameter of the distribution. The collection of all probability distributions for different values of the parameter is called a family of probability distributions.

For the previous example, the quantity α is a parameter. Each different value of α between 0 and 1 determines a different member of a family of distributions; two such members are
$$p(x) = \begin{cases} 0.3 & \text{if } x = 0 \\ 0.7 & \text{if } x = 1 \\ 0 & \text{otherwise} \end{cases} \qquad \text{and} \qquad p(x) = \begin{cases} 0.25 & \text{if } x = 0 \\ 0.75 & \text{if } x = 1 \\ 0 & \text{otherwise} \end{cases}$$

Example: Assume we are drawing cards, with replacement, from a well-shuffled deck of 100 cards. We keep drawing until we get a card of the target type. Let p = P(drawing a target card); i.e., there are 100p target cards in the deck. Assume the successive drawings are independent and define X = the number of drawings. Then
p(1) = P(X = 1) = P(target on the first draw) = p
p(2) = P(X = 2) = P(miss, then target) = (1 − p) · p
p(3) = P(X = 3) = P(miss, miss, then target) = (1 − p) · (1 − p) · p
...
A general formula would be
$$p(x) = \begin{cases} (1-p)^{x-1}\, p, & x = 1, 2, 3, \ldots \\ 0, & \text{otherwise} \end{cases}$$

Example: Assume we are drawing cards, with replacement, from a well-shuffled deck of 100 cards. We keep drawing until we get a card of the target type. Let p = P(drawing a target card); i.e., there are 100p target cards in the deck. Assume the successive drawings are independent and define X = the number of drawings.

If we know that there are 20 target cards, i.e. p = 0.2, what is the probability that we draw at most 3 times? More than 2 times?
P(X ≤ 3) = p(1) + p(2) + p(3) = 0.2 + 0.2 · 0.8 + 0.2 · (0.8)² = 0.488
P(X > 2) = p(3) + p(4) + p(5) + ··· = 1 − p(1) − p(2) = 1 − 0.2 − 0.2 · 0.8 = 0.64
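Both answers can be checked by evaluating the pmf formula directly; the sketch below (not from the slides) uses p = 0.2 as in the example.

```python
# Minimal sketch (not from the slides): geometric-type pmf p(x) = (1-p)^(x-1) * p
# with p = 0.2, checking P(X <= 3) and P(X > 2) from the example above.
p = 0.2

def pmf(x):
    return (1 - p) ** (x - 1) * p if x >= 1 else 0.0

p_at_most_3 = sum(pmf(x) for x in (1, 2, 3))
p_more_than_2 = 1 - pmf(1) - pmf(2)

print(p_at_most_3)    # 0.488 (up to rounding)
print(p_more_than_2)  # 0.64
```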

Definition. The cumulative distribution function (cdf) $F(x)$ of a discrete rv $X$ with pmf $p(x)$ is defined for every number $x$ by
$$F(x) = P(X \le x) = \sum_{y:\, y \le x} p(y).$$
For any number $x$, $F(x)$ is the probability that the observed value of $X$ will be at most $x$:
F(x) = P(X ≤ x) = P(X is less than or equal to x)
p(x) = P(X = x) = P(X is exactly equal to x)

Example 3.10 (continued):
$$F(y) = \begin{cases} 0 & \text{if } y < 1 \\ 0.4 & \text{if } 1 \le y < 2 \\ 0.7 & \text{if } 2 \le y < 3 \\ 0.9 & \text{if } 3 \le y < 4 \\ 1 & \text{if } y \ge 4 \end{cases}$$
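These cdf values can be verified by enumerating all orderings of the five donors; the following sketch (not from the slides) reproduces the pmf of Y and the jumps of F(y).

```python
# Minimal sketch (not from the slides): Example 3.10 by enumeration.
# Donors a..e are typed in a random order; only a and b are O+.
# Y = position of the first O+ donor in the ordering.
from fractions import Fraction
from itertools import permutations

donors = "abcde"
o_plus = {"a", "b"}
orders = list(permutations(donors))

pmf = {}
for order in orders:
    y = next(i for i, d in enumerate(order, start=1) if d in o_plus)
    pmf[y] = pmf.get(y, 0) + Fraction(1, len(orders))

print(dict(sorted(pmf.items())))  # p(1)=2/5, p(2)=3/10, p(3)=1/5, p(4)=1/10
cdf = 0
for y in sorted(pmf):
    cdf += pmf[y]
    print(y, float(cdf))          # 0.4, 0.7, 0.9, 1.0 -- matches F(y) above
```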

Example: Assume we are drawing cards, with replacement, from a well-shuffled deck of 100 cards. We keep drawing until we get a card of the target type. Let α = P(drawing a target card); i.e., there are 100α target cards in the deck. Assume the successive drawings are independent and define X = the number of drawings. The pmf would be
$$p(x) = \begin{cases} (1-\alpha)^{x-1}\, \alpha, & x = 1, 2, 3, \ldots \\ 0, & \text{otherwise} \end{cases}$$
Then for any positive integer $x$, we have
$$F(x) = \sum_{y \le x} p(y) = \sum_{y=1}^{x} (1-\alpha)^{y-1}\, \alpha = \alpha \sum_{y=0}^{x-1} (1-\alpha)^{y} = 1 - (1-\alpha)^{x},$$
so
$$F(x) = \begin{cases} 1 - (1-\alpha)^{x}, & x \ge 1 \\ 0, & x < 1 \end{cases}$$
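As a quick numerical check (not from the slides), the closed form 1 − (1 − α)^x can be compared with the partial sums of the pmf; α = 0.2 is assumed here purely for illustration.

```python
# Minimal sketch (not from the slides): checking F(x) = 1 - (1 - a)^x against the
# partial sums of the pmf p(y) = (1 - a)^(y-1) * a, for an assumed a = 0.2.
a = 0.2
for x in range(1, 6):
    summed = sum((1 - a) ** (y - 1) * a for y in range(1, x + 1))
    closed_form = 1 - (1 - a) ** x
    print(x, round(summed, 6), round(closed_form, 6))  # the two columns agree
```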


pmf ⇒ cdf:
$$F(x) = P(X \le x) = \sum_{y:\, y \le x} p(y)$$
It is also possible to go from cdf to pmf:
$$p(x) = F(x) - F(x-)$$
where $x-$ represents the largest possible $X$ value that is strictly less than $x$.

Proposition. For any two numbers $a$ and $b$ with $a \le b$,
$$P(a \le X \le b) = F(b) - F(a-)$$
where $a-$ represents the largest possible $X$ value that is strictly less than $a$. In particular, if the only possible values are integers and if $a$ and $b$ are integers, then
$$P(a \le X \le b) = P(X = a \text{ or } a+1 \text{ or } \ldots \text{ or } b) = F(b) - F(a-1).$$
Taking $a = b$ yields $P(X = a) = F(a) - F(a-1)$ in this case.
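The sketch below (not from the slides) illustrates the integer-valued case of this proposition using the three-toss pmf from earlier; the choice a = 1, b = 2 is just for illustration.

```python
# Minimal sketch (not from the slides): P(a <= X <= b) = F(b) - F(a - 1) for an
# integer-valued rv, using the earlier 3-toss pmf as input.
pmf = {0: 0.027, 1: 0.189, 2: 0.441, 3: 0.343}

def F(x):
    # cdf: sum of p(y) over all possible y <= x
    return sum(p for y, p in pmf.items() if y <= x)

a, b = 1, 2
print(F(b) - F(a - 1))                       # P(1 <= X <= 2) = 0.63 (up to rounding)
print(sum(pmf[x] for x in range(a, b + 1)))  # same value, summed directly
```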

Example (Problem 23): A consumer organization that evaluates new automobiles customarily reports the number of major defects in each car examined. Let $X$ denote the number of major defects in a randomly selected car of a certain type. The cdf of $X$ is as follows:
$$F(x) = \begin{cases} 0 & x < 0 \\ 0.06 & 0 \le x < 1 \\ 0.19 & 1 \le x < 2 \\ 0.39 & 2 \le x < 3 \\ 0.67 & 3 \le x < 4 \\ 0.92 & 4 \le x < 5 \\ 0.97 & 5 \le x < 6 \\ 1 & x \ge 6 \end{cases}$$
Calculate the following probabilities directly from the cdf: (a) $p(2)$, (b) $P(X > 3)$ and (c) $P(2 \le X < 5)$.
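One way to organize the computation is to code the cdf as a step function and read the probabilities off it; the following sketch (not from the slides) does exactly that, with the values implied by the cdf noted in comments.

```python
# Minimal sketch (not from the slides): evaluating the Problem 23 cdf as a step
# function and reading off the requested probabilities.
def F(x):
    steps = [(6, 1.0), (5, 0.97), (4, 0.92), (3, 0.67), (2, 0.39), (1, 0.19), (0, 0.06)]
    for cutoff, value in steps:
        if x >= cutoff:
            return value
    return 0.0

print(F(2) - F(1))  # (a) p(2) = P(X = 2) = 0.39 - 0.19 = 0.20
print(1 - F(3))     # (b) P(X > 3) = 1 - 0.67 = 0.33
print(F(4) - F(1))  # (c) P(2 <= X < 5) = P(2 <= X <= 4) = 0.92 - 0.19 = 0.73
```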