THE UNIVERSITY OF HONG KONG DEPARTMENT OF STATISTICS AND ACTUARIAL SCIENCE


STAT1301 PROBABILITY AND STATISTICS I
EXAMPLE CLASS 2

Review

Definition of conditional probability
For any two events A and B, the conditional probability of A given the occurrence of B is written P(A | B) and is defined as

    P(A | B) = P(A ∩ B) / P(B),

provided that P(B) > 0.

Multiplication theorem
(a) For any two events A and B with P(B) > 0, P(A ∩ B) = P(B) P(A | B).
(b) For any three events A, B, C with P(B ∩ C) > 0, P(A ∩ B ∩ C) = P(C) P(B | C) P(A | B ∩ C).

Independence
(a) Two events A and B are called independent if and only if P(A ∩ B) = P(A) P(B). If P(A) > 0, then A and B are independent if and only if P(B | A) = P(B).
(b) The events A1, A2, ..., Ak are (mutually) independent if and only if the probability of the intersection of any combination of them equals the product of the probabilities of the corresponding single events. For example, A1, A2, A3 are independent if and only if

    P(A1 ∩ A2) = P(A1) P(A2)
    P(A1 ∩ A3) = P(A1) P(A3)
    P(A2 ∩ A3) = P(A2) P(A3)
    P(A1 ∩ A2 ∩ A3) = P(A1) P(A2) P(A3).

Bayes' theorem (Bayes' rule, Bayes' law)
For any two events A and B with P(A) > 0 and P(B) > 0,

    P(B | A) = P(A | B) P(B) / P(A).

More generally, if B1, B2, ..., Bk are mutually exclusive and exhaustive events (i.e. a partition of the sample space), and A is any event with P(A) > 0, then for any Bj,

    P(Bj | A) = P(A | Bj) P(Bj) / P(A) = P(Bj) P(A | Bj) / [Σ_{i=1}^k P(Bi) P(A | Bi)],

where k may also be infinite.

Law of total probability
(a) If 0 < P(B) < 1, then P(A) = P(A | B) P(B) + P(A | B^c) P(B^c) for any event A.
(b) If B1, B2, ..., Bk are mutually exclusive and exhaustive events (i.e. a partition of the sample space), then for any event A,

    P(A) = Σ_{j=1}^k P(A | Bj) P(Bj),

where k may also be infinite.
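The two results above can be illustrated numerically. In the sketch below, the three-event partition, its prior probabilities, and the conditional probabilities P(A | Bj) are all made-up illustrative values, not taken from the notes:

```python
# Hypothetical partition B1, B2, B3 and event A; all numbers are illustrative.
prior = [0.5, 0.3, 0.2]          # P(B1), P(B2), P(B3): a partition, sums to 1
likelihood = [0.1, 0.4, 0.8]     # P(A | B1), P(A | B2), P(A | B3)

# Law of total probability: P(A) = sum_j P(A | Bj) P(Bj)
p_a = sum(l * p for l, p in zip(likelihood, prior))

# Bayes' theorem: P(Bj | A) = P(A | Bj) P(Bj) / P(A)
posterior = [l * p / p_a for l, p in zip(likelihood, prior)]

print(round(p_a, 2))                       # 0.1*0.5 + 0.4*0.3 + 0.8*0.2 = 0.33
print([round(q, 4) for q in posterior])    # posteriors over the partition
print(abs(sum(posterior) - 1.0) < 1e-12)   # True: they sum to 1
```

Because the Bj partition the sample space, the posteriors always sum to 1, which is a quick consistency check on any Bayes computation.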

Problems

Problem 1
A and B are two events. Suppose that P(A | B) = 0.6, P(B | A) = 0.3, and P(A ∪ B) = 0.72. Let P(A) = a.
(a) Express P(A ∩ B) and P(B) in terms of a.
(b) Using the results of (a), or otherwise, find the value of a.
(c) Are A and B independent events? Explain your answer briefly.

Solution
(a) P(A ∩ B) = P(B | A) P(A) = 0.3a. Since P(A | B) P(B) = P(A ∩ B), we have 0.6 P(B) = 0.3a, so P(B) = 0.5a.
(b) P(A ∪ B) = P(A) + P(B) − P(A ∩ B), so 0.72 = a + 0.5a − 0.3a = 1.2a, giving a = 0.6.
(c) P(A | B) = 0.6 = P(A) and P(B | A) = 0.3 = 0.5 × 0.6 = P(B); therefore A and B are independent events.

Problem 2
A and B are two events. Suppose that P(B^c | A) = 3/4, P(A^c | B) = 3/5, and P(A^c) = 2/5, where A^c and B^c are the complementary events of A and B respectively. Let P(B) = p, where 0 < p < 1.
(a) Find P(A ∩ B^c).
(b) Express P(A^c ∩ B) in terms of p.

(c) Using the fact that A^c ∪ B is the complementary event of A ∩ B^c, or otherwise, find the value of p.
(d) Are A and B mutually exclusive? Explain your answer.

Solution
(a) P(A ∩ B^c) = P(B^c | A) P(A) = P(B^c | A)[1 − P(A^c)] = (3/4)(1 − 2/5) = 9/20.
(b) P(A^c ∩ B) = P(A^c | B) P(B) = (3/5)p.
(c) Since P(A^c ∪ B) = 1 − P(A ∩ B^c),

    P(A^c) + P(B) − P(A^c ∩ B) = 1 − P(A ∩ B^c)
    2/5 + p − (3/5)p = 1 − 9/20
    (2/5)p = 3/20
    p = 3/8.

(d) Since P(A ∩ B) + P(A^c ∩ B) = P(B), we have P(A ∩ B) + (3/5)(3/8) = 3/8, so P(A ∩ B) = 3/20 ≠ 0; therefore A and B are not mutually exclusive.

Problem 3
(a) Suppose that C1, C2, ..., Ck are (mutually) independent events with respective probabilities p1, p2, ..., pk. Argue that the probability that at least one of C1, C2, ..., Ck happens equals

    1 − (1 − p1)(1 − p2) ⋯ (1 − pk).

Hint: (C1 ∪ C2 ∪ ⋯ ∪ Ck)^c = C1^c ∩ C2^c ∩ ⋯ ∩ Ck^c.

(b) Let C1, C2, C3 be independent events with probabilities 1/2, 1/3, 1/4 respectively. Compute
(i) P(C1 ∪ C2 ∪ C3)
(ii) P[(C1^c ∩ C2^c) ∪ C3]

Solution
(a) First consider C1 and C2. The probability that at least one of them happens is P(C1 ∪ C2), and

    P(C1 ∪ C2) = 1 − P((C1 ∪ C2)^c) = 1 − P(C1^c ∩ C2^c).

Since C1 and C2 are independent, P(C1 ∩ C2) = P(C1) P(C2), and since P(C1 ∪ C2) = P(C1) + P(C2) − P(C1 ∩ C2), we get

    P(C1^c ∩ C2^c) = P((C1 ∪ C2)^c)
                   = 1 − P(C1 ∪ C2)
                   = 1 − P(C1) − P(C2) + P(C1 ∩ C2)
                   = 1 − P(C1) − P(C2) + P(C1) P(C2)
                   = (1 − P(C1))(1 − P(C2))
                   = P(C1^c) P(C2^c).

Therefore C1^c and C2^c are independent, and P(C1 ∪ C2) = 1 − P(C1^c) P(C2^c) = 1 − (1 − p1)(1 − p2). By mathematical induction, the same argument shows that C1^c, C2^c, ..., Ck^c are independent, so P(C1^c ∩ C2^c ∩ ⋯ ∩ Ck^c) = P(C1^c) P(C2^c) ⋯ P(Ck^c). Therefore the probability that at least one of C1, C2, ..., Ck happens is

    P(C1 ∪ C2 ∪ ⋯ ∪ Ck) = 1 − P[(C1 ∪ C2 ∪ ⋯ ∪ Ck)^c]
                        = 1 − P(C1^c ∩ C2^c ∩ ⋯ ∩ Ck^c)
                        = 1 − P(C1^c) P(C2^c) ⋯ P(Ck^c)
                        = 1 − (1 − p1)(1 − p2) ⋯ (1 − pk).

(b) (i) P(C1 ∪ C2 ∪ C3) = 1 − (1 − P(C1))(1 − P(C2))(1 − P(C3)) = 1 − (1/2)(2/3)(3/4) = 3/4.
(ii) P[(C1^c ∩ C2^c) ∪ C3] = 1 − [1 − P(C1^c ∩ C2^c)][1 − P(C3)] = 1 − [1 − P(C1^c) P(C2^c)][1 − P(C3)] = 1 − [1 − (1/2)(2/3)](1 − 1/4) = 1 − (2/3)(3/4) = 1/2.
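The formula in (a) is easy to spot-check by simulation. The sketch below (my own, seeded for reproducibility) uses the three probabilities from part (b):

```python
import random

random.seed(0)
p = [0.5, 1 / 3, 0.25]    # probabilities of the independent events C1, C2, C3

# Exact value: 1 - (1 - p1)(1 - p2)(1 - p3), which is 3/4 for these numbers
exact = 1.0
for pi in p:
    exact *= 1.0 - pi
exact = 1.0 - exact

trials = 200_000
hits = 0
for _ in range(trials):
    # realize the three independent events; count trials where at least one occurs
    if any(random.random() < pi for pi in p):
        hits += 1

print(round(exact, 10))    # 0.75
print(round(hits / trials, 3))
```

The empirical frequency should land within Monte Carlo error (about 0.001 here) of the exact 3/4.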

Problem 4
A small plane has gone down, and the search is organized into three regions. Starting with the likeliest, they are:

    Region      Initial chance plane is there    Chance of being overlooked in the search
    Mountains   0.5                              0.3
    Prairie     0.3                              0.2
    Sea         0.2                              0.9

The last column gives the chance that if the plane is there, the search will not find it. For example, if it went down at sea, there is a 90% chance it will have disappeared or otherwise not be found. Since the pilot is not equipped to survive long after a crash in the mountains, it is particularly important to determine the chance that the plane went down in the mountains.
(a) Before any search is started, what is the chance that the plane is in the mountains?
(b) The initial search was in the mountains, and the plane was not found. Now what is the chance that the plane is nevertheless in the mountains?
(c) The search was continued over the other two regions, and unfortunately the plane was not found anywhere. Finally, what is the chance that the plane is in the mountains?
(d) Describe how and why the chances changed from (a) to (b) to (c).

Solution
Let M, P, S be the events that the plane went down in the mountains, prairie, and sea respectively, and let OM, OP, OS be the events that the plane is not found when the mountains, prairie, and sea are searched. Then

    P(M) = 0.5,  P(P) = 0.3,  P(S) = 0.2,
    P(OM | M) = 0.3,  P(OP | M) = P(OS | M) = 1,
    P(OP | P) = 0.2,  P(OM | P) = P(OS | P) = 1,
    P(OS | S) = 0.9,  P(OM | S) = P(OP | S) = 1.

(a) Before any search, the chance is simply P(M) = 0.5.
(b) By Bayes' theorem,

    P(M | OM) = P(OM | M) P(M) / [P(OM | M) P(M) + P(OM | P) P(P) + P(OM | S) P(S)]
              = (0.3)(0.5) / [(0.3)(0.5) + 0.3 + 0.2] = 0.15/0.65 ≈ 0.2308.
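The update in (b) can be reproduced directly from the table; in this small sketch the region labels are my own naming choice:

```python
prior = {"mountains": 0.5, "prairie": 0.3, "sea": 0.2}     # initial chances
overlook = {"mountains": 0.3, "prairie": 0.2, "sea": 0.9}  # P(not found | plane there)

# Unsuccessful search of the mountains only: "not found in the mountains" is
# certain (probability 1) if the plane is actually in the prairie or at sea.
evidence = (overlook["mountains"] * prior["mountains"]
            + 1.0 * prior["prairie"]
            + 1.0 * prior["sea"])
posterior_m = overlook["mountains"] * prior["mountains"] / evidence
print(round(evidence, 2))     # 0.65
print(round(posterior_m, 4))  # 0.2308
```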

(c) By the law of total probability,

    P(OM ∩ OP ∩ OS) = P(OM ∩ OP ∩ OS | M) P(M) + P(OM ∩ OP ∩ OS | P) P(P) + P(OM ∩ OP ∩ OS | S) P(S)
                    = P(OM | M) P(M) + P(OP | P) P(P) + P(OS | S) P(S)
                    = (0.3)(0.5) + (0.2)(0.3) + (0.9)(0.2) = 0.39,

so

    P(M | OM ∩ OP ∩ OS) = P(OM | M) P(M) / P(OM ∩ OP ∩ OS) = (0.3)(0.5)/0.39 ≈ 0.3846.

(d) Prior to any search, the probability is 0.5. After the initial search, the plane was not found in the mountains, so given this information the probability decreased to 0.2308. However, after the continued search the plane was also not found in the other regions, so the probability increased to 0.3846 in light of this additional information.

Problem 5
Let S = {1, 2, ..., n} and suppose that A and B are, independently, equally likely to be any of the 2^n subsets (including the empty set and S itself) of S.
(a) Show that P(A ⊆ B) = (3/4)^n.
Hint: Let N(B) denote the number of elements in B and use

    P(A ⊆ B) = Σ_{i=0}^n P(A ⊆ B | N(B) = i) P(N(B) = i).

(b) Show that P(A ∩ B = ∅) = (3/4)^n.

Solution
(a) Let N(B) denote the number of elements in B. Given N(B) = i, the event A ⊆ B occurs if and only if A is one of the 2^i subsets of B, so

    P(A ⊆ B | N(B) = i) = 2^i / 2^n = 2^{i−n},   P(N(B) = i) = C(n, i) / 2^n.

By the law of total probability,

    P(A ⊆ B) = Σ_{i=0}^n P(A ⊆ B | N(B) = i) P(N(B) = i)
             = Σ_{i=0}^n 2^{i−n} C(n, i) / 2^n
             = 2^{−2n} Σ_{i=0}^n C(n, i) 2^i
             = 2^{−2n} (2 + 1)^n    (binomial theorem)
             = (3/4)^n.
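The closed form in (a) can be spot-checked by simulation. This sketch (my own, with an arbitrary n and seed) draws A and B uniformly over the 2^n subsets of {1, ..., n} by including each element independently with probability 1/2:

```python
import random

random.seed(1)
n = 6
exact = (3 / 4) ** n          # (3/4)^n, about 0.178 for n = 6

trials = 100_000
subset_hits = 0
for _ in range(trials):
    # A and B uniform over all 2^n subsets: include each element w.p. 1/2
    a = {i for i in range(n) if random.random() < 0.5}
    b = {i for i in range(n) if random.random() < 0.5}
    subset_hits += a <= b     # set comparison a <= b tests the event A ⊆ B
print(round(exact, 3))        # 0.178
print(round(subset_hits / trials, 3))
```

Including each element with probability 1/2 independently is exactly the uniform distribution over subsets, which is why this sampling scheme is valid.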

(b) P(A ∩ B = ∅) = P(A ⊆ B^c) = (3/4)^n, since B^c, like B, is equally likely to be any of the 2^n subsets of S.

Problem 6 (The Monty Hall Problem)
Suppose you are on a game show, and you are given the choice of three boxes. One box contains the key to a new BMW; the others are empty. You pick box A. Then the host, Monty Hall, who knows what is inside the boxes, opens another box, say box B, which is empty. He then says to you, "Do you want to abandon your box and pick box C?" Is it to your advantage to switch your choice?

Solution
Let A, B, C denote the events that the key is in box A, B, C respectively, and let SB = {host chooses box B}. We know that

    P(A) = P(B) = P(C) = 1/3,
    P(SB | A) = 1/2,  P(SB | A^c ∩ B) = 0,  P(SB | A^c ∩ B^c) = 1.

Then

    P(SB ∩ B^c | A^c) = P(SB | B^c ∩ A^c) P(B^c | A^c) = 1 × 1/2 = 1/2,
    P(SB ∩ B^c | A) = P(SB | B^c ∩ A) P(B^c | A) = 1/2 × 1 = 1/2.

Therefore

    P(A | SB ∩ B^c) = P(SB ∩ B^c | A) P(A) / [P(SB ∩ B^c | A^c) P(A^c) + P(SB ∩ B^c | A) P(A)]
                    = (1/2 × 1/3) / (1/2 × 2/3 + 1/2 × 1/3) = 1/3.

Since P(B | SB ∩ B^c) = 0, it follows that P(C | SB ∩ B^c) = 1 − 1/3 = 2/3, so it is to your advantage to switch.
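The 1/3 versus 2/3 split can also be seen by simulating the game. This is a minimal sketch of my own; the box names and the fixed initial pick of box A are modeling choices, matching the problem statement:

```python
import random

random.seed(2)

def play(switch, rng=random):
    """One round of the Monty Hall game; returns True if the player wins the key."""
    boxes = ["A", "B", "C"]
    key = rng.choice(boxes)          # key placed uniformly at random
    pick = "A"                       # the player always starts with box A
    # the host opens an empty box other than the player's pick
    openable = [b for b in boxes if b != pick and b != key]
    opened = rng.choice(openable)
    if switch:
        # switch to the remaining unopened box
        pick = next(b for b in boxes if b != pick and b != opened)
    return pick == key

trials = 100_000
stay_wins = sum(play(switch=False) for _ in range(trials)) / trials
switch_wins = sum(play(switch=True) for _ in range(trials)) / trials
print(round(stay_wins, 2), round(switch_wins, 2))   # close to 0.33 and 0.67
```

Staying wins only when the initial pick held the key (probability 1/3); switching wins in exactly the complementary cases, which matches the conditional-probability calculation above.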