
Lecture 6. Probability events

Definition 1. The sample space, S, of a probability experiment is the collection of all possible outcomes of the experiment. One such outcome is called a simple event. An event is a collection of several outcomes. Events are denoted by capital letters, and you can think of them as sets.

Definition 2. The probability of an event E, denoted P(E), is the likelihood of that event occurring.

The probability of an event has to satisfy the following conditions:
1) 0 ≤ P(E) ≤ 1.
2) If an event is impossible, its probability is 0.
3) If an event is a certainty, its probability is 1.
4) If S = {e_1, e_2, ..., e_n}, then P(e_1) + P(e_2) + ... + P(e_n) = 1.

Definition 3. Let E and F be two events. E and F (E ∩ F) is the event consisting of the simple events that belong to both E and F. If the events E and F have no simple events in common, we say that they are disjoint (mutually exclusive). E or F (E ∪ F) is the event consisting of the simple events that belong to E or F or both.

Definition 4. Let S denote the sample space of a probability experiment and let E denote an event. The complement of E, denoted E^c, consists of all simple events in the sample space S that are not simple events of E.

Proposition 5.
a) If the events E, F, G, ... are mutually disjoint, then P(E or F or G or ...) = P(E) + P(F) + P(G) + ...
b) For any two events E and F, P(E or F) = P(E) + P(F) − P(E and F).
c) For any event E, if E^c is its complement, then P(E^c) = 1 − P(E).
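Not part of the original notes, but a minimal Python sketch that checks parts (b) and (c) of Proposition 5 on one fair die, with the illustrative events E = "the roll is even" and F = "the roll is at least 4":

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # sample space: one fair die, equally likely outcomes
E = {2, 4, 6}            # illustrative event: "the roll is even"
F = {4, 5, 6}            # illustrative event: "the roll is at least 4"

def P(event):
    """Classical probability: favourable outcomes / total outcomes."""
    return Fraction(len(event), len(S))

# b) Addition rule: P(E or F) = P(E) + P(F) - P(E and F)
assert P(E | F) == P(E) + P(F) - P(E & F)   # E | F is the set union "E or F"
print(P(E | F))          # 2/3

# c) Complement rule: P(E^c) = 1 - P(E), where E^c = S - E
assert P(S - E) == 1 - P(E)
print(P(S - E))          # 1/2
```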

How do we compute probabilities? The classical method of computing probabilities requires equally likely outcomes. An experiment has equally likely outcomes if all simple events have the same probability of occurring.

Proposition 6. If an experiment has equally likely simple events, then the probability of an event E occurring is computed by:

P(E) = (number of ways that E can occur) / (number of possible outcomes).
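As an illustration (not part of the notes), a short Python sketch applying the classical method to the experiment of rolling two fair dice, with the hypothetical event "the sum is 7":

```python
from fractions import Fraction
from itertools import product

# Equally likely sample space: all ordered pairs from two fair dice.
S = list(product(range(1, 7), repeat=2))        # 36 outcomes

# Hypothetical event E: the two dice sum to 7.
E = [outcome for outcome in S if sum(outcome) == 7]

# P(E) = (number of ways E can occur) / (number of possible outcomes)
prob = Fraction(len(E), len(S))
print(prob)            # 1/6  (6 favourable outcomes out of 36)
```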

Sometimes it is difficult or impossible to count the number of ways an event E could occur. In that case we can estimate the probability from the outcomes of a probability experiment: the probability of an event E is approximately the number of times the event E is observed, divided by the number of repetitions of the experiment,

P(E) ≈ relative frequency = (frequency of E) / (number of trials of the experiment).
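A minimal simulation sketch (not from the notes) estimating the same two-dice probability by relative frequency; the number of trials and the seed are arbitrary choices:

```python
import random

random.seed(0)                     # arbitrary seed, just for reproducible runs
trials = 100_000

# Repeat the two-dice experiment many times and count how often the sum is 7.
hits = sum(1 for _ in range(trials)
           if random.randint(1, 6) + random.randint(1, 6) == 7)

relative_frequency = hits / trials
print(relative_frequency)          # close to 1/6 ≈ 0.1667 for a large number of trials
```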

Definition 7. If E and F are any two events, then the probability of event E given the event F (the conditional probability of E given F) is computed by:

P(E | F) = P(E and F) / P(F).

Likewise, the probability of the event F occurring given the occurrence of event E is found by dividing the probability of E and F by the probability of E: P(F | E) = P(E and F) / P(E). As a consequence we have the general multiplication rule:

P(E and F) = P(E) · P(F | E).
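Continuing the two-dice illustration (my own example, not the author's), a sketch that computes a conditional probability and verifies the multiplication rule:

```python
from fractions import Fraction
from itertools import product

S = set(product(range(1, 7), repeat=2))     # two fair dice, 36 outcomes
E = {o for o in S if sum(o) == 7}           # hypothetical event: "the sum is 7"
F = {o for o in S if o[0] == 3}             # hypothetical event: "the first die shows 3"

def P(event):
    return Fraction(len(event), len(S))

# Conditional probability: P(E | F) = P(E and F) / P(F)
p_E_given_F = P(E & F) / P(F)
print(p_E_given_F)                          # 1/6

# General multiplication rule: P(E and F) = P(F) * P(E | F)
assert P(E & F) == P(F) * p_E_given_F
```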

Definition 8. Two events E and F are independent if the occurrence of event E in a probability experiment does not affect the probability of event F. In mathematical notation, two events are independent if

P(E | F) = P(E) or P(F | E) = P(F).

If the events are not independent, we say they are dependent.

Multiplication rule for 2 independent events. Two events are independent if and only if (iff)

P(E and F) = P(E) · P(F).

This means that if two events are independent then it must be the case that P(E and F) = P(E)P(F), and conversely, if P(E and F) = P(E)P(F) holds then the events E and F are independent.

Multiplication rule for n independent events. If events E, F, G, ... are independent, then

P(E and F and G and ...) = P(E) · P(F) · P(G) · ...
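For instance (a made-up illustration, assuming a fair coin), the probability that n = 5 independent flips all land heads:

```python
# P(H and H and ... and H) = (1/2) * (1/2) * ... * (1/2) = (1/2)**n
n = 5
p_head = 0.5
print(p_head ** n)        # 0.03125
```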

Counting techniques

Multiplication principle. If a task consists of a sequence of choices in which there are p selections for the first choice, q selections for the second choice, r selections for the third choice, and so on, then the task of making these selections can be done in p · q · r · ... different ways.
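A hypothetical illustration (the task and the counts are made up), checked by brute-force enumeration:

```python
from itertools import product

# Hypothetical task: pick an appetizer (p = 3 choices), a main course
# (q = 4 choices), and a dessert (r = 2 choices).
p, q, r = 3, 4, 2

# Multiplication principle: the sequence of choices can be made in p * q * r ways.
print(p * q * r)                                          # 24

# The same count by enumerating every possible sequence of choices.
print(len(list(product(range(p), range(q), range(r)))))   # 24
```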

Number of combinations of n distinct objects taken r at a time. The number of different arrangements of n objects using r ≤ n of them, in which
1) the n objects are distinct,
2) once an object is used it cannot be repeated (without replacement), and
3) order is not important,
is given by the formula

C(n, r) = n! / (r! (n − r)!).
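A quick check of the formula (illustrative values n = 10, r = 3), comparing it with the standard-library function math.comb:

```python
from math import comb, factorial

n, r = 10, 3        # example values: choose 3 objects out of 10 distinct objects

# C(n, r) = n! / (r! (n - r)!)
by_formula = factorial(n) // (factorial(r) * factorial(n - r))
print(by_formula)   # 120
print(comb(n, r))   # 120 -- the same value from the standard library
```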

Probability Distributions

In certain situations, some attribute of the outcome may hold more interest for the experimenter than the outcome itself. For example, a player of the game of craps may be concerned only about throwing a 7, and not whether the 7 was the result of a 5 and a 2, a 4 and a 3, or a 6 and a 1.

Definition 9. A random variable (r.v.) is a numerical measure of the outcome of a probability experiment, so its value is determined by chance. Random variables are denoted using capital letters such as X, Y, etc.

Definition 10. A discrete random variable is a random variable that has either a finite number of possible values or a countable number of possible values. A continuous random variable is a random variable that has an uncountably infinite number of possible values.

Because the value of a r.v. is determined by chance, probabilities are assigned to its possible values. A table, graph, or formula containing all the possible values a random variable can take, together with the corresponding probabilities, forms a probability distribution. In the case of a discrete probability distribution, the following conditions must be satisfied:
1) Σ P(X = x) = 1, where the sum runs over all possible values x, and
2) 0 ≤ P(X = x) ≤ 1,
where P(X = x) denotes the probability that the random variable X takes the value x.

Mean and variance of a discrete random variable

The mean, or expected value, of a discrete random variable is given by the formula

µ_X = E(X) = Σ x · P(X = x),

where x ranges over the values of the random variable and P(X = x) is the probability of observing the value x. The variance of a discrete r.v. is given by

σ²_X = Σ (x − µ_X)² · P(X = x),

and the standard deviation is the square root of the variance, i.e. σ_X = √(σ²_X).
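A short sketch (my own, with a made-up probability table) that first verifies the two conditions of a discrete probability distribution and then computes the mean, variance, and standard deviation from the formulas above:

```python
from math import sqrt, isclose

# Hypothetical discrete distribution given as a table {value: probability}.
dist = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

# The table must satisfy: each probability in [0, 1], and the probabilities sum to 1.
assert all(0 <= p <= 1 for p in dist.values())
assert isclose(sum(dist.values()), 1.0)

# Mean:      mu = sum of x * P(X = x)
mu = sum(x * p for x, p in dist.items())

# Variance:  sigma^2 = sum of (x - mu)^2 * P(X = x)
var = sum((x - mu) ** 2 * p for x, p in dist.items())

print(mu, var, sqrt(var))     # approximately 1.7, 0.81, 0.9
```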

Binomial distribution

When do we deal with a binomial trial or distribution? An experiment is said to be a binomial experiment if:
1) The experiment is performed a fixed number of times, usually denoted by n. Each repetition is called a trial.
2) The trials are independent (the outcome of one does not depend on the others).
3) For each trial, there are 2 mutually exclusive outcomes: success or failure.
4) The probability of success is fixed for each trial of the experiment. The probability of success is p, while that of failure is 1 − p.
5) We say that a r.v. X is binomially distributed if X counts the number of successes in n independent trials of the experiment. So the possible values for X are 0, 1, 2, ..., n.

Mathematicians showed that the probability of obtaining x successes in n independent trials of a binomial experiment, where the probability of success is p, is given by

P(X = x) = C(n, x) · p^x · (1 − p)^(n − x),   x = 0, 1, 2, ..., n.

They also showed that such a binomial random variable has mean

µ_X = E(X) = np

and standard deviation given by the formula

σ_X = √(np(1 − p)).
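A sketch of these formulas in code (illustrative values n = 10, p = 0.3, not from the notes):

```python
from math import comb, sqrt

n, p = 10, 0.3        # hypothetical binomial experiment: 10 trials, P(success) = 0.3

def binom_pmf(x, n, p):
    """P(X = x) = C(n, x) * p**x * (1 - p)**(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# The probabilities over x = 0, 1, ..., n sum to 1, as any distribution must.
print(sum(binom_pmf(x, n, p) for x in range(n + 1)))    # approximately 1.0

print(binom_pmf(3, n, p))               # P(X = 3) ≈ 0.2668
print(n * p, sqrt(n * p * (1 - p)))     # mean = 3.0, standard deviation ≈ 1.449
```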

Continuous r.v.'s. Normal Distribution

In the case of continuous r.v.'s, computing probabilities is not that easy because the r.v. takes infinitely many values. That is why we look at intervals of values the r.v. might take.

Probability density function. A probability density function is a function used to compute probabilities of continuous r.v.'s. It has to satisfy the following two properties:
(1.) The area under the graph of the function over all possible values of the r.v. must equal one.
(2.) The graph of the function must lie on or above the x-axis for all possible values of the r.v.

Property: The probability of observing a value of the r.v. in a certain interval equals the area under the graph of the density function of that r.v. over that interval.

A continuous r.v. is normally distributed, or has a normal probability distribution, if its relative frequency histogram has the shape of a normal curve (bell-shaped and symmetric).
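A rough numerical sketch (not from the notes) of the area property, using the standard normal density as the example; the midpoint-rule integrator is only for illustration:

```python
from math import exp, pi, sqrt

def standard_normal_pdf(x):
    """Density of the standard normal distribution."""
    return exp(-x * x / 2) / sqrt(2 * pi)

def area(f, a, b, steps=10_000):
    """Approximate the area under f between a and b (midpoint rule)."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

# P(-1 <= Z <= 1) is the area under the curve between -1 and 1: about 0.6827.
print(area(standard_normal_pdf, -1, 1))

# The total area over (effectively) all values is 1, as a density requires.
print(area(standard_normal_pdf, -10, 10))
```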

Area and the normal distribution. If the r.v. X is normally distributed, then the area under the normal curve for any range of values of X represents either:
1) the proportion of the population with the characteristics described by the range, or
2) the probability that a randomly chosen individual from the population will have the characteristics described by the range.

Finding the area under the density graph of a normally distributed r.v. is not an easy task; it requires a lot of calculus. One way of avoiding this is to use tables that give us these areas (probabilities). But for each µ and σ we would need a new table. How can we avoid this? By transforming all these r.v.'s into a standard one.

Standardizing a normal r.v. Suppose that the r.v. X is normally distributed with mean µ and standard deviation σ. Then the r.v.

Z = (X − µ) / σ

is normally distributed with mean µ = 0 and standard deviation σ = 1. Such an r.v. is said to have the standard normal distribution.