CMPSCI 240: Reasoning Under Uncertainty

CMPSCI 240: Reasoning Under Uncertainty Lecture 5 Prof. Hanna Wallach wallach@cs.umass.edu February 7, 2012

Reminders
Pick up a copy of B&T
Check the course website: http://www.cs.umass.edu/~wallach/courses/s12/cmpsci240/
First homework is due on Friday

Recap

Last Time: Counting
Sampling k of n objects with replacement: n^k
Permutations of n objects: n!
k-permutations of k of n objects: n! / (n - k)!
Combinations of k of n objects: (n choose k) = n! / (k! (n - k)!)
Partitions of n objects into r subsets of sizes n_1, ..., n_r: n! / (n_1! n_2! ... n_r!)
Challenge: try to show this yourself using an r-stage process!
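As a quick illustration (not part of the original slides), a minimal Python sketch of these counting formulas; the values n = 6, k = 3 and the subset sizes are arbitrary:

    from math import comb, factorial, prod

    n, k = 6, 3                                   # illustrative values only
    print(n ** k)                                 # sampling with replacement: n^k
    print(factorial(n))                           # permutations of n objects: n!
    print(factorial(n) // factorial(n - k))       # k-permutations: n! / (n - k)!
    print(comb(n, k))                             # combinations: n! / (k! (n - k)!)

    sizes = [3, 2, 1]                             # partition of n = 6 objects into r = 3 subsets
    print(factorial(n) // prod(factorial(s) for s in sizes))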

Less Exciting Examples of Combinations e.g., a system contains 2x disks divided into x pairs, where each pair of disks contains the same data. If one of the disks in a pair fails, the data can be recovered, but if both disks fail, it cannot. Suppose two random disks fail. What is the probability that some data is inaccessible?
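One way to sanity-check your answer is brute-force enumeration. The sketch below assumes x = 4 pairs (the slide leaves x symbolic) and compares the enumeration to the closed form x / C(2x, 2) = 1 / (2x - 1):

    from itertools import combinations

    x = 4                                    # number of mirrored pairs (assumed for illustration)
    disks = range(2 * x)                     # disks 2i and 2i+1 form pair i
    failures = list(combinations(disks, 2))  # all ways two distinct disks can fail
    bad = sum(1 for a, b in failures if a // 2 == b // 2)   # both failures hit the same pair
    print(bad / len(failures))               # probability some data is inaccessible
    print(1 / (2 * x - 1))                   # closed form: x / C(2x, 2)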

Binomial Probabilities

Sequences of Independent Trials
Suppose we have a biased coin that lands heads with probability p. If we flip this coin 5 times, what is the probability that the outcome is HHTTH?
If we flip the coin 5 times, what is the probability that the outcome consists of 3 heads and 2 tails in any order?

Binomial Probabilities
Consider a sequence of n independent trials, each with success probability p. The probability of any particular sequence with k successes is p^k (1 - p)^(n - k)
The probability of exactly k successful trials is (n choose k) p^k (1 - p)^(n - k)

Examples of Sequences of Independent Trials
A cell phone provider can handle up to r data requests at once. Assume that every minute, each of the provider's n customers makes a request with probability p, independent of the behavior of the other customers.
What is the probability that exactly x customers will make a data request during a particular minute?
What is the probability that > r customers will make a data request during a particular minute?
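Both questions are binomial probabilities. A small Python sketch, with illustrative values for n, p, r, and x (the slide keeps them symbolic):

    from math import comb

    n, p, r, x = 100, 0.05, 10, 7            # illustrative values only

    def binom_pmf(k, n, p):
        return comb(n, k) * p**k * (1 - p) ** (n - k)

    print(binom_pmf(x, n, p))                                    # P(exactly x requests)
    print(sum(binom_pmf(k, n, p) for k in range(r + 1, n + 1)))  # P(more than r requests)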

Random Variables

Random Variables
Random variable: a real-valued function of the outcome of an experiment... though not written as a function!
e.g., rolling two dice, Ω = {(1, 1), (1, 2), (1, 3), ..., (6, 6)}: examples of random variables include the sum of the two rolls, the number of 6s rolled, the product of the rolls, etc.
A random variable can map several outcomes to one value

Examples of Random Variables e.g., suppose we have 4 disks, each of which fails with probability p, Ω = {0000, 0001, 0010,..., 1111}: examples of random variables include the number of disks that failed, a boolean value indicating whether disks 1 or 2 failed, a boolean value indicating whether 3 or more disks failed, etc.

Random Variables and Events
Uppercase letters, e.g., X or Y, denote random variables, while lowercase letters, e.g., x or y, denote their values
Event {X = x} is the event consisting of all outcomes in Ω that are mapped to value x by random variable X
e.g., flipping two coins, Ω = {HH, HT, TH, TT}: if X is the number of heads flipped, what is the event {X = 1}?

Discrete vs. Continuous Random Variables
Discrete random variable: the set of values is finite or countably infinite, e.g., rolling two dice: the number of 6s rolled, the sum of the rolls, the maximum roll, etc.
Continuous random variable: the set of values is uncountably infinite, e.g., choosing a in [-1, 1]: a^2, a/2, a + 3.1, etc.

Probability Mass Functions

Probability Mass Functions
The probability mass function (PMF) of X is denoted by p_X
p_X(x) is the probability of the event {X = x}: p_X(x) = P(X = x) = P({X = x})
e.g., flipping two (fair) coins, Ω = {HH, HT, TH, TT}: if X is the number of heads flipped, what is p_X(1)?

Probability Mass Functions
If {X ∈ S} is the event consisting of all outcomes in Ω that are mapped to values x ∈ S by random variable X, then P(X ∈ S) = Σ_{x ∈ S} p_X(x) = Σ_{x ∈ S} P({X = x})
e.g., flipping two (fair) coins, Ω = {HH, HT, TH, TT}: if X is the number of heads flipped, what is P(X ≥ 1)?
e.g., what is Σ_x p_X(x) for any discrete random variable X?
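A brute-force check of both questions by enumerating Ω (a sketch, not from the slides):

    from itertools import product
    from collections import Counter

    outcomes = list(product("HT", repeat=2))        # Ω = {HH, HT, TH, TT}, all equally likely
    counts = Counter(seq.count("H") for seq in outcomes)
    p_X = {x: c / len(outcomes) for x, c in counts.items()}   # PMF of X = number of heads

    print(p_X[1])                                   # p_X(1) = 1/2
    print(sum(p for x, p in p_X.items() if x >= 1)) # P(X >= 1) = 3/4
    print(sum(p_X.values()))                        # total probability is 1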

Functions of Random Variables
If X is a random variable and g(·) is some linear or nonlinear function, then Y = g(X) is another random variable
If X is discrete then Y is discrete with PMF p_Y(y) = Σ_{x : g(x) = y} p_X(x)
e.g., Y = |X| and p_X(x) = 1/9 if x is an integer in the range [-4, 4] and 0 otherwise: what do p_X and p_Y look like?
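Here is how p_Y could be computed from p_X for this example (a sketch, assuming the uniform PMF on the integers -4, ..., 4 and g(x) = |x| as above):

    p_X = {x: 1 / 9 for x in range(-4, 5)}    # uniform on the integers -4, ..., 4

    p_Y = {}
    for x, p in p_X.items():
        y = abs(x)                            # g(x) = |x|
        p_Y[y] = p_Y.get(y, 0) + p            # sum p_X(x) over all x with g(x) = y

    print(p_Y)   # p_Y(0) = 1/9 and p_Y(y) = 2/9 for y = 1, 2, 3, 4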

Common Discrete Random Variables

Discrete Uniform Random Variables
A discrete uniform random variable X with range [a, b] takes on any integer value between a and b inclusive
The PMF of a discrete uniform random variable X is p_X(k) = 1 / (b - a + 1) for k = a, a + 1, ..., b
Used to model probabilistic situations with b - a + 1 equally likely outcomes (a, a + 1, ..., b), e.g., rolling a die
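For example, a single die roll is discrete uniform with a = 1, b = 6; a quick check:

    a, b = 1, 6                                          # a fair die roll
    p_X = {k: 1 / (b - a + 1) for k in range(a, b + 1)}
    print(p_X[3])                                        # 1/6
    print(sum(p_X.values()))                             # sums to 1 (up to floating point)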

Bernoulli Random Variables
Imagine a biased coin that lands heads (success) with probability p. X is a Bernoulli random variable if X is equal to 1 if the coin is heads (success) and 0 if it is tails (failure)
The PMF of a Bernoulli random variable X is p_X(k) = p if k = 1 and p_X(k) = 1 - p if k = 0
Used to model probabilistic situations with two outcomes, e.g., whether a server is online or an email is spam, etc.

Binomial Random Variables
Suppose we flip a biased coin n times. X is a binomial random variable if X is equal to the number of heads (successes)
The PMF of a binomial random variable X is p_X(k) = (n choose k) p^k (1 - p)^(n - k) for k = 0, 1, 2, ..., n
X is the sum of n independent Bernoulli random variables, e.g., number of servers (out of n) that are online, etc.
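A short simulation illustrating the "sum of n independent Bernoulli variables" view, comparing an empirical frequency to the binomial PMF (n, p, and k are arbitrary illustrative values):

    import random
    from math import comb

    n, p, k, trials = 10, 0.3, 4, 100_000
    random.seed(0)

    hits = sum(
        1 for _ in range(trials)
        if sum(random.random() < p for _ in range(n)) == k   # X = sum of n Bernoulli(p) draws
    )
    print(hits / trials)                            # empirical estimate of p_X(k)
    print(comb(n, k) * p**k * (1 - p) ** (n - k))   # exact binomial PMF at k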

Examples of Binomial Random Variables e.g., you go to a party with 500 guests. What is the probability that one other guest has the same birthday as you?
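Assuming 365 equally likely birthdays, independence between guests, and ignoring leap years, the number of guests sharing your birthday is binomial with n = 500 and p = 1/365; a quick computation covering both readings of the question:

    from math import comb

    n, p = 500, 1 / 365
    print(comb(n, 1) * p * (1 - p) ** (n - 1))   # P(exactly one guest matches), about 0.348
    print(1 - (1 - p) ** n)                      # P(at least one guest matches), about 0.746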

[Got to here in class...]

Geometric Random Variables
Suppose we flip a biased coin repeatedly until it lands heads (success). X is a geometric random variable if X is equal to the number of times that we have to flip the coin
The PMF of a geometric random variable X is p_X(k) = (1 - p)^(k - 1) p for k = 1, 2, 3, ...
Used to model the number of repeated independent trials up to (and including) the first successful trial, e.g., spam

Examples of Geometric Random Variables e.g., you have 5 keys for your new apartment but you don't know which key opens the front door. What is the PMF of the number of trials needed to open the front door assuming that at each trial you are equally likely to choose any of the 5 keys?
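Since keys are chosen with replacement, each trial independently succeeds with probability p = 1/5, so the number of trials is geometric; a small check:

    p = 1 / 5                                  # each attempt picks the right key w.p. 1/5

    def p_X(k):
        return (1 - p) ** (k - 1) * p          # geometric PMF

    for k in range(1, 6):
        print(k, p_X(k))
    print(sum(p_X(k) for k in range(1, 200)))  # sums to (essentially) 1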

Poisson Random Variables
The PMF of a Poisson random variable X is p_X(k) = e^(-λ) λ^k / k! for k = 0, 1, 2, ...
A Poisson PMF with λ = np is a good approximation for a binomial PMF with very small p and very large n if k ≪ n
e.g., the number of typos in a book with n words, number of cars (out of n) that crash in a city on a given day, etc.
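A numerical look at the approximation, with illustrative values n = 1000 and p = 0.002 so that λ = np = 2:

    from math import comb, exp, factorial

    n, p = 1000, 0.002
    lam = n * p                                          # λ = np = 2

    for k in range(5):
        binom = comb(n, k) * p**k * (1 - p) ** (n - k)   # exact binomial PMF
        poisson = exp(-lam) * lam**k / factorial(k)      # Poisson approximation
        print(k, round(binom, 5), round(poisson, 5))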

Examples of Poisson Random Variables e.g., you go to a party with 500 guests. What is the probability that one other guest has the same birthday as you?
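With the same assumptions as before (365 equally likely birthdays, independent guests), λ = np = 500/365; the Poisson approximation comes out very close to the exact binomial answer:

    from math import comb, exp

    n, p = 500, 1 / 365
    lam = n * p
    print(lam * exp(-lam))                         # Poisson P(X = 1), about 0.348
    print(1 - exp(-lam))                           # Poisson P(X >= 1), about 0.746
    print(comb(n, 1) * p * (1 - p) ** (n - 1))     # exact binomial P(X = 1), for comparison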

For Next Time
Read B&T 2.4, 2.5
Check the course website: http://www.cs.umass.edu/~wallach/courses/s12/cmpsci240/
First homework is due on Friday