Methods of Mathematics


Methods of Mathematics Kenneth A. Ribet UC Berkeley Math 10B February 23, 2016

Office hours: Monday 2:10-3:10 and Thursday 10:30-11:30 in Evans; Tuesday 10:30-noon at the SLC.

Breakfasts: Our next breakfasts will be on Thursday, March 3 and on March 18, both at 8AM. Send email if you'd like to come.

Crossroads lunch: About a month ago, one of the students suggested a Crossroads lunch on Thursday, January 25 at 12:15PM. See you there!

Faculty Club lunch: Our next pop-up Faculty Club lunch will be at 12:30PM on Friday, February 26. So that I can keep track of the size of the group, I recommend that you let me know ahead of time if you plan on coming. However, you can just show up! All are welcome.

Discrete probability distributions: We are interested in a random variable X : Ω → R, where Ω is a finite or countably infinite set. In the most interesting examples, the values of X are natural numbers (non-negative integers). Concretely, we can let Ω be the set of outcomes of the experiment of flipping a coin until H comes up, so Ω consists of H, TH, TTH, TTTH, etc. Last time, we introduced the random variable X : T^n H ↦ n + 1 that takes each string to its length. The possible values of X are the positive integers 1, 2, 3, ....

If X is a random variable and x is a value of X, we define P(X = x) := Σ_{ω : X(ω) = x} P(ω). Said somewhat differently, x ∈ R corresponds to an event A ⊆ Ω, namely A = { ω ∈ Ω : X(ω) = x }. Then P(X = x) := P(A).

The theme of this class meeting is that the set of numbers P(X = x_i) can be interesting! This set of numbers is called a probability distribution. The sum of the numbers of a probability distribution is 1, as you can check easily. Today we will focus on examples of probability distributions.

Geometric distribution: Let Ω = { T^n H : n ≥ 0 } be the set of outcomes when we flip a possibly biased coin until a head comes up. Define X(T^n H) = n; thus X (for today) is the number of tails before we get the head. (On Thursday, we looked at the random variable X + 1, which indicates the number of tosses, including the head toss.) Then P(X = n) = q^n p, where q is the probability of getting a T when we flip the coin and p = 1 − q is the probability of getting an H. Note here that the possible values of n are 0, 1, 2, ....
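A quick numerical check of the geometric distribution, using an illustrative bias p = 0.3 (this value is not from the lecture):

```python
# Geometric distribution P(X = n) = q^n * p: number of tails before the first head.
# Illustrative values p = 0.3, q = 0.7 (chosen for the example, not from the lecture).
p = 0.3
q = 1 - p

def geom_pmf(n):
    """Probability of seeing exactly n tails before the first head."""
    return q**n * p

# The probabilities over n = 0, 1, 2, ... sum to 1 (a geometric series: p / (1 - q) = 1).
total = sum(geom_pmf(n) for n in range(1000))
print(round(total, 10))  # -> 1.0
```

Truncating the infinite sum at n = 1000 is harmless here, since the tail q^1000 is vanishingly small.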

Bernoulli distribution We go back a few steps and imagine flipping the coin only once. The possible outcomes are T and H; thus Ω is the two-element set { T, H }. Let X : Ω R be defined by X(T) = 0, X(H) = 1. Then the values of X are 0 and 1, and they occur with probabilities q and p, respectively (same notation as for the geometric distribution).

Bernoulli trials: Imagine pulling balls out of a box that contains red and black balls. Suppose that 70% of the balls are red. If you pull a ball out of the box at random, the probability that it's red is 0.70. If you pull a ball out, examine it and record the color, and then replace the ball, you return the situation to its initial state. If you repeat the exercise again and again, you're basically flipping a biased coin. To do something repeatedly when there are two outcomes and the probabilities of the two outcomes do not change is to engage in Bernoulli trials.

Binomial distribution: Now imagine that we flip a biased coin n times and record the results. Then Ω has 2^n elements; each element is a string of length n made from the characters T and H. For ω ∈ Ω, let X(ω) be the number of Hs that occur in ω. For example, X(THTTHHTHTH) = 5 (if I counted correctly). Then P(X = k) is the product of p^k q^(n−k) with the number of strings that have k Hs and (n − k) Ts. That number is the binomial coefficient C(n, k), so P(X = k) = C(n, k) p^k q^(n−k). (The sum of these numbers is (p + q)^n = 1^n = 1.)
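The binomial formula can be checked directly; here is a sketch with illustrative values n = 10, p = 0.6 (not from the lecture):

```python
from math import comb

# Binomial distribution: P(X = k) = C(n, k) p^k q^(n-k), the chance of k heads in n flips.
# Illustrative values n = 10, p = 0.6 (chosen for the example, not from the lecture).
n, p = 10, 0.6
q = 1 - p

def binom_pmf(k):
    return comb(n, k) * p**k * q**(n - k)

# Sanity check from the slide: the probabilities sum to (p + q)^n = 1.
total = sum(binom_pmf(k) for k in range(n + 1))
print(round(total, 12))  # -> 1.0
```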

Hypergeometric distribution: Imagine now that our box of balls has m red balls and N − m black balls; thus it has N balls in total. Suppose that we pull out n balls (sequentially or all at once) and ask how many of these balls are red. Let X = the number of red balls. Then P(X = k) = C(m, k) C(N − m, n − k) / C(N, n), for k = 0, ..., n. Analogue: Instead of a box of balls, we could have a pack of 52 cards, 26 of which are red and 26 of which are black. We could take n = 5; then we're choosing a poker hand and asking how many cards in the hand are red.
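The card analogue on the slide can be computed directly from the formula; a minimal sketch:

```python
from math import comb

# Hypergeometric distribution: P(X = k) = C(m, k) C(N-m, n-k) / C(N, n).
# The slide's card example: N = 52 cards, m = 26 red, draw a hand of n = 5.
N, m, n = 52, 26, 5

def hyper_pmf(k):
    """Probability that exactly k of the n drawn cards are red."""
    return comb(m, k) * comb(N - m, n - k) / comb(N, n)

for k in range(n + 1):
    print(k, round(hyper_pmf(k), 4))
```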

Sanity check: If the previous slide is right, then the sum of the probabilities has to be 1. This means: Σ_{k=0}^{n} C(m, k) C(N − m, n − k) = C(N, n), as long as n and m are no bigger than N. Can you think of a combinatorial proof of this identity?
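The identity (Vandermonde's identity) can also be checked numerically for a few choices of N, m, n (the particular triples below are illustrative, not from the lecture):

```python
from math import comb

# Numerical check of the identity
#   sum_{k=0}^{n} C(m, k) C(N-m, n-k) = C(N, n).
# Note: Python's comb(m, k) returns 0 when k > m, matching the convention
# that out-of-range binomial coefficients vanish.
def identity_holds(N, m, n):
    lhs = sum(comb(m, k) * comb(N - m, n - k) for k in range(n + 1))
    return lhs == comb(N, n)

checks = [(10, 4, 3), (52, 26, 5), (7, 7, 2), (12, 0, 5)]
print(all(identity_holds(N, m, n) for N, m, n in checks))  # -> True
```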

Poisson distribution: This is the first time that we try to impose a distribution on the values of a random variable without trying to prove that this distribution is correct through examination of Ω and the precise random variable X that is in question. The distribution is defined by P(X = k) = (λ^k / k!) e^(−λ), k = 0, 1, 2, ..., where λ is a parameter. Because of the factor e^(−λ), the sum of the probabilities is 1 (as it needs to be).

The interpretation of λ is that it is the average value of X. (On Thursday, we will use the term expected value!) The average is by definition the sum Σ_{k=0}^{∞} k P(X = k) = Σ_{k=0}^{∞} k (λ^k / k!) e^(−λ), which works out to be λ: the sum may be rewritten Σ_{k=1}^{∞} (λ^k / (k − 1)!) e^(−λ) = λ e^(−λ) Σ_{k=1}^{∞} λ^(k−1) / (k − 1)!, which becomes λ e^(−λ) Σ_{l=0}^{∞} λ^l / l! = λ e^(−λ) e^λ = λ.

Wikipedia examples If the average number of pieces of mail you receive per day is λ, then it is reasonable to expect that the number of pieces of mail that you receive per day is distributed according to the Poisson distribution. On a particular river, overflow floods occur once every 100 years on average. Calculate the probability of 0, 1, 2, 3, 4, 5, or 6 overflow floods in a 100-year interval, assuming the Poisson model is appropriate.
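The flood question can be answered directly from the formula: over a 100-year interval the average number of floods is 1, so λ = 1 and P(X = k) = e^(−1)/k!. A minimal computation:

```python
from math import exp, factorial

# The slide's flood example: floods average once per 100 years, so over a
# 100-year interval the Poisson parameter is lam = 1.
lam = 1.0
probs = [lam**k / factorial(k) * exp(-lam) for k in range(7)]
for k, prob in enumerate(probs):
    print(k, round(prob, 4))
```

Note that 0 floods and 1 flood are equally likely (each with probability e^(−1), about 0.37), since the k = 0 and k = 1 terms of the pmf coincide when λ = 1.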

Derivation of the Poisson model: Imagine that I receive λ pieces of mail per day on average and I want to estimate the probability of receiving k pieces of mail on a given day. My model is that all n people (including corporations, of course) in the US flip a biased coin every day to decide whether to send me mail. Each coin comes up H ("send Ribet mail") with probability p. On average, each person then sends me p pieces of mail per day, so (on average) I receive np pieces of mail per day all told. We conclude that p = λ/n (a minuscule number) because I started by saying that I receive λ pieces of mail per day on average. The probability that I receive k pieces of mail on a given day is then known from the binomial distribution.

Specifically, the probability that I receive k pieces of mail on a given day is C(n, k) p^k (1 − p)^(n−k) = (1/k!) λ^k (1 − λ/n)^n J, where J (for "junk") is the product 1 · (1 − 1/n)(1 − 2/n) ⋯ (1 − (k−1)/n) / (1 − λ/n)^k. We have J → 1 as n → ∞. The term (1 − λ/n)^n approaches e^(−λ) by first-semester calculus (Math 10A?). Hence as n → ∞, the probability that I receive k pieces of mail on a given day approaches λ^k e^(−λ) (1/k!), which is the Poisson number. Details via document camera or white board.
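The limit in this derivation can be watched numerically: with p = λ/n, the binomial probability of k successes approaches the Poisson probability as n grows. A sketch with illustrative values λ = 3, k = 2 (not from the lecture):

```python
from math import comb, exp, factorial

# Binomial(n, p = lam/n) probability of k successes vs. the Poisson limit
# (lam^k / k!) e^(-lam). Illustrative values lam = 3, k = 2 (not from the lecture).
lam, k = 3.0, 2

def binom_prob(n):
    p = lam / n
    return comb(n, k) * p**k * (1 - p)**(n - k)

poisson = lam**k / factorial(k) * exp(-lam)
for n in (10, 100, 10000):
    print(n, round(binom_prob(n), 6), "vs Poisson", round(poisson, 6))
```

The gap shrinks roughly like 1/n, mirroring the J → 1 and (1 − λ/n)^n → e^(−λ) steps above.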