CS70: Jean Walrand: Lecture 25.

Causality, Independence, Collisions and Collecting

1. Product Rule
2. Correlation and Causality
3. Independence
4. Mutual Independence
5. Birthdays
6. Checksums
7. Collecting Coupons

Product Rule

Recall the definition: Pr[B | A] = Pr[A ∩ B] / Pr[A]. Hence, Pr[A ∩ B] = Pr[A] Pr[B | A]. Consequently,

Pr[A ∩ B ∩ C] = Pr[(A ∩ B) ∩ C] = Pr[A ∩ B] Pr[C | A ∩ B] = Pr[A] Pr[B | A] Pr[C | A ∩ B].

Theorem (Product Rule). Let A_1, A_2, ..., A_n be events. Then

Pr[A_1 ∩ ⋯ ∩ A_n] = Pr[A_1] Pr[A_2 | A_1] ⋯ Pr[A_n | A_1 ∩ ⋯ ∩ A_{n−1}].

Proof: By induction. The result holds for n = 2. Assume it is true for n. Then,

Pr[A_1 ∩ ⋯ ∩ A_n ∩ A_{n+1}] = Pr[A_1 ∩ ⋯ ∩ A_n] Pr[A_{n+1} | A_1 ∩ ⋯ ∩ A_n]
= Pr[A_1] Pr[A_2 | A_1] ⋯ Pr[A_n | A_1 ∩ ⋯ ∩ A_{n−1}] Pr[A_{n+1} | A_1 ∩ ⋯ ∩ A_n],

so the result holds for n + 1.

Correlation

An example. Random experiment: pick a person at random.
Event A: the person has lung cancer.
Event B: the person is a heavy smoker.
Fact: Pr[A | B] = 1.17 Pr[A].
Conclusion: Smoking increases the probability of lung cancer by 17%. Smoking causes lung cancer.

A second look. Note that

Pr[A | B] = 1.17 Pr[A]
⇔ Pr[A ∩ B] / Pr[B] = 1.17 Pr[A]
⇔ Pr[A ∩ B] = 1.17 Pr[A] Pr[B]
⇔ Pr[B | A] = 1.17 Pr[B].

Conclusion: Lung cancer increases the probability of smoking by 17%. Lung cancer causes smoking. Really?

Causality vs. Correlation

Events A and B are positively correlated if Pr[A ∩ B] > Pr[A] Pr[B]. (E.g., smoking and lung cancer.) A and B being positively correlated does not mean that A causes B or that B causes A. Other examples: Tesla owners are more likely to be rich; that does not mean that poor people should buy a Tesla to get rich. People who go to the opera are more likely to have a good career; that does not mean that going to the opera will improve your career. Rabbits eat more carrots and do not wear glasses. Are carrots good for eyesight?
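The product rule can be sanity-checked by brute-force enumeration over a small uniform sample space. The following is an illustrative sketch (the events A, B, C below are chosen for the example, not taken from the lecture):

```python
from fractions import Fraction
from itertools import product

# Sample space: three fair coin flips, all 8 outcomes equally likely.
omega = set(product("HT", repeat=3))

def pr(event):
    """Probability of an event (a set of outcomes) under the uniform measure."""
    return Fraction(len(event), len(omega))

def pr_given(event, cond):
    """Conditional probability Pr[event | cond] = |event ∩ cond| / |cond|."""
    return Fraction(len(event & cond), len(cond))

A = {w for w in omega if w[0] == "H"}        # first coin is H
B = {w for w in omega if w[1] == "H"}        # second coin is H
C = {w for w in omega if w.count("H") >= 2}  # at least two heads

# Product rule: Pr[A ∩ B ∩ C] = Pr[A] Pr[B | A] Pr[C | A ∩ B].
lhs = pr(A & B & C)
rhs = pr(A) * pr_given(B, A) * pr_given(C, A & B)
assert lhs == rhs
print(lhs)  # 1/4
```

Using exact rationals (`Fraction`) makes the two sides compare exactly, with no floating-point slack.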
Proving Causality

Proving causality is generally difficult. One has to eliminate external causes of correlation and be able to test the cause/effect relationship (e.g., with randomized clinical trials). Some difficulties: A and B may be positively correlated because they have a common cause (e.g., being a rabbit). If B precedes A, then B is more likely to be the cause (e.g., smoking). However, they could have a common cause that induces B before A (e.g., smart ⇒ CS70 ⇒ Tesla). More about such questions later. For fun, check N. Taleb: Fooled by Randomness.

Independence

Recall: A and B are independent ⇔ Pr[A ∩ B] = Pr[A] Pr[B] ⇔ Pr[A | B] = Pr[A]. The intuition is that A does not say anything about B. This intuition is a bit misleading, as the next example shows.

Example 1. Flip two fair coins. Let
A = first coin is H = {HT, HH};
B = second coin is H = {TH, HH};
C = the two coins are different = {TH, HT}.
Then A, C are independent and B, C are independent, but A ∩ B and C are not independent. (Pr[A ∩ B ∩ C] = 0 ≠ Pr[A ∩ B] Pr[C].) If A did not say anything about C and B did not say anything about C, one would expect A ∩ B not to say anything about C. Yet it does.

Mutual Independence

Flip a fair coin 5 times. Let A_n = coin n is H, for n = 1, ..., 5. Then A_m, A_n are independent for all m ≠ n. Also, A_1 and A_3 ∩ A_5 are independent. Indeed,

Pr[A_1 ∩ (A_3 ∩ A_5)] = 1/8 = Pr[A_1] Pr[A_3 ∩ A_5].

Similarly, A_1 ∩ A_2 and A_3 ∩ A_4 ∩ A_5 are independent. This leads to a definition...

Definition (Mutual Independence).
(a) The events A_1, ..., A_5 are mutually independent if

Pr[∩_{k ∈ K} A_k] = Π_{k ∈ K} Pr[A_k], for all K ⊆ {1, ..., 5}.

(b) More generally, the events {A_j, j ∈ J} are mutually independent if

Pr[∩_{k ∈ K} A_k] = Π_{k ∈ K} Pr[A_k], for all finite K ⊆ J.

Example: Flip a fair coin forever. Let A_n = coin n is H. Then the events A_n are mutually independent.

Theorem. (a) If the events {A_j, j ∈ J} are mutually independent and if K_1 and K_2 are disjoint finite subsets of J, then ∩_{k ∈ K_1} A_k and ∩_{k ∈ K_2} A_k are independent.
(b) More generally, if the K_n are pairwise disjoint finite subsets of J, then the events ∩_{k ∈ K_n} A_k are mutually independent. (c) The same is true if we replace some of the A_k by their complements Ā_k. Proof: See homework.
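The two-coin example (pairwise independent events that are not mutually independent) can be verified by enumerating the four outcomes. A small illustrative sketch:

```python
from fractions import Fraction
from itertools import product

omega = set(product("HT", repeat=2))  # {HH, HT, TH, TT}, uniform

def pr(event):
    """Probability of an event (a set of outcomes) under the uniform measure."""
    return Fraction(len(event), len(omega))

A = {w for w in omega if w[0] == "H"}   # first coin is H
B = {w for w in omega if w[1] == "H"}   # second coin is H
C = {w for w in omega if w[0] != w[1]}  # the two coins differ

# Pairwise independence holds for every pair:
assert pr(A & C) == pr(A) * pr(C)
assert pr(B & C) == pr(B) * pr(C)
assert pr(A & B) == pr(A) * pr(B)

# ...but mutual independence fails: Pr[A ∩ B ∩ C] = 0 ≠ Pr[A ∩ B] Pr[C].
assert pr(A & B & C) == 0
assert pr(A & B & C) != pr(A & B) * pr(C)
print("pairwise independent, not mutually independent")
```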
Balls in Bins

One throws m balls into n > m bins.

Theorem: Pr[no collision] ≈ exp{−m²/(2n)}, for large enough n.

In particular, Pr[no collision] ≤ 1/2 for m²/(2n) ≥ ln(2), i.e., for

m ≥ √(2 ln(2) n) ≈ 1.2 √n.

Roughly, Pr[collision] ≈ 1/2 for m = √n.

The Calculation. Let A_i = no collision when the i-th ball is placed in a bin. Then

Pr[A_i | A_{i−1} ∩ ⋯ ∩ A_1] = 1 − (i−1)/n,

and {no collision} = A_1 ∩ ⋯ ∩ A_m. The product rule Pr[A_1 ∩ ⋯ ∩ A_m] = Pr[A_1] Pr[A_2 | A_1] ⋯ Pr[A_m | A_1 ∩ ⋯ ∩ A_{m−1}] gives

Pr[no collision] = (1 − 1/n)(1 − 2/n) ⋯ (1 − (m−1)/n).

Hence,

ln(Pr[no collision]) = Σ_{k=1}^{m−1} ln(1 − k/n)
≈ −Σ_{k=1}^{m−1} k/n    (using ln(1 − ε) ≈ −ε for ε ≪ 1)
= −(1/n) · (m−1)m/2    (sum of consecutive integers: Σ_{k=1}^{m−1} k = (m−1)m/2)
≈ −m²/(2n).

Approximation: exp{−x} = 1 − x + x²/2! − ⋯ ≈ 1 − x for x ≪ 1. Hence, −x ≈ ln(1 − x) for x ≪ 1.
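The exact product and the exponential approximation can be compared numerically. An illustrative sketch, using the birthday setting n = 365 and m = 23:

```python
import math

def pr_no_collision(m, n):
    """Exact Pr[no collision]: product of (1 - k/n) for k = 0, ..., m-1."""
    p = 1.0
    for k in range(m):
        p *= 1.0 - k / n
    return p

n, m = 365, 23
exact = pr_no_collision(m, n)          # exact product from the product rule
approx = math.exp(-m * m / (2 * n))    # exp{-m^2/(2n)} approximation
print(f"exact = {exact:.4f}, exp(-m^2/2n) = {approx:.4f}")
```

The two values agree to within about one percentage point here, and the agreement improves as n grows.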
Today's your birthday, it's my birthday too...

Probability that two of m people have the same birthday? This is balls in bins with n = 365. One finds Pr[collision] ≈ 1/2 if m ≈ 1.2 √365 ≈ 23. If m = 60, we find that

Pr[no collision] ≈ exp{−m²/(2n)} = exp{−60²/730} ≈ exp{−4.9} ≈ 0.007.

If m = 366, then Pr[no collision] = 0. (No approximation here!)

Checksums!

Given two random files, what are the odds they have the same checksum? Let n = 2^b be the number of possible b-bit checksums and let m be the number of files. How big should b be to avoid any collisions?

Claim: for b-bit checksums of m files, Pr[collision] ≤ 2^{−10} ≈ 0.1% if b ≥ 2 log₂(m) + 9 ≈ 2.9 ln(m) + 9. E.g., for m = 2^47 ≈ 1.4 × 10^14 files, a 103-bit checksum suffices!

Derivation: b bits give n = 2^b bins, and

Pr[collision] = 1 − Pr[no collision] ≈ 1 − exp{−m²/(2n)} ≤ m²/(2n).

So Pr[collision] ≤ 2^{−10} when m²/(2n) ≤ 2^{−10}, i.e., n = 2^b ≥ 2^9 m², i.e.,

b ≥ 2 log₂(m) + 9 = (2/ln(2)) ln(m) + 9 ≈ 2.9 ln(m) + 9.

(Here we used log₂(x) = ln(x)/ln(2). Indeed, ln(x) = log₂(x) ln(2) since e^{log₂(x) ln(2)} = [e^{ln(2)}]^{log₂(x)} = 2^{log₂(x)} = x.)

An aside: secure checksums, i.e., random plus security. Want: one cannot find two files that hash to the same bucket, so that the checksum of a virus program differs from the checksum of the real program. MD5: broken. SHA-1: broken. SHA-2: safe so far... (See CS161.)

Coupon Collector

Problem: there are n different baseball cards (Brian Wilson, Jackie Robinson, Rogers Hornsby, ...), and each cereal box contains one random card. How many cereal boxes do you have to buy to get the Brian Wilson card with probability at least 1/2?
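The checksum-size bound can be computed directly. A minimal sketch (the helper name `checksum_bits` is an assumption for illustration, not a standard API):

```python
import math

def checksum_bits(m, p_collision=2**-10):
    """Smallest b with m^2 / (2 * 2^b) <= p_collision.

    Following the derivation above: b >= 2*log2(m) + log2(1/(2*p_collision)).
    For the default p_collision = 2^-10 this is b >= 2*log2(m) + 9.
    """
    return math.ceil(2 * math.log2(m) + math.log2(1 / (2 * p_collision)))

print(checksum_bits(2**47))  # 2*47 + 9 = 103 bits
```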
Coupon Collector

Event A_m = fail to get the Brian Wilson card in m cereal boxes. Each box fails to contain it with probability (1 − 1/n), independently of the others. Hence,

Pr[A_m] = (1 − 1/n) ⋯ (1 − 1/n) = (1 − 1/n)^m.

Analyze the expression: when is p_m = Pr[A_m] ≤ 1/2? Taylor's formula gives e^{−x} = 1 − x + x²/2! − x³/3! + ⋯ ≈ 1 − x when x is small. Hence,

p_m = (1 − 1/n)^m ≈ (e^{−1/n})^m = e^{−m/n}.

Solving e^{−m/n} = p_m gives m = n ln(1/p_m): after m = n ln(1/p_m) cards, we fail to get a Brian Wilson card with probability p_m. For p_m = 1/2, we need around m = n ln(2) ≈ 0.69 n boxes.

Collect all cards?

Experiment: choose m cards at random with replacement. Events: E_k = fail to get player k, for k = 1, ..., n. Probability of failing to get at least one of these n players:

p := Pr[E_1 ∪ E_2 ∪ ⋯ ∪ E_n].

How does one estimate p? Union bound:

p = Pr[E_1 ∪ E_2 ∪ ⋯ ∪ E_n] ≤ Pr[E_1] + Pr[E_2] + ⋯ + Pr[E_n].

Plugging in Pr[E_k] ≈ e^{−m/n}, k = 1, ..., n, we get

p ≤ n e^{−m/n}.

Thus, Pr[missing at least one card] ≤ n e^{−m/n}. Hence, Pr[missing at least one card] ≤ p when m ≥ n ln(n/p). To get p = 1/2, set m = n ln(2n). E.g., n = 10² ⇒ m ≈ 530; n = 10³ ⇒ m ≈ 7600.

Bittorrent: using codes! If a file is split into n pieces and each server has a random piece, one must ask n ln(2n) servers to get the file with probability 1/2. Better: use Reed-Solomon codes (or Tornado codes¹). Encode the n pieces into 2n pieces; any n of the encoded pieces suffice to recover the file (n + εn pieces with Tornado codes). How many requests to get n different pieces with failure probability at most p? The analysis is more complicated (random variables, distributions), but at most 2n ln(2) + O(√n) requests are good enough. Much better than n ln(2n): e.g., for n = 100, around a factor of 4 better!

¹ Linear time decoding!
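The m = n ln(n/p) threshold from the union bound can be computed and cross-checked by simulation. An illustrative sketch (the helper names are assumptions for the example):

```python
import math
import random

def boxes_needed(n, p=0.5):
    """Number of boxes m with n * exp(-m/n) <= p, i.e. m = n * ln(n/p)."""
    return math.ceil(n * math.log(n / p))

print(boxes_needed(100))   # n ln(2n): 100 * ln(200) -> 530
print(boxes_needed(1000))  # 1000 * ln(2000) -> 7601

def sim_miss_prob(n, m, trials=2000, seed=1):
    """Monte Carlo estimate of Pr[missing at least one of n cards in m boxes]."""
    rng = random.Random(seed)
    misses = 0
    for _ in range(trials):
        seen = {rng.randrange(n) for _ in range(m)}
        if len(seen) < n:
            misses += 1
    return misses / trials

# With m = n ln(2n) boxes, the miss probability should be at most ~1/2
# (the union bound is conservative, so the simulated value is lower).
assert sim_miss_prob(100, boxes_needed(100)) <= 0.5
```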
Summary

Causality, Independence, Collisions and Collecting.

Main results:
Product Rule: Pr[A_1 ∩ ⋯ ∩ A_n] = Pr[A_1] Pr[A_2 | A_1] ⋯ Pr[A_n | A_1 ∩ ⋯ ∩ A_{n−1}].
Correlation is not causality.
Balls in bins: m balls into n > m bins gives Pr[no collision] ≈ exp{−m²/(2n)}.
Coupon collection: n items, m cereal boxes: Pr[miss one specific item] ≈ e^{−m/n}; Pr[miss any one of the n items] ≤ n e^{−m/n}.

Key ideas: ln(1 − ε) ≈ −ε; e^{−ε} ≈ 1 − ε; product rule; union bound.
CS 124 Math Review Section CS 124 is more math intensive than most of the introductory courses in the department. You re going to need to be able to do two things: 1. Perform some clever calculations to
More informationDiscrete random variables
Discrete random variables The sample space associated with an experiment, together with a probability function defined on all its events, is a complete probabilistic description of that experiment Often
More informationUndergraduate Probability I. Class Notes for Engineering Students
Undergraduate Probability I Class Notes for Engineering Students Fall 2014 Contents I Probability Laws 1 1 Mathematical Review 2 1.1 Elementary Set Operations.................... 4 1.2 Additional Rules
More informationCS 473: Algorithms. Ruta Mehta. Spring University of Illinois, Urbana-Champaign. Ruta (UIUC) CS473 1 Spring / 32
CS 473: Algorithms Ruta Mehta University of Illinois, Urbana-Champaign Spring 2018 Ruta (UIUC) CS473 1 Spring 2018 1 / 32 CS 473: Algorithms, Spring 2018 Universal Hashing Lecture 10 Feb 15, 2018 Most
More informationDiscrete Mathematics for CS Spring 2006 Vazirani Lecture 22
CS 70 Discrete Mathematics for CS Spring 2006 Vazirani Lecture 22 Random Variables and Expectation Question: The homeworks of 20 students are collected in, randomly shuffled and returned to the students.
More informationCS 151 Complexity Theory Spring Solution Set 5
CS 151 Complexity Theory Spring 2017 Solution Set 5 Posted: May 17 Chris Umans 1. We are given a Boolean circuit C on n variables x 1, x 2,..., x n with m, and gates. Our 3-CNF formula will have m auxiliary
More informationConditional Probability and Bayes
Conditional Probability and Bayes Chapter 2 Lecture 5 Yiren Ding Shanghai Qibao Dwight High School March 9, 2016 Yiren Ding Conditional Probability and Bayes 1 / 13 Outline 1 Independent Events Definition
More informationMath 243 Section 3.1 Introduction to Probability Lab
Math 243 Section 3.1 Introduction to Probability Lab Overview Why Study Probability? Outcomes, Events, Sample Space, Trials Probabilities and Complements (not) Theoretical vs. Empirical Probability The
More informationLecture 08: Poisson and More. Lisa Yan July 13, 2018
Lecture 08: Poisson and More Lisa Yan July 13, 2018 Announcements PS1: Grades out later today Solutions out after class today PS2 due today PS3 out today (due next Friday 7/20) 2 Midterm announcement Tuesday,
More informationProbability. Chapter 1 Probability. A Simple Example. Sample Space and Probability. Sample Space and Event. Sample Space (Two Dice) Probability
Probability Chapter 1 Probability 1.1 asic Concepts researcher claims that 10% of a large population have disease H. random sample of 100 people is taken from this population and examined. If 20 people
More informationProbabilistic Systems Analysis Spring 2018 Lecture 6. Random Variables: Probability Mass Function and Expectation
EE 178 Probabilistic Systems Analysis Spring 2018 Lecture 6 Random Variables: Probability Mass Function and Expectation Probability Mass Function When we introduce the basic probability model in Note 1,
More informationGraduate Algorithms CS F-02 Probabilistic Analysis
Graduate Algorithms CS673-20016F-02 Probabilistic Analysis David Galles Department of Computer Science University of San Francisco 02-0: Hiring Problem Need an office assistant Employment Agency sends
More informationBasic Combinatorics. Math 40210, Section 01 Fall Homework 8 Solutions
Basic Combinatorics Math 4010, Section 01 Fall 01 Homework 8 Solutions 1.8.1 1: K n has ( n edges, each one of which can be given one of two colors; so Kn has (n -edge-colorings. 1.8.1 3: Let χ : E(K k
More information