Data Structures and Algorithm Analysis (CSC317) Randomized algorithms


Hiring problem. We always want the best hire for a job! We use an employment agency that sends one candidate at a time; each day, we interview one candidate. We must decide immediately whether to hire the candidate, and if so, fire the previous one!

Hiring problem. Always want the best hire. Cost to interview (low); cost to fire/hire (expensive).

Hire-Assistant(n)
1. best = 0 // least qualified candidate
2. for i = 1 to n
3.     interview candidate i
4.     if candidate i better than best
5.         best = i
6.         hire candidate i
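The pseudocode translates directly into Python. A minimal sketch (the function name and score representation are illustrative, not from the slides): candidates are given as a list of scores, higher meaning better, and the returned indices are the candidates hired (each one fired when the next is hired).

```python
def hire_assistant(candidates):
    """Hire-Assistant: hire every candidate who beats the best seen so far."""
    best = float("-inf")   # score of the best candidate interviewed so far
    hired = []             # indices of candidates hired (all but the last get fired)
    for i, score in enumerate(candidates):
        # interview candidate i
        if score > best:   # candidate i is better than best: hire (and fire previous)
            best = score
            hired.append(i)
    return hired
```

For example, on an ascending list every candidate is hired, while on a descending list only the first is.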

Hiring problem. Always want the best hire; fire if a better candidate comes along. Cost to interview (low): c_i. Cost to fire/hire (expensive): c_h, with c_h > c_i. n = total number of candidates, m = total number hired. Total cost: O(?)

Hiring problem. Always want the best hire; fire if a better candidate comes along. Cost to interview (low): c_i. Cost to fire/hire (expensive): c_h, with c_h > c_i. n = total number of candidates, m = total number hired. A different type of cost: not run time, but the cost of hiring.

Hire-Assistant(n)
1. best = 0 // least qualified candidate
2. for i = 1 to n
3.     interview candidate i
4.     if candidate i better than best
5.         best = i
6.         hire candidate i
A different type of cost, but the flavor of max problems: which candidate is best/winning.

Hiring problem. Always want the best hire; fire if a better candidate comes along. Cost to interview (low): c_i. Cost to fire/hire (expensive): c_h, with c_h > c_i. n = total number of candidates, m = total number hired. Total cost: O(c_i n + c_h m). A different type of cost: not run time, but the cost of hiring.

Hiring problem. Always want the best hire; fire if a better candidate comes along. Cost to interview (low): c_i. Cost to fire/hire (expensive): c_h. n = total number of candidates, m = total number hired. When is this most expensive?

Hiring problem. Always want the best hire; fire if a better candidate comes along. Cost to interview (low): c_i. Cost to fire/hire (expensive): c_h. n = total number of candidates, m = total number hired. When is this most expensive? When candidates interview in reverse order, worst first.

Hiring problem
1. best = 0 // least qualified candidate
2. for i = worst_candidate to best_candidate
O(c_i n + c_h n). When is this most expensive? When candidates interview in reverse order, worst first: every candidate is hired.

Hiring problem
1. best = 0 // least qualified candidate
2. for i = worst_candidate to best_candidate
Since c_h > c_i: O(c_i n + c_h n) = O(c_h n). When is this most expensive? When candidates interview in reverse order, worst first.

Hiring problem. Always want the best hire; fire if a better candidate comes along. Cost to interview (low): c_i. Cost to fire/hire (expensive): c_h. n = total number of candidates, m = total number hired. When is this least expensive?

Hiring problem
1. best = 0 // least qualified candidate
2. for i = best_candidate to worst_candidate
O(c_i n + c_h). When is this least expensive? When candidates interview in order, best first: only the first candidate is hired.

Hiring problem
1. best = 0 // least qualified candidate
2. for i = best_candidate to worst_candidate
O(c_i n + c_h). When is this least expensive? When candidates interview in order, best first. But we don't know who is good or bad a priori.

Hiring problem. Cost to interview (low): c_i. Cost to fire/hire (expensive): c_h. n = total number of candidates, m = total number hired. Total cost: O(c_i n + c_h m). The c_h m term depends on the order of candidates! The c_i n term is constant no matter the order of candidates.

Hiring problem and randomized algorithms. O(c_i n + c_h m): the c_h m term depends on the order of candidates! The c_i n term is constant no matter the order. A randomized order can serve us well on average (we will show!)

Hiring problem and randomized algorithms. O(c_i n + c_h m) depends on the order of candidates! We can't assume in advance that the candidates arrive in random order; there might be bias. But we can add randomization to our algorithm, by asking the agency to send the names in advance and randomizing the order ourselves.

Hiring problem, randomized. We always want the best hire for a job! The employment agency sends a list of n candidates in advance. Each day, we choose a candidate at random from the list to interview. We do not rely on the agency to send candidates in random order; rather, we take control of the randomization in the algorithm.

Randomized-Hire-Assistant(n)
1. randomly permute the list of candidates
2. Hire-Assistant(n)
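The two steps can be sketched in Python, using the standard library's shuffle for the random permutation (the function name and the hire count it returns are illustrative, not from the slides):

```python
import random

def randomized_hire_assistant(candidates):
    """Randomized-Hire-Assistant: permute, then run Hire-Assistant; return #hires."""
    pool = list(candidates)
    random.shuffle(pool)        # step 1: randomly permute the list of candidates
    best = float("-inf")
    hires = 0
    for score in pool:          # step 2: Hire-Assistant on the permuted list
        if score > best:
            best = score
            hires += 1
    return hires
```

On the worst-case input order (worst candidate first) the deterministic algorithm hires all n candidates, but averaging the randomized version over many trials hires far fewer, which is the "well on average" claim above.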

Including random number generator What do we mean by randomly permute?

Including random number generator Random(a,b) Function that returns an integer between a and b, inclusive, where each integer has equal probability Most programs: pseudorandom-number generator

Including random number generator Random(a,b) Function that returns an integer between a and b, inclusive, where each integer has equal probability Random(0,1)?

Including random number generator. Random(a,b): function that returns an integer between a and b, inclusive, where each integer has equal probability. Random(0,1)? 0 with prob. 1/2, 1 with prob. 1/2.

Including random number generator. Random(a,b): function that returns an integer between a and b, inclusive, where each integer has equal probability. Random(3,7)? 3 with prob. 1/5, 4 with prob. 1/5, 5 with prob. 1/5, 6 with prob. 1/5, 7 with prob. 1/5. Like rolling a (7 - 3 + 1) = 5-sided die.
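In Python, a Random(a,b) with exactly these semantics already exists in the standard library: random.randint returns each integer in [a, b], inclusive, with equal probability (a pseudorandom-number generator, as noted above). A thin wrapper as a sketch:

```python
import random

def Random(a, b):
    """Return an integer in [a, b] inclusive, each with probability 1/(b-a+1)."""
    return random.randint(a, b)
```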

How do we randomly permute?

Random permutation
Randomize-in-place(A,n)
1. for i = 1 to n
2.     swap A[i] with A[Random(i,n)]
Example, after i = 1 (a random choice from A[i..n]):
A = 1 2 3 4 5 6
A = 3 2 1 4 5 6

Random permutation
Randomize-in-place(A,n)
1. for i = 1 to n
2.     swap A[i] with A[Random(i,n)]
Each step i swaps A[i] with a random choice from A[i..n]:
A = 1 2 3 4 5 6
A = 3 2 1 4 5 6
A = 3 6 1 4 5 2
...
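Randomize-in-place is the Fisher-Yates shuffle. A Python sketch, translated from the 1-indexed pseudocode above to 0-indexed lists:

```python
import random

def randomize_in_place(A):
    """Fisher-Yates shuffle: permute list A uniformly at random, in place."""
    n = len(A)
    for i in range(n):
        j = random.randint(i, n - 1)   # corresponds to Random(i, n) in the pseudocode
        A[i], A[j] = A[j], A[i]        # swap A[i] with a random element of A[i..n-1]
    return A
```

Note that the shuffled list is always a rearrangement of the original elements: nothing is added or lost.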

Probability review. Sample space S: the space of all possible outcomes (e.g., of a coin toss). Probability of each outcome: p(i) >= 0, and Σ_{i∈S} p(i) = 1. Example: coin toss. S = {H, T}, p(H) = p(T) = 1/2.

Probability review. Event: a subset of all possible outcomes that can happen; a subset E ⊆ S of the sample space. Probability of the event: Pr(E) = Σ_{i∈E} p(i). Example: a coin toss that gave heads, E = {H}.

Probability review. Event: a subset of all possible outcomes that can happen; a subset of the sample space. Example: rolling two dice. List all possible outcomes. List the event (the subset of outcomes) for which the sum of the dice is 7. What is the probability of the event that the sum of the dice is 7?

Probability review. Event: a subset of all possible outcomes that can happen; a subset of the sample space. Example: rolling two dice. All possible outcomes: S = {(1,1),(1,2),(1,3),...,(2,1),(2,2),(2,3),...,(6,6)}. The event for which the sum of the dice is 7: E = {(1,6),(6,1),(2,5),(5,2),(3,4),(4,3)}. Probability of the event that the sum of the dice is 7: 6/36 = 1/6.
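The enumeration above is easy to check by brute force. A small Python sketch using exact fractions:

```python
from fractions import Fraction
from itertools import product

# Sample space: all 36 ordered outcomes of rolling two dice.
S = list(product(range(1, 7), repeat=2))
# Event: the subset of outcomes whose sum is 7.
E = [(a, b) for a, b in S if a + b == 7]
# Each outcome has probability 1/36, so Pr(E) = |E| / |S|.
prob = Fraction(len(E), len(S))
```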

Probability review. Random variable: attaches a value to an outcome (or set of outcomes). Example: the value of one die. Example: the sum of two dice.

Probability review. Random variable: attaches a value to an outcome (or set of outcomes). CAPITAL: typically a capital letter (X) denotes the random variable. small: typically a lowercase letter (x) denotes a particular value the random variable takes. P(X = x): the probability that random variable X takes on value x.

Probability review. Random variable: attaches a value to an outcome (or set of outcomes). Example: r.v. X represents the sum of two dice. P(X = x): the probability that the sum of the two dice takes on value x. P(X = 7) = 6/36.

Probability review Expectation or Expected Value of Random variable: = Average

Probability review. Expectation or Expected Value of a Random variable: = Average. E[X] = Σ_x x P(X = x). Sum over each possible value, weighted by the probability of that value.

Probability review. Expectation or Expected Value of a Random variable: E[X] = Σ_x x P(X = x). Example: X represents the sum of two dice. E[X] = ?

Probability review. Expectation or Expected Value of a Random variable. Example: X represents the sum of two dice.
Sum 2: {(1,1)}; 3: {(1,2),(2,1)}; 4: {(1,3),(3,1),(2,2)}; 5: {(1,4),(4,1),(2,3),(3,2)}; 6: {(1,5),(5,1),(4,2),(2,4),(3,3)}; 7: {(1,6),(6,1),(5,2),(2,5),(3,4),(4,3)}; 8: {(2,6),(6,2),(5,3),(3,5),(4,4)}; 9: {(3,6),(6,3),(5,4),(4,5)}; 10: {(6,4),(4,6),(5,5)}; 11: {(6,5),(5,6)}; 12: {(6,6)}
E[X] = 2·(1/36) + 3·(2/36) + 4·(3/36) + 5·(4/36) + 6·(5/36) + 7·(6/36) + 8·(5/36) + 9·(4/36) + 10·(3/36) + 11·(2/36) + 12·(1/36) = 252/36 = 7
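The tabulation above can be verified mechanically. A Python sketch that groups the 36 outcomes by their sum and takes the probability-weighted average:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# counts[x] = number of outcomes (a, b) with a + b == x, for x = 2..12.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
# E[X] = sum over x of x * P(X = x), with P(X = x) = counts[x] / 36.
E_X = sum(Fraction(x * c, 36) for x, c in counts.items())
```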

Probability review. Expectation or Expected Value of a Random variable. Example: X represents the sum of two dice. E[X] = 2·(1/36) + 3·(2/36) + 4·(3/36) + 5·(4/36) + 6·(5/36) + 7·(6/36) + 8·(5/36) + 9·(4/36) + 10·(3/36) + 11·(2/36) + 12·(1/36) = 252/36 = 7. Easier way?

Probability review. Linearity of Expectation. If X_1, ..., X_n are random variables, then: E[Σ_{i=1}^n X_i] = Σ_{i=1}^n E[X_i]. Does not require independence of the random variables.

Probability review. Linearity of Expectation: E[Σ_{i=1}^n X_i] = Σ_{i=1}^n E[X_i]. Example: X_1, X_2 represent the value of each die. E[X_1] = E[X_2] = 1·(1/6) + 2·(1/6) + 3·(1/6) + 4·(1/6) + 5·(1/6) + 6·(1/6) = 21/6. E[X_1 + X_2] = E[X_1] + E[X_2] = 42/6 = 7.
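The "easier way" via linearity, as a Python sketch using exact fractions: compute E[X_1] for a single die, then add it to itself instead of enumerating all 36 two-dice outcomes.

```python
from fractions import Fraction

# E[X1] for one fair die: each value 1..6 has probability 1/6.
E_single = sum(Fraction(v, 6) for v in range(1, 7))   # 21/6 = 7/2
# E[X1 + X2] = E[X1] + E[X2] by linearity of expectation.
E_sum = E_single + E_single
```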

Probability review. Indicator Random Variable (indicating whether A occurs): X_A = I{A} = 1 if A occurs, 0 if A does not occur. Lemma: for an event A, E[X_A] = Pr{A}. Proof: E[X_A] = 1·Pr(A) + 0·(1 - Pr(A)) = Pr(A).

Probability review. Indicator Random Variable: X_H = I{H} = 1 if heads occurs, 0 if heads does not occur. Example: expected number of heads when flipping a fair coin once. E[X_H] = 0·(1/2) + 1·(1/2) = 1/2 = Pr(H).

Probability review. Indicator Random Variable & Linearity of Expectation. X_i = I{A_i} = 1 if A_i occurs, 0 if A_i does not occur, and X = Σ_{i=1}^n X_i. Then by linearity of expectation: E[X] = E[Σ_{i=1}^n X_i] = Σ_{i=1}^n E[X_i].

Probability review. Indicator Random Variable & Linearity of Expectation. Example: expected number of heads when flipping 3 fair coins. X_H = I{H} = 1 if heads occurs, 0 if heads does not occur. E[X] = E[X_1 + X_2 + X_3] = 3·E[X_H] = 3·(1/2) = 1.5.
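The indicator-variable answer can be cross-checked by enumerating all 8 equally likely outcomes of 3 coin flips. A Python sketch:

```python
from fractions import Fraction
from itertools import product

# All 2^3 = 8 outcomes of flipping 3 fair coins, each with probability 1/8.
outcomes = list(product("HT", repeat=3))
# E[number of heads] = (total heads over all outcomes) / (number of outcomes).
E_heads = Fraction(sum(o.count("H") for o in outcomes), len(outcomes))
```

This agrees with the linearity argument: 3 · (1/2) = 3/2.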