Lower bound for sorting/probability Distributions


Lower bound for sorting/probability Distributions CSE21 Winter 2017, Day 20 (B00), Day 14 (A00) March 3, 2017 http://vlsicad.ucsd.edu/courses/cse21-w17

Another application of counting lower bounds Sorting algorithm: performance was measured in terms of number of comparisons between list elements What's the fastest possible worst case for any sorting algorithm?

Another application of counting lower bounds Sorting algorithm: performance was measured in terms of number of comparisons between list elements What's the fastest possible worst case for any sorting algorithm? Tree diagram for a sorting algorithm represents possible comparisons we might have to do, based on relative sizes of elements. Sometimes called a decision tree

Another application of counting lower bounds Tree diagram for a sorting algorithm a, b, c distinct integers Rosen p. 761

Another application of counting lower bounds Sorting algorithm: performance was measured in terms of number of comparisons between list elements What's the fastest possible worst case for any sorting algorithm? Maximum (worst-case) number of comparisons for a sorting algorithm is the height of its tree diagram.

Another application of counting lower bounds How many leaves will there be in a decision tree that sorts n elements? A. 2^n B. log n C. n! D. C(n,2) E. None of the above.

Another application of counting lower bounds Sorting algorithm: performance was measured in terms of number of comparisons between list elements What's the fastest possible worst case for any sorting algorithm? max # of comparisons = height of tree diagram For any algorithm, what would be smallest possible height? What do we know about the tree? * Internal nodes correspond to comparisons. * Leaves correspond to possible input arrangements.

Another application of counting lower bounds Sorting algorithm: performance was measured in terms of number of comparisons between list elements What's the fastest possible worst case for any sorting algorithm? max # of comparisons = height of tree diagram For any algorithm, what would be smallest possible height? What do we know about the tree? * Internal nodes correspond to comparisons. * Leaves correspond to possible input arrangements: there are n! of them.

Another application of counting lower bounds How does height relate to number of leaves? Theorem: There are at most 2^h leaves in a binary tree with height h.

Another application of counting lower bounds What's the fastest possible worst case for any sorting algorithm? max # of comparisons = height of tree diagram Fastest possible worst case performance of sorting is log_2(n!)

Another application of counting lower bounds What's the fastest possible worst case for any sorting algorithm? log_2(n!) How big is that? log n! = Σ_{k=1}^{n} log k

Another application of counting lower bounds What's the fastest possible worst case for any sorting algorithm? log_2(n!) How big is that? log n! = Σ_{k=1}^{n} log k > ∫_1^n log x dx, and ∫_1^n log x dx = [x log x − x]_1^n = n log n − n + 1. Therefore log n! = Ω(n log n).
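The sum-vs-integral step is easy to spot-check numerically. A minimal sketch in Python (natural logarithms, matching the integral of log x):

```python
import math

# Check the slide's bound: log n! = sum_{k=1}^{n} log k > n log n - n + 1,
# where the right-hand side is the integral of log x from 1 to n.
for n in range(2, 200):
    lhs = sum(math.log(k) for k in range(1, n + 1))  # log(n!)
    rhs = n * math.log(n) - n + 1                    # integral lower bound
    assert lhs > rhs
print("log n! > n log n - n + 1 holds for n = 2..199")
```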

Another application of counting lower bounds What's the fastest possible worst case for any sorting algorithm? log_2(n!) Therefore, the best sorting algorithms will need Ω(n log n) comparisons in the worst case. It's impossible to have a comparison-based algorithm that does better than Merge Sort (in the worst case).
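As a concrete sanity check on this bound, the sketch below (a hypothetical comparison-counting merge sort, written just for this check) pairs the lower bound ⌈log₂(n!)⌉ with merge sort's measured worst case over all n! orderings of a small list:

```python
import itertools
import math

def merge_sort_comparisons(arr):
    """Merge sort that returns (sorted_list, number_of_comparisons)."""
    if len(arr) <= 1:
        return list(arr), 0
    mid = len(arr) // 2
    left, cl = merge_sort_comparisons(arr[:mid])
    right, cr = merge_sort_comparisons(arr[mid:])
    merged, comps = [], cl + cr
    i = j = 0
    while i < len(left) and j < len(right):
        comps += 1  # one comparison between list elements
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged, comps

n = 6
# Information-theoretic lower bound: ceil(log2(n!)) comparisons.
lower_bound = math.ceil(math.log2(math.factorial(n)))
# Worst case of merge sort over all n! input orders.
worst = max(merge_sort_comparisons(list(p))[1]
            for p in itertools.permutations(range(n)))
print(lower_bound, worst)  # 10 11
```

For n = 6 the lower bound is 10 while merge sort's worst case is 11, so at this size merge sort is within one comparison of the best any comparison-based algorithm could do.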

Probability Rosen p. 446, 453 Sample space, S: (finite or countable) set of possible outcomes. Probability distribution, p: assignment of probabilities to outcomes in S so that - 0 ≤ p(s) ≤ 1 for each s in S. - Sum of probabilities is 1: Σ_{s in S} p(s) = 1.

Probability Rosen p. 446, 453 Sample space, S: (finite or countable) set of possible outcomes. Probability distribution, p: assignment of probabilities to outcomes in S so that - 0 ≤ p(s) ≤ 1 for each s in S. - Sum of probabilities is 1: Σ_{s in S} p(s) = 1. Compare flipping a fair coin and a biased coin: A. Have different sample spaces. B. Have the same sample spaces but different probability distributions. C. Have the same sample space and same probability distributions.

Probability Rosen p. 446, 453-4 Sample space, S: (finite or countable) set of possible outcomes. Probability distribution, p: assignment of probabilities to outcomes in S so that - 0 ≤ p(s) ≤ 1 for each s in S. - Sum of probabilities is 1: Σ_{s in S} p(s) = 1. Event, E: subset of possible outcomes.
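Both defining conditions are mechanical to verify. A minimal sketch (the dictionaries and the 0.7 bias are invented for illustration) that also computes P(E) for an event E as a sum of outcome probabilities:

```python
# Fair and biased coins share the sample space {H, T} but differ in distribution.
fair   = {"H": 0.5, "T": 0.5}
biased = {"H": 0.7, "T": 0.3}   # hypothetical bias, invented for this example

def is_distribution(p):
    """Check 0 <= p(s) <= 1 for each outcome and that probabilities sum to 1."""
    return all(0 <= v <= 1 for v in p.values()) and abs(sum(p.values()) - 1) < 1e-9

def prob_event(p, event):
    """P(E) = sum of p(s) over outcomes s in the event E (a subset of S)."""
    return sum(p[s] for s in event)

print(is_distribution(fair), is_distribution(biased))  # True True
print(prob_event(biased, {"H"}))  # 0.7
```

This is also why the fair and biased coins have the same sample space but different probability distributions.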

Uniform distribution Rosen p. 454 For sample space S with n elements, uniform distribution assigns the probability 1/n to each element of S. When flipping a fair coin successively three times: A. The sample space is {H, T} B. The empty set is not an event. C. The event {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT} has probability less than 1. D. The uniform distribution assigns probability 1/8 to each outcome. E. None of the above.


Uniform distribution Rosen p. 454 For sample space S with n elements, uniform distribution assigns the probability 1/n to each element of S. When flipping a fair coin successively three times, what is the distribution of the number of Hs that appear? A. Uniform distribution. B. P( 0 H ) = P( 3 H ) = 3/8 and P( 1 H ) = P( 2 H ) = 1/8. C. P( 0 H ) = P( 1 H ) = 1/8 and P( 2 H ) = P( 3 H ) = 1/8. D. P( 0 H ) = P( 3 H ) = 1/8 and P( 1 H ) = P( 2 H ) = 3/8. E. None of the above.


Probability and counting If we start with the uniform distribution on a set S, then the probability of an event E is P(E) = |E| / |S|. When flipping n fair coins what is the probability of getting exactly k Hs? A. 1/n B. k/n C. 1/2^n D. C(n,k) / 2^n E. None of the above.
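The counting rule P(E) = |E| / |S| and the C(n,k)/2^n answer can be cross-checked by brute-force enumeration; a sketch with arbitrarily chosen n = 5, k = 2:

```python
import itertools
from math import comb

n, k = 5, 2
# Uniform distribution on S = all 2^n flip sequences; E = sequences with exactly k Hs.
S = list(itertools.product("HT", repeat=n))
E = [s for s in S if s.count("H") == k]
print(len(E) / len(S), comb(n, k) / 2**n)  # 0.3125 0.3125
```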

Binomial distribution When flipping n fair coins what is the probability of getting exactly k Hs? Possible coin toss sequences: { HH..HH, HH..HT, ..., TT..TH, TT..TT }

Binomial distribution When flipping n fair coins what is the probability of getting exactly k Hs? Possible coin toss sequences: { HH..HH, HH..HT, ..., TT..TH, TT..TT } What if the coin isn't fair?

Binomial distribution Bernoulli trial: a performance of an experiment with two possible outcomes. e.g. flipping a coin Binomial distribution: probability of exactly k successes in n independent Bernoulli trials, when probability of success is p. e.g. # Hs in n coin flips when probability of H is p What is this probability? Rosen p. 480 A. C(n,k) / 2^n B. p^k / 2^n C. C(n,k) p^k D. C(n,k) p^k (1−p)^(n−k) E. None of the above.
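The binomial probability C(n,k) p^k (1−p)^(n−k) can be checked against a weighted enumeration of all length-n outcome sequences; a sketch with invented values n = 4, p = 0.3:

```python
import itertools
from math import comb

def binomial_pmf(n, k, p):
    """P(exactly k successes in n independent Bernoulli(p) trials)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Brute-force check: weight each length-n H/T sequence by its probability
# p^(#H) * (1-p)^(#T), and sum over sequences with exactly k heads.
n, p = 4, 0.3   # hypothetical unfair coin, values invented for this example
for k in range(n + 1):
    brute = sum(p**s.count("H") * (1 - p)**s.count("T")
                for s in itertools.product("HT", repeat=n)
                if s.count("H") == k)
    assert abs(brute - binomial_pmf(n, k, p)) < 1e-12
print("binomial formula matches enumeration for n=4, p=0.3")
```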

Randomness in Computer Science When the input is random: data mining, elections, weather, stock prices, genetic markers, analyzing experimental data. When is analysis valid? When are we overfitting to available data?

Randomness in Computer Science When the desired output is random - picking a cryptographic key - performing a scientific simulation - programming the adversary in a computer game

The Monty Hall Puzzle Car hidden behind one of three doors. Goats hidden behind the other two. Player gets to choose a door. Host opens another door, reveals a goat. Player can choose whether to swap choice with other closed door or stay with original choice. What's the player's best strategy? A. Always swap. B. Always stay. C. Doesn't matter, it's 50/50.

Some history Puzzle introduced by Steve Selvin in 1975. Marilyn vos Savant was a prodigy with record scores on IQ tests who wrote an advice column. In 1990, a reader asked for the solution to the Monty Hall puzzle. After she published the (correct) answer, thousands of readers (including PhDs and even a professor of statistics) demanded that she correct her "mistake". She built a simulator to demonstrate the solution so they could see for themselves how it worked.

The Monty Hall Puzzle the solution Pick a door at random to start

The Monty Hall Puzzle the solution

The Monty Hall Puzzle the solution What's the probability of winning (C) if you always switch ("Y")? A. 1/3 B. 1/2 C. 2/3 D. 1 E. None of the above.


The Monty Hall Puzzle the solution What's the probability of winning (C) if you always stay ("N")? A. 1/3 B. 1/2 C. 2/3 D. 1 E. None of the above.


The Monty Hall Puzzle the solution What's wrong with the following argument? "It doesn't matter whether you stay or swap because the host opened one door to show a goat, so there are only two doors remaining, and both of them are equally likely to have the car because the prizes were placed behind the doors randomly at the start of the game."
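The flaw is that the two remaining doors are not equally likely, and exact enumeration makes this visible. A sketch that enumerates the nine equally likely (car position, first pick) pairs; note that the host's choice of which goat door to open never changes whether switching wins:

```python
from fractions import Fraction

def win_probability(switch):
    """Enumerate car position and initial pick: 9 equally likely pairs."""
    wins = total = 0
    for car in range(3):
        for pick in range(3):
            total += 1
            if switch:
                # Host opens a goat door, so switching wins exactly when
                # the first pick was wrong (the other closed door is the car).
                wins += (pick != car)
            else:
                wins += (pick == car)
    return Fraction(wins, total)

print(win_probability(True), win_probability(False))  # 2/3 1/3
```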

Conditional probabilities Probability of an event may change if we have additional information about outcomes. Suppose E and F are events, and P(F) > 0. Then P(E | F) = P(E ∩ F) / P(F), i.e., the conditional probability of E given F is the probability of E ∩ F rescaled by the probability of F. Rosen p. 456

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. A. They're equal. B. They're not equal. C. Not sure. Assume that each child being a boy or a girl is equally likely.

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. 1. Sample space Assume that each child being a boy or a girl is equally likely. 2. Initial distribution on the sample space 3. What events are we conditioning on?

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 1. Sample space Possible outcomes: {bb, bg, gb, gg} Order matters! 2. Initial distribution on the sample space 3. What events are we conditioning on?

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 1. Sample space Possible outcomes: {bb, bg, gb, gg} Order matters! 2. Initial distribution on the sample space Uniform distribution, each outcome has probability ¼. 3. What events are we conditioning on?

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on?

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on? A = { outcomes where oldest is a girl } B = { outcomes where two are girls }

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on? A = { outcomes where oldest is a girl } = { gg, gb } B = { outcomes where two are girls } = { gg }

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on? A = { outcomes where oldest is a girl } = { gg, gb } B = { outcomes where two are girls } = { gg } P(A) = ½ P(B) = ¼ = P(A ∩ B)

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on? A = { outcomes where oldest is a girl } = { gg, gb } B = { outcomes where two are girls } = { gg } P(A) = ½ P(B) = ¼ = P(A ∩ B) By conditional probability law: P(B | A) = P(A ∩ B) / P(A) = (1/4) / (1/2) = ½.

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. 1/2 The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 1. Sample space Possible outcomes: {bb, bg, gb, gg} Order matters! 2. Initial distribution on the sample space Uniform distribution, each outcome has probability ¼. 3. What events are we conditioning on?

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. 1/2 The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on? C = { outcomes where one is a boy } = { bb, bg, gb } D = { outcomes where two are boys } = { bb } P(C) = ¾ P(D) = ¼ = P(C ∩ D)

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. 1/2 The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on? C = { outcomes where one is a boy } = { bb, bg, gb } D = { outcomes where two are boys } = { bb } P(C) = ¾ P(D) = ¼ = P(C ∩ D) By conditional probability law: P(D | C) = P(C ∩ D) / P(C) = (1/4) / (3/4) = 1/3.
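Both computations can be replayed by enumerating the four equally likely (older, younger) outcomes; a sketch with a small hypothetical cond_prob helper:

```python
from fractions import Fraction
from itertools import product

S = list(product("bg", repeat=2))      # (older, younger), uniform: each 1/4

def cond_prob(B, A):
    """P(B | A) = P(A and B) / P(A), with the uniform distribution on S."""
    A_set = [s for s in S if A(s)]
    both  = [s for s in A_set if B(s)]
    return Fraction(len(both), len(A_set))

# Oldest is a girl -> both girls:
print(cond_prob(lambda s: s == ("g", "g"), lambda s: s[0] == "g"))  # 1/2
# At least one is a boy -> both boys:
print(cond_prob(lambda s: s == ("b", "b"), lambda s: "b" in s))     # 1/3
```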

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if know the oldest is a girl. 1/2 The probability that two siblings are boys if know that one of them is a boy. 1/3 Assume that each child being a boy or a girl is equally likely.

C. R. Charig, D. R. Webb, S. R. Payne, J. E. Wickham (29 March 1986). "Comparison of treatment of renal calculi by open surgery, percutaneous nephrolithotomy, and extracorporeal shockwave lithotripsy". Br Med J (Clin Res Ed) 292 (6524): 879-882. doi:10.1136/bmj.292.6524.879. PMC 1339981. PMID 3083922. cf. Wikipedia "Simpson's Paradox" Conditional probabilities: Simpson's Paradox Small stones: Treatment A 81 successes / 87 (93%), Treatment B 234 successes / 270 (87%). Large stones: Treatment A 192 successes / 263 (73%), Treatment B 55 successes / 80 (69%). Combined: Treatment A 273 successes / 350 (78%), Treatment B 289 successes / 350 (83%). Which treatment is better? A. Treatment A for all cases. B. Treatment B for all cases. C. A for small and B for large. D. A for large and B for small.

C. R. Charig, D. R. Webb, S. R. Payne, J. E. Wickham (29 March 1986). "Comparison of treatment of renal calculi by open surgery, percutaneous nephrolithotomy, and extracorporeal shockwave lithotripsy". Br Med J (Clin Res Ed) 292 (6524): 879-882. doi:10.1136/bmj.292.6524.879. PMC 1339981. PMID 3083922. cf. Wikipedia "Simpson's Paradox" Conditional probabilities: Simpson's Paradox Small stones: Treatment A 81 successes / 87 (93%), Treatment B 234 successes / 270 (87%). Large stones: Treatment A 192 successes / 263 (73%), Treatment B 55 successes / 80 (69%). Combined: Treatment A 273 successes / 350 (78%), Treatment B 289 successes / 350 (83%). Simpson's Paradox: "When the less effective treatment is applied more frequently to easier cases, it can appear to be a more effective treatment."
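The reversal can be reproduced straight from the success counts in the table; a sketch:

```python
# Success counts (successes, total) from the kidney-stone table above.
A = {"small": (81, 87),   "large": (192, 263)}
B = {"small": (234, 270), "large": (55, 80)}

def rate(succ, total):
    return succ / total

for size in ("small", "large"):
    assert rate(*A[size]) > rate(*B[size])   # A wins within each stone size...

totA = (81 + 192, 87 + 263)    # (273, 350)
totB = (234 + 55, 270 + 80)    # (289, 350)
assert rate(*totA) < rate(*totB)             # ...yet B wins on the combined data
print(round(rate(*totA), 2), round(rate(*totB), 2))  # 0.78 0.83
```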

Random Variables A random variable assigns a real number to each possible outcome of an experiment. Rosen p. 460, 478 The distribution of a random variable X is the function r ↦ P(X = r) The expectation (average, expected value) of random variable X on sample space S is E(X) = Σ_{s in S} p(s) X(s)

Expected Value Examples Rosen p. 460, 478 The expectation (average, expected value) of random variable X on sample space S is E(X) = Σ_{s in S} p(s) X(s) Calculate the expected number of boys in a family with two children. A. 0 B. 1 C. 1.5 D. 2

Expected Value Examples Rosen p. 460, 478 The expectation (average, expected value) of random variable X on sample space S is E(X) = Σ_{s in S} p(s) X(s) Calculate the expected number of boys in a family with three children. A. 0 B. 1 C. 1.5 D. 2

Expected Value Examples Rosen p. 460, 478 The expectation (average, expected value) of random variable X on sample space S is E(X) = Σ_{s in S} p(s) X(s) Calculate the expected number of boys in a family with three children. A. 0 B. 1 C. 1.5 D. 2 The expected value might not be a possible value of the random variable, like 1.5 boys!
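The 1.5 answer can be confirmed by enumerating the eight equally likely three-child families under the uniform distribution:

```python
from fractions import Fraction
from itertools import product

families = list(product("bg", repeat=3))   # 8 equally likely outcomes, each p = 1/8
# E(X) = sum over outcomes s of p(s) * X(s), where X(s) = number of boys in s.
E_boys = sum(Fraction(1, 8) * f.count("b") for f in families)
print(E_boys)  # 3/2
```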

Expected Value Examples Rosen p. 460, 478 The expectation (average, expected value) of random variable X on sample space S is E(X) = Σ_{s in S} p(s) X(s) Calculate the expected sum of two 6-sided dice. A. 6 B. 7 C. 8 D. 9 E. None of the above.
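This expectation can be worked out by enumerating the 36 equally likely ordered rolls:

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
# X(s) = sum of the two dice; E(X) = sum over outcomes of (1/36) * X(s).
E_sum = sum(Fraction(1, 36) * (a + b) for a, b in rolls)
print(E_sum)  # 7
```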