UCSD CSE 21, Spring 2014 [Section B00] Mathematics for Algorithm and System Analysis


UCSD CSE 21, Spring 2014 [Section B00] Mathematics for Algorithm and System Analysis Lecture 10 Class URL: http://vlsicad.ucsd.edu/courses/cse21-s14/

Lecture 10 Notes

Midterm
- Good job overall! Mean = 81; standard deviation = 18; range: 30-110
- Scores are up on TED; Quiz scores are also updated, including Week 4
- If you are somehow not able to see your grades on TED, please email ABK + David + Pete + Natalie
- Hand-back process:
  (1) Look at the solutions and grading rubric on the class website
  (2) Go to TA office hours to see your midterm
  (3) If you are satisfied, take your midterm
  (4) Otherwise, leave your midterm with a specific regrade request before 5/13
- If you take away your midterm, you cannot request a regrade later
- All remaining midterms will be handed back at the end of the 5/13 lecture

Goals for this week
- Linearity of expectation; joint distributions
- Decision trees; hashing example
- For the quiz: midterm solutions, mean/variance, today's material
- (Next week: conditional probability, Bayes' rule; induction and recursion)
- (We have caught up again to the poker chip summary)

Expectation, Variance

Random variable X = # heads seen in four tosses of a fair coin

Expectation: E(X) = Σ_{k=0..4} p_k · k

p_0 = P(X = 0) = 1/16    1/16 · 0 = 0
p_1 = P(X = 1) = 4/16    4/16 · 1 = 4/16
p_2 = P(X = 2) = 6/16    6/16 · 2 = 12/16
p_3 = P(X = 3) = 4/16    4/16 · 3 = 12/16
p_4 = P(X = 4) = 1/16    1/16 · 4 = 4/16
                         Sum: E(X) = 32/16 = 2

Expectation, Variance (Def. 8, FN-24)

Random variable X = # heads seen in four tosses of a fair coin

Var(X) captures how much X is likely to vary through its distribution.

Variance: σ_X² = E(X²) − (E(X))² = 5 − 2² = 1

p_0 = P(X = 0) = 1/16    1/16 · 0² = 0
p_1 = P(X = 1) = 4/16    4/16 · 1² = 4/16
p_2 = P(X = 2) = 6/16    6/16 · 2² = 24/16
p_3 = P(X = 3) = 4/16    4/16 · 3² = 36/16
p_4 = P(X = 4) = 1/16    1/16 · 4² = 16/16
                         Sum: E(X²) = 80/16 = 5
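The two tables above can be reproduced with a short enumeration of the binomial PMF. This is an illustrative sketch; the variable names are my own, not from the course notes:

```python
from math import comb

# PMF of X = number of heads in four fair coin tosses: p_k = C(4, k) / 16
pmf = {k: comb(4, k) / 16 for k in range(5)}

EX = sum(k * p for k, p in pmf.items())       # E(X)  = 32/16 = 2
EX2 = sum(k * k * p for k, p in pmf.items())  # E(X^2) = 80/16 = 5
var = EX2 - EX ** 2                           # Var(X) = E(X^2) - E(X)^2 = 1

print(EX, EX2, var)  # 2.0 5.0 1.0
```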

FN Theorem 3: Linearity of Expectation

If X and Y are random variables on the same sample space and a and b are real numbers, then:
E(aX) = a·E(X)
E(aX + bY) = a·E(X) + b·E(Y)

This holds regardless of whether X and Y are independent.

Example: Roll two dice. What is the expected value of their sum?

Expected Sum of Two Rolled Dice

Define r.v.'s:
X = value of the first die's roll
Y = value of the second die's roll
Z = sum of the two values: Z = X + Y

By linearity of expectation: E(Z) = E(X) + E(Y) = 7/2 + 7/2 = 7
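Since the sample space has only 36 equally likely outcomes, linearity can be checked by direct enumeration. A minimal sketch (names are illustrative):

```python
from itertools import product

# All 36 equally likely outcomes of rolling two fair dice
outcomes = list(product(range(1, 7), repeat=2))

EX = sum(x for x, _ in outcomes) / 36      # E(X) = 3.5
EY = sum(y for _, y in outcomes) / 36      # E(Y) = 3.5
EZ = sum(x + y for x, y in outcomes) / 36  # E(Z) = E(X) + E(Y) = 7

print(EX, EY, EZ)  # 3.5 3.5 7.0
```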

Thank-You Notes

A child has just had a birthday party. He has written 20 thank-you notes and addressed 20 corresponding envelopes. But then he gets distracted and puts the letters randomly into the envelopes (one letter per envelope). What is the expected number of letters that are in their correct envelopes?

Define r.v. X = # letters in correct envelopes? Then we want E(X) = Σ_{i=0..20} i · P(X = i), which is hard to compute directly.

How about: r.v.'s X_j = 1 if letter j is in envelope j, and X_j = 0 otherwise? We want E(Σ_{j=1..20} X_j) = Σ_{j=1..20} E(X_j) = 20 · (1/20) = 1.
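The answer predicted by linearity of expectation (exactly 1, regardless of n) can be sanity-checked by simulation. A sketch, with function name and parameters chosen here for illustration:

```python
import random

# Estimate E(# letters in correct envelopes) for n letters by simulation.
# Linearity of expectation predicts exactly n * (1/n) = 1 for every n.
def avg_fixed_points(n=20, trials=100_000, seed=0):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        perm = list(range(n))
        rng.shuffle(perm)  # a uniformly random letter -> envelope assignment
        total += sum(1 for j, e in enumerate(perm) if j == e)
    return total / trials

print(avg_fixed_points())  # close to 1.0
```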

Baseball Cards (think about this...)

A bubble gum company puts one random major-league baseball player's card into every pack of its gum. There are N major leaguers in the league this year. If X is the number of packs of gum that a collector must buy until she has at least one of each player's card, what is E(X)?

Comment: This is the same as the number of steps that a random walk is expected to take to cover all the vertices in a complete graph on N vertices.
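One way to think about it: the wait for the i-th new card is geometric, and linearity of expectation sums the waits, giving E(X) = N·(1 + 1/2 + ... + 1/N). The sketch below (my own function names, assuming each pack's card is uniform and independent) compares this formula against a simulation:

```python
import random

# E(X) = N * H_N: the wait for the i-th new card is geometric with
# success probability (N - i + 1)/N, and expectations add by linearity.
def expected_packs(N):
    return N * sum(1 / i for i in range(1, N + 1))

def simulate_packs(N, trials=10_000, seed=0):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        seen, packs = set(), 0
        while len(seen) < N:
            seen.add(rng.randrange(N))  # one uniformly random card per pack
            packs += 1
        total += packs
    return total / trials

print(expected_packs(30), simulate_packs(30))  # both near 30 * H_30, about 119.85
```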

Joint Distribution

When two random variables are defined on the same sample space, we can consider their joint distribution function h:

h_{X,Y} : Im(X) × Im(Y) → [0,1]
h_{X,Y}(i,j) = P(X⁻¹(i) ∩ Y⁻¹(j)), the probability of ((X = i) AND (Y = j))

Joint Distribution

Recall: Flip a fair coin four times. U = {(f_1,f_2,f_3,f_4) | f_i ∈ {H,T}}

r.v. X_1 = number of heads that appear
r.v. X_2 = 1 if the second flip is heads, 0 otherwise
r.v. X_3 = 1 if the third flip is heads, 0 otherwise

What is the joint distribution h_{X_1,X_2} of X_1 and X_2?

Domain of h_{X_1,X_2} = {(i,j) | i ∈ Im(X_1), j ∈ Im(X_2)} = {0,1,2,3,4} × {0,1}

Joint Distribution

Recall: Flip a fair coin four times.
r.v. X_1 = number of heads that appear
r.v. X_2 = 1 if the second flip is heads, 0 otherwise

What is h_{X_1,X_2}(2,1)? It is a probability... of what event?

h_{X_1,X_2}(2,1) = P({(f_1,f_2,f_3,f_4) | two H's, second toss is H})
                 = P({(f_1,H,f_3,f_4) | exactly one of f_1,f_3,f_4 is H})
                 = P({(H,H,T,T), (T,H,H,T), (T,H,T,H)})
                 = C(3,1) · (1/2)⁴ = 3/16

Joint Distribution

Recall: Flip a fair coin four times.
r.v. X_1 = number of heads that appear
r.v. X_2 = 1 if the second flip is heads, 0 otherwise

What is h_{X_1,X_2}(0,1)?
h_{X_1,X_2}(0,1) = P({(f_1,f_2,f_3,f_4) | zero H's, second toss is H}) = P(∅) = 0

What is h_{X_1,X_2}(4,0)?
h_{X_1,X_2}(4,0) = P({(f_1,f_2,f_3,f_4) | four H's, second toss is T}) = P(∅) = 0

h_{X_1,X_2}(3,1)? h_{X_1,X_2}(3,0)? The joint distribution h_{X_2,X_3}? (exercise)
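With only 16 equally likely outcomes, the whole joint distribution can be tabulated by brute force; a sketch (the dictionary-based tabulation is my own, not the course's notation):

```python
from itertools import product

# Tabulate h_{X1,X2}(i, j) = P(X1 = i and X2 = j) over all 16 flip sequences.
h = {}
for flips in product("HT", repeat=4):
    i = flips.count("H")              # X1 = total number of heads
    j = 1 if flips[1] == "H" else 0   # X2 = indicator of heads on flip 2
    h[(i, j)] = h.get((i, j), 0) + 1 / 16

print(h.get((2, 1), 0))  # 3/16 = 0.1875
print(h.get((0, 1), 0))  # 0 (impossible: zero heads but second flip heads)
print(h.get((4, 0), 0))  # 0 (impossible: four heads but second flip tails)
print(h.get((3, 1), 0))  # C(3,2)/16 = 3/16
```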

Independence (Definition 10, FN-29/30)

X and Y are independent random variables if and only if
h_{X,Y}(i,j) = f_X(i) · f_Y(j) for all (i,j) ∈ Im(X) × Im(Y)

Intuitively: information about one of the r.v.'s doesn't give us any information about the other.

Yet again: Flip a fair coin four times.
r.v. X_1 = number of heads that appear
r.v. X_2 = 1 if the second flip is heads, 0 otherwise
r.v. X_3 = 1 if the third flip is heads, 0 otherwise

Do you think that X_1 and X_2 are independent?
Do you think that X_2 and X_3 are independent?

Independence (Definition 10, FN-29/30)

X and Y are independent random variables if and only if
h_{X,Y}(i,j) = f_X(i) · f_Y(j) for all (i,j) ∈ Im(X) × Im(Y)

r.v. X_1 = number of heads that appear
r.v. X_2 = 1 if the second flip is heads, 0 otherwise
r.v. X_3 = 1 if the third flip is heads, 0 otherwise

How can we show X_1 and X_2 are not independent?
h_{X_1,X_2}(2,1) = 3/16; f_{X_1}(2) = 6/16, f_{X_2}(1) = 1/2, and (6/16)(1/2) = 3/16, so this pair does not settle it
h_{X_1,X_2}(1,1) = 1/16; f_{X_1}(1) = 4/16, f_{X_2}(1) = 1/2, but (4/16)(1/2) = 2/16 ≠ 1/16, so X_1 and X_2 are not independent

Independence (Definition 10, FN-29/30)

X and Y are independent random variables if and only if
h_{X,Y}(i,j) = f_X(i) · f_Y(j) for all (i,j) ∈ Im(X) × Im(Y)

r.v. X_1 = number of heads that appear
r.v. X_2 = 1 if the second flip is heads, 0 otherwise
r.v. X_3 = 1 if the third flip is heads, 0 otherwise

How can we show X_2 and X_3 are independent?
h_{X_2,X_3}(0,0) = 1/4; f_{X_2}(0) = 1/2, f_{X_3}(0) = 1/2, and (1/2)(1/2) = 1/4 ✓
h_{X_2,X_3}(1,1) = 1/4; f_{X_2}(1) = 1/2, f_{X_3}(1) = 1/2, and (1/2)(1/2) = 1/4 ✓
...and similarly for (0,1) and (1,0). |Im(X_2) × Im(X_3)| = 4 pairs to check; this can be tedious!
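The tedious all-pairs check is exactly what a computer is for. The sketch below (helper names are mine) applies Definition 10 exhaustively to both pairs of r.v.'s:

```python
from itertools import product

# Check independence of pairs of r.v.'s over the 16 equally likely outcomes.
outcomes = list(product("HT", repeat=4))

def X1(f): return f.count("H")                # number of heads
def X2(f): return 1 if f[1] == "H" else 0     # indicator: flip 2 is heads
def X3(f): return 1 if f[2] == "H" else 0     # indicator: flip 3 is heads

def independent(X, Y):
    p = 1 / len(outcomes)
    im_x = {X(f) for f in outcomes}
    im_y = {Y(f) for f in outcomes}
    for i, j in product(im_x, im_y):
        h = sum(p for f in outcomes if X(f) == i and Y(f) == j)
        fx = sum(p for f in outcomes if X(f) == i)
        fy = sum(p for f in outcomes if Y(f) == j)
        if abs(h - fx * fy) > 1e-12:
            return False  # found (i, j) with h_{X,Y}(i,j) != f_X(i) f_Y(j)
    return True

print(independent(X1, X2))  # False
print(independent(X2, X3))  # True
```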

Independence (Definition 10, FN-29/30)

If X and Y are independent random variables and f, g : R → R are functions, then f(X) and g(Y) are independent random variables.

If X and Y are independent, then E(XY) = E(X) · E(Y). This is not necessarily true if the r.v.'s are not independent!

Independence (Definition 10, FN-29/30)

E(X·Y) = E(X) · E(Y)? Not necessarily true if the r.v.'s are not independent!

Coin flipping: E(X_1) = 2, E(X_2) = 1/2
Y = X_1·X_2 takes values in {i·j : i ∈ Im(X_1), j ∈ Im(X_2)} = {0,1,2,3,4}
E(X_1·X_2) = Σ_{k=0..4} P(Y = k) · k = (1/16)·1 + (3/16)·2 + (3/16)·3 + (1/16)·4 = 20/16 = 5/4
Since 5/4 ≠ E(X_1)·E(X_2) = 1, this is consistent with X_1 and X_2 not being independent.

Independence (Definition 10, FN-29/30)

E(X·Y) = E(X) · E(Y)? Not necessarily true if the r.v.'s are not independent!

Coin flipping: E(X_2) = 1/2, E(X_3) = 1/2
Y = X_2·X_3 takes values in {i·j : i ∈ Im(X_2), j ∈ Im(X_3)} = {0,1}
E(X_2·X_3) = Σ_{k=0,1} P(Y = k) · k = P(X_2·X_3 = 0)·0 + P(X_2·X_3 = 1)·1 = 1/4
Here E(X_2)·E(X_3) = 1/4 as well, as guaranteed by the independence of X_2 and X_3.

Decision Trees

Decision trees are a systematic way of listing templates/possibilities.
Decision trees are a type of graph.
They can be used for enumeration.
They can be used to capture recursive solution approaches (algorithms).
They are a vehicle for correspondences between induction and recursion.

A Traveling Salesperson s Tours Recall: Bob the salesman starts at his home in San Diego, and tours 6 cities (Abilene, Bakersfield, Corona, Denver, Eastville and Frankfort), visiting each city exactly once before returning home. In how many possible ways could Bob make his tour?

Number of Non-Increasing Functions

Example: How many functions from A = {1,2,3,4} to itself are non-increasing?

Each function can be defined by a sequence of decisions: What is f(1)? What is f(2)? What is f(3)? What is f(4)?
...but these are not independent decisions!
If f(1) = 1, then we must have f(2) = f(3) = f(4) = 1
If f(1) = 2, then we must have f(2) ∈ {1,2}
If f(2) = 2, then f(3) ∈ {1,2}
Etc. (There are 35 such functions.)
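The count of 35 can be verified by enumerating all 4⁴ = 256 functions and filtering; a quick sketch:

```python
from itertools import product

# Count functions f: {1,2,3,4} -> {1,2,3,4} with f(1) >= f(2) >= f(3) >= f(4).
count = sum(
    1
    for f in product(range(1, 5), repeat=4)  # f = (f(1), f(2), f(3), f(4))
    if all(f[i] >= f[i + 1] for i in range(3))
)
print(count)  # 35
```

The answer agrees with the multiset count C(4 + 4 − 1, 4) = C(7,4) = 35: a non-increasing function is determined by which values it uses, with repetition.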

Decision Tree: Non-Attacking Queens

Put N queens on an N x N chessboard such that no queen attacks another queen (or: in how many ways can this be done?).

A queen in chess can attack any square on the same row, column, or diagonal. Given an n x n chessboard, we seek to place n queens onto squares of the chessboard such that no queen attacks another queen. The example shows a placement (red squares) of four mutually non-attacking queens.

Non-Attacking Queens: Leaf in Tree = Solution

[Figure: decision tree in which each level picks the column of the queen in the next row; each leaf of the tree corresponds to a complete placement.]

Non-Attacking Queens Decision Tree Comments

Our tree corresponded to a solution space of size 4⁴ = 256.
What if we numbered the squares 1, 2, ..., 16 and tried to enumerate all C(16,4) choices? Our solution space would have size 1820.
Our decision tree was smarter: we realized that there must be exactly one queen per row, so the decision for the k-th row (i.e., the k-th queen) boils down to picking a square in that row.
Our illustration of the tree did not show any infeasible configurations.
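The one-queen-per-row decision tree translates directly into a backtracking search; a sketch (recursive structure and names are my own):

```python
def count_placements(n):
    """Count ways to place n mutually non-attacking queens on an n x n board,
    choosing one column per row, as in the decision tree above."""
    def extend(cols):
        row = len(cols)
        if row == n:
            return 1  # a leaf: every row has received a safe queen
        total = 0
        for c in range(n):
            # prune infeasible branches: column clash or diagonal clash
            if all(c != c2 and abs(c - c2) != row - r2
                   for r2, c2 in enumerate(cols)):
                total += extend(cols + [c])
        return total
    return extend([])

print(count_placements(4))  # 2
print(count_placements(8))  # 92
```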

Non-Attacking Queens Decision Trees

Another Tree: 3-Coloring a Graph

Three colors: R, G, B

[Figure: a 5-vertex example graph, vertices labeled 1 through 5.]

A graph is legally colored with k colors if each vertex has a color (label) between 1 and k (inclusive), and no adjacent vertices have the same color.

Example application: scheduling classrooms

Another Tree: 3-Coloring a Graph

Three colors: R, G, B

[Figure: the example graph and its tree of colorings, rooted at the choice 1B, 2G and branching on the colors of vertices 3, 4, and 5.]

By symmetry, we can use any two colors for vertices 1 and 2.

A graph is legally colored with k colors if each vertex has a color (label) between 1 and k (inclusive), and no adjacent vertices have the same color.

Does the ordering of the vertices (1, 2, 3, 4, 5) matter?
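The same kind of exhaustive tree search counts legal colorings of any small graph. The slide's 5-vertex graph is not reproduced in this transcription, so the sketch below uses a 5-cycle as a stand-in example (edge list and names are assumptions, not from the course):

```python
from itertools import product

# Brute-force count of legal colorings: try every assignment and keep
# those in which no edge joins two same-colored vertices.
edges = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 1)]  # a 5-cycle, as an example

def count_colorings(vertices, edges, colors):
    return sum(
        1
        for assign in product(colors, repeat=len(vertices))
        if all(assign[u - 1] != assign[v - 1] for u, v in edges)
    )

print(count_colorings([1, 2, 3, 4, 5], edges, "RGB"))  # 30
```

For a cycle on n vertices the chromatic polynomial gives (k−1)ⁿ + (−1)ⁿ(k−1) legal k-colorings, so 2⁵ − 2 = 30 here.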

Hashing

Example: Hashing application

Definition: A hash function is any algorithm that maps data of variable length to data of a fixed length. The values returned by a hash function are called hash values, hash codes, hash sums, checksums, or simply hashes.

Open hashing (closed addressing): Each item hashes to a particular location in an array, and locations can hold more than one item.

Multiple input data values can have the same hash value; these are called collisions!

Hashing

What is the expected number of items per location?

Start with n items, hashed into a table of size k.
Model this as n independent trials, each outcome being a location in the table (i.e., a hash value).
Let r.v. X = number of inputs that hash to location 1.
Write X as a sum of indicators:
X_i = 1 if the i-th input hashes to location 1, and X_i = 0 otherwise
X = X_1 + X_2 + ... + X_n
E(X) = E(X_1) + E(X_2) + ... + E(X_n) = n · (1/k) = n/k
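A quick simulation confirms the n/k answer; this sketch assumes each item hashes to a uniformly random slot (function name and parameters are illustrative):

```python
import random

# Simulate hashing n items into a table of size k; the average load of a
# fixed slot should approach n/k, matching linearity of expectation.
def avg_load_of_slot(n, k, trials=50_000, seed=0):
    rng = random.Random(seed)
    total = sum(
        sum(1 for _ in range(n) if rng.randrange(k) == 0)  # items landing in slot 0
        for _ in range(trials)
    )
    return total / trials

print(avg_load_of_slot(n=12, k=8))  # near 12/8 = 1.5
```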

Hashing (cont.) What is the expected number of empty locations? If I hash three keys into a table with 10 slots, what is the probability that all the keys go to different slots?
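Both questions above have closed-form answers under the same uniform-hashing model; a sketch (the formulas follow from indicator variables and the product rule, and the function names are mine):

```python
# E(# empty locations) = k * (1 - 1/k)^n: each of the k locations is empty
# with probability (1 - 1/k)^n, and expectations of indicators add.
def expected_empty(n, k):
    return k * (1 - 1 / k) ** n

# P(all n keys land in distinct slots) = (k/k) * ((k-1)/k) * ... * ((k-n+1)/k)
def prob_all_distinct(n, k):
    p = 1.0
    for i in range(n):
        p *= (k - i) / k
    return p

print(expected_empty(3, 10))     # 10 * 0.9^3 = 7.29
print(prob_all_distinct(3, 10))  # (10 * 9 * 8) / 1000 = 0.72
```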

Problems 10

P10.1 To receive credit, you must solve both (a) and (b).
(a) Consider a row of n seats, with a child sitting on each. Each child may move by at most one seat. Find the number of ways in which the children can rearrange themselves.
(b) Same as (a), but the children sit in a circle of n seats.

P10.2 Two fishing-sinker manufacturers A and B together offer 2001 distinct sinkers (with all distinct weights) in their catalogs. Manufacturer A has sinkers labeled a_1, ..., a_1000, with corresponding weights a_1 < a_2 < ... < a_1000. Similarly, manufacturer B has sinkers labeled b_1, ..., b_1001, with corresponding weights b_1 < b_2 < ... < b_1001. You have one copy of each of these 2001 sinkers. Explain how to find the sinker whose weight is ranked 1001, using 11 weighings. (Each weighing on a balance can tell you whether the contents of the left pan are heavier or lighter than the contents of the right pan.)

Problems 10 (cont.)

P10.3 Prove that you can choose at most 2ⁿ / (n + 1) n-bit binary strings such that any two of these strings differ in at least three bit positions. (A motivation for this would be error-tolerant codes.)

P10.4 To receive credit, you must solve both (a) and (b). You have 128 objects with distinct weights.
(a) Find, with explanation, the minimum number of pairwise comparisons needed to identify the heaviest and second-heaviest of these objects.
(b) Find, with explanation, the minimum number of pairwise comparisons needed to identify the heaviest and the lightest of these objects.