UCSD CSE 21, Spring 2014 [Section B00] Mathematics for Algorithm and System Analysis
Lecture 10
Class URL: http://vlsicad.ucsd.edu/courses/cse21-s14/

Lecture 10 Notes

Midterm: Good job overall! Mean = 81; standard deviation = 18; range: 30-110.
Scores are up on TED. Also, updated Quiz scores including Week 4. If you are somehow not able to see your grades on TED, please email ABK + David + Pete + Natalie.
Hand-back process: (1) Look at the solutions + grading rubric on the class website. (2) Go to TA office hours to see your MT. (3) If you're satisfied, take your MT. (4) Else, leave your MT with a specific regrade request before 5/13. If you take away your MT, you cannot request a regrade later. All remaining MTs will be handed back at the end of the 5/13 lecture.

Goals for this week: Linearity of expectation; joint distributions. Decision trees; hashing example. For the quiz: MT solutions, mean/variance, today's material. (Next week: conditional probability, Bayes' rule; induction and recursion.) (We have caught up again to the poker chip summary.)

Expectation, Variance

Random variable X = # heads seen in four tosses of a fair coin.
Expectation: E(X) = Σ_{k=0..4} k · p_k, where p_k = P(X = k).

k   p_k = P(X = k)   k · p_k
0   1/16             0
1   4/16             4/16
2   6/16             12/16
3   4/16             12/16
4   1/16             4/16
    Total:           32/16 = 2

So E(X) = 2.

Expectation, Variance (Def. 8, FN-24)

Random variable X = # heads seen in four tosses of a fair coin.
Var(X) captures how much X is likely to vary through its distribution.
Variance: σ_X^2 = E(X^2) - (E(X))^2 = 5 - 2^2 = 1.

k   p_k = P(X = k)   k^2 · p_k
0   1/16             0
1   4/16             4/16
2   6/16             24/16
3   4/16             36/16
4   1/16             16/16
    Total:           E(X^2) = 80/16 = 5
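The numbers above can be checked mechanically. Here is a minimal Python sketch (my own, not from the lecture) that enumerates all 16 outcomes and recovers E(X) = 2 and Var(X) = 1:

```python
from itertools import product

# X = number of heads in four tosses of a fair coin: enumerate all 16 outcomes.
outcomes = list(product("HT", repeat=4))
p = {k: sum(1 for o in outcomes if o.count("H") == k) / 16 for k in range(5)}

EX = sum(k * p[k] for k in range(5))       # E(X)   = 32/16 = 2
EX2 = sum(k * k * p[k] for k in range(5))  # E(X^2) = 80/16 = 5
print(p)                # {0: 0.0625, 1: 0.25, 2: 0.375, 3: 0.25, 4: 0.0625}
print(EX, EX2 - EX**2)  # 2.0 1.0  (expectation and variance)
```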

FN Theorem 3: Linearity of Expectation

If X and Y are random variables on the same sample space and if a and b are real numbers, then:
E(aX) = a·E(X)
E(aX + bY) = a·E(X) + b·E(Y)
This holds regardless of whether X and Y are independent.

Example: Roll two dice. What is the expected value of their sum?

Expected Sum of Two Rolled Dice

Define r.v.'s:
X = value of the first die's roll
Y = value of the second die's roll
Z = sum of the two values: Z = X + Y
By linearity of expectation, E(Z) = E(X) + E(Y) = 7/2 + 7/2 = 7, since E(X) = E(Y) = (1 + 2 + ... + 6)/6 = 7/2.
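A quick brute-force check (my own sketch, not part of the slides) over all 36 equally likely outcomes:

```python
# All 36 equally likely outcomes of rolling two fair dice.
rolls = [(x, y) for x in range(1, 7) for y in range(1, 7)]
EX = sum(x for x, _ in rolls) / 36      # 3.5
EY = sum(y for _, y in rolls) / 36      # 3.5
EZ = sum(x + y for x, y in rolls) / 36  # 7.0 = E(X) + E(Y)
print(EX, EY, EZ)
```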

Thank-You Notes

A child has just had a birthday party. He has written 20 thank-you notes and addressed 20 corresponding envelopes. But then he gets distracted, and puts the letters randomly into the envelopes (one letter per envelope). What is the expected number of letters that are in their correct envelopes?
Define r.v. X = # letters in correct envelopes? We would want E(X) = Σ_{i=0..20} i · P(X = i), but the P(X = i) are painful to compute directly.
How about: r.v.'s X_j = 1 if letter j is in envelope j, and X_j = 0 otherwise? We want E(Σ_{j=1..20} X_j), and by linearity this is Σ_{j=1..20} E(X_j) = 20 · (1/20) = 1, since each letter lands in its own envelope with probability 1/20.
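A simulation sketch (my own, assuming a uniformly random assignment of letters to envelopes, i.e., a random permutation):

```python
import random

# Estimate E(#letters in their correct envelopes): the fixed points of a
# uniformly random permutation of 20 letters.
def fixed_points(n=20):
    perm = list(range(n))
    random.shuffle(perm)
    return sum(1 for i, j in enumerate(perm) if i == j)

trials = 100_000
print(sum(fixed_points() for _ in range(trials)) / trials)  # ~1.0 = 20 * (1/20)
```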

Joint Distribution

When two random variables are defined on the same sample space, we can consider their joint distribution function h:
h_{X,Y} : Im(X) × Im(Y) → [0,1]
h_{X,Y}(i,j) = P(X^{-1}(i) ∩ Y^{-1}(j)), the probability of ((X = i) AND (Y = j))

Joint Distribution

Recall: Flip a fair coin four times. U = {(f_1,f_2,f_3,f_4) : f_i ∈ {H,T}}
r.v. X_1 = number of heads that appear
r.v. X_2 = 1 if the second flip is heads, 0 otherwise
r.v. X_3 = 1 if the third flip is heads, 0 otherwise
What is the joint distribution h_{X_1,X_2} of X_1 and X_2?
Domain of h_{X_1,X_2} = {(i,j) : i ∈ Im(X_1), j ∈ Im(X_2)} = {0,1,2,3,4} × {0,1}

Joint Distribution

Recall: Flip a fair coin four times.
r.v. X_1 = number of heads that appear
r.v. X_2 = 1 if the second flip is heads, 0 otherwise
What is h_{X_1,X_2}(2,1)? It is a probability... of what event?
h_{X_1,X_2}(2,1) = P({(f_1,f_2,f_3,f_4) : two H's, and the second toss is H})
= P({(f_1,H,f_3,f_4) : exactly one of f_1,f_3,f_4 is H})
= P({(H,H,T,T), (T,H,H,T), (T,H,T,H)}) = C(3,1) · 2^{-4} = 3/16

Joint Distribution

Recall: Flip a fair coin four times.
r.v. X_1 = number of heads that appear
r.v. X_2 = 1 if the second flip is heads, 0 otherwise
What is h_{X_1,X_2}(0,1)?
h_{X_1,X_2}(0,1) = P({(f_1,f_2,f_3,f_4) : zero H's, and the second toss is H}) = P(∅) = 0
What is h_{X_1,X_2}(4,0)?
h_{X_1,X_2}(4,0) = P({(f_1,f_2,f_3,f_4) : four H's, and the second toss is T})
h_{X_1,X_2}(3,1)? h_{X_1,X_2}(3,0)? The full joint distribution h_{X_2,X_3}? (exercise)
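For checking these exercises, a small enumeration sketch (mine, not from the lecture) that tabulates the whole joint distribution:

```python
from itertools import product
from collections import Counter

# Tabulate h_{X1,X2}(i, j) = P(X1 = i AND X2 = j) over all 16 outcomes.
outcomes = list(product("HT", repeat=4))
h = Counter()
for o in outcomes:
    x1 = o.count("H")             # X1 = number of heads
    x2 = 1 if o[1] == "H" else 0  # X2 = 1 iff the second flip is heads
    h[(x1, x2)] += 1 / 16

print(h[(2, 1)])             # 0.1875 = 3/16
print(h[(0, 1)], h[(4, 0)])  # 0 0 (both events are empty)
print(h[(3, 1)], h[(3, 0)])  # 0.1875 = 3/16, 0.0625 = 1/16
```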

Independence (Definition 10, FN-29/30)

X and Y are independent random variables if and only if h_{X,Y}(i,j) = f_X(i) · f_Y(j) for all (i,j) ∈ Im(X) × Im(Y).
Intuitively: information about one of the r.v.'s doesn't give us any information about the other.
Yet again: Flip a fair coin four times.
r.v. X_1 = number of heads that appear
r.v. X_2 = 1 if the second flip is heads, 0 otherwise
r.v. X_3 = 1 if the third flip is heads, 0 otherwise
Do you think that X_1 and X_2 are independent? Do you think that X_2 and X_3 are independent?

Independence (Definition 10, FN-29/30)

X and Y are independent random variables if and only if h_{X,Y}(i,j) = f_X(i) · f_Y(j) for all (i,j) ∈ Im(X) × Im(Y).
r.v. X_1 = number of heads that appear
r.v. X_2 = 1 if the second flip is heads, 0 otherwise
How can we show X_1 and X_2 are not independent? One failing pair (i,j) suffices:
h_{X_1,X_2}(2,1) = 3/16, f_{X_1}(2) = 6/16, f_{X_2}(1) = 8/16: here (6/16)(8/16) = 3/16, so the product rule happens to hold at (2,1); this pair decides nothing.
h_{X_1,X_2}(1,1) = 1/16, f_{X_1}(1) = 4/16, f_{X_2}(1) = 8/16: here (4/16)(8/16) = 2/16 ≠ 1/16, so X_1 and X_2 are not independent.

Independence (Definition 10, FN-29/30)

X and Y are independent random variables if and only if h_{X,Y}(i,j) = f_X(i) · f_Y(j) for all (i,j) ∈ Im(X) × Im(Y).
r.v. X_2 = 1 if the second flip is heads, 0 otherwise
r.v. X_3 = 1 if the third flip is heads, 0 otherwise
How can we show X_2 and X_3 are independent? Now we must check every pair:
h_{X_2,X_3}(0,0) = 4/16, f_{X_2}(0) = 1/2, f_{X_3}(0) = 1/2: (1/2)(1/2) = 4/16 ✓
h_{X_2,X_3}(1,1) = 4/16, f_{X_2}(1) = 1/2, f_{X_3}(1) = 1/2: (1/2)(1/2) = 4/16 ✓
...and similarly for (0,1) and (1,0). |Im(X_2) × Im(X_3)| = 4 pairs in all; this can be tedious!
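Since checking every pair by hand is tedious, here is a sketch (my own) that does the exhaustive check by counting, exactly per the definition:

```python
from itertools import product
from collections import Counter

# Exhaustive independence check: X, Y independent iff
# h_{X,Y}(i,j) = f_X(i) * f_Y(j) for ALL (i,j) in Im(X) x Im(Y).
outcomes = list(product("HT", repeat=4))
X1 = lambda o: o.count("H")
X2 = lambda o: 1 if o[1] == "H" else 0
X3 = lambda o: 1 if o[2] == "H" else 0

def independent(A, B):
    fA = Counter(A(o) for o in outcomes)         # marginal counts (out of 16)
    fB = Counter(B(o) for o in outcomes)
    h = Counter((A(o), B(o)) for o in outcomes)  # joint counts (out of 16)
    # h/16 == (fA/16)*(fB/16)  <=>  16*h == fA*fB  (exact integer arithmetic)
    return all(16 * h[(i, j)] == fA[i] * fB[j] for i in fA for j in fB)

print(independent(X1, X2))  # False (fails at (i,j) = (1,1))
print(independent(X2, X3))  # True  (all 4 pairs check out)
```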

Independence (Definition 10, FN-29/30)

If X and Y are independent random variables and f, g : R → R are functions, then f(X) and g(Y) are independent random variables.
Also, if X and Y are independent, then E(XY) = E(X) · E(Y). This is not true in general if the r.v.'s are not independent!

Independence (Definition 10, FN-29/30)

E(X·Y) = E(X) · E(Y): not true if the r.v.'s are not independent! Coin flipping:
E(X_1) = 2, E(X_2) = 1/2. Let Y = X_1·X_2. Im(Y) = {i·j : i ∈ Im(X_1), j ∈ Im(X_2)} = {0,1,2,3,4}.
E(X_1·X_2) = Σ_{k=0..4} k · P(Y = k) = 1/16 · 1 + 3/16 · 2 + 3/16 · 3 + 1/16 · 4 = 20/16 = 5/4 ≠ 1 = E(X_1) · E(X_2).

Independence (Definition 10, FN-29/30)

E(X·Y) = E(X) · E(Y): holds when the r.v.'s are independent. Coin flipping:
E(X_2) = 1/2, E(X_3) = 1/2. Let Y = X_2·X_3. Im(Y) = {i·j : i ∈ Im(X_2), j ∈ Im(X_3)} = {0,1}.
E(X_2·X_3) = Σ_{k=0,1} k · P(Y = k) = P(X_2·X_3 = 0) · 0 + P(X_2·X_3 = 1) · 1 = 4/16 = 1/4 = E(X_2) · E(X_3) ✓
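Both computations in one place (my own sketch, not from the slides):

```python
from itertools import product

# Compare E(XY) with E(X)E(Y) for a dependent pair and an independent pair.
outcomes = list(product("HT", repeat=4))
E = lambda rv: sum(rv(o) for o in outcomes) / 16
X1 = lambda o: o.count("H")
X2 = lambda o: 1 if o[1] == "H" else 0
X3 = lambda o: 1 if o[2] == "H" else 0

print(E(lambda o: X1(o) * X2(o)), E(X1) * E(X2))  # 1.25 vs 1.0  (dependent)
print(E(lambda o: X2(o) * X3(o)), E(X2) * E(X3))  # 0.25 vs 0.25 (independent)
```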

Decision Trees

Decision trees are a systematic way of listing templates / possibilities. Decision trees are a type of graph. They:
- can be used for enumeration
- can be used to capture recursive solution approaches (algorithms)
- are a vehicle for correspondences between induction and recursion

Decision Trees

Example: How many functions from A = {1,2,3,4} to itself are non-increasing?
Each function can be defined by a sequence of decisions: What is f(1)? What is f(2)? What is f(3)? What is f(4)?
...but these are not independent decisions!
If f(1) = 1, then we must have f(2) = f(3) = f(4) = 1.
If f(1) = 2, then we must have f(2) ∈ {1,2}; if f(2) = 2, then f(3) ∈ {1,2}. Etc.
(There are 35 such functions; see the sketch below.)
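The decision-tree structure translates directly into a recursive count. A sketch (mine, mirroring the tree above), with a brute-force cross-check:

```python
from itertools import product

def count_nonincreasing(remaining, max_val):
    # Decision-tree recursion: choose the next value v <= max_val, then recurse.
    if remaining == 0:
        return 1  # a leaf: all four values chosen
    return sum(count_nonincreasing(remaining - 1, v) for v in range(1, max_val + 1))

print(count_nonincreasing(4, 4))  # 35

# Brute force over all 4^4 = 256 functions, keeping the non-increasing ones.
print(sum(1 for f in product(range(1, 5), repeat=4)
          if all(f[i] >= f[i + 1] for i in range(3))))  # 35
```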

Decision Tree: Non-Attacking Queens

Put n queens on an n × n chessboard such that no queen attacks another queen (or: in how many ways can this be done?).
A queen in chess can attack any square on the same row, column, or diagonal. Given an n × n chessboard, we seek to place n queens onto squares of the chessboard such that no queen attacks another queen. The example shows a placement (red squares) of four mutually non-attacking queens.

Non-Attacking Queens: Leaf in Tree = Solution

[Figure: decision tree for the 4-queens problem. Level k of the tree chooses the column of the queen in row k; each leaf corresponds to a complete, mutually non-attacking placement.]

Non-Attacking Queens Decision Tree Comments

Our tree corresponded to a solution space of size 4^4 = 256.
What if we numbered the squares as 1, 2, ..., 16 and tried to enumerate all C(16,4) choices? Our solution space would have size 1820.
Our decision tree was smarter: we realized that there must be exactly one queen per row, so the decision for the k-th row (i.e., the k-th queen) boils down to picking a square in that row.
Our illustration of the tree did not show any infeasible configurations.
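This one-queen-per-row decision tree is exactly a backtracking search. A minimal sketch (my own implementation, not the course's):

```python
def count_queens(n, cols=()):
    # cols[r] = column of the queen already placed in row r (the path so far).
    row = len(cols)
    if row == n:
        return 1  # a leaf: every row has a queen and no two attack each other
    total = 0
    for c in range(n):
        # Prune any branch where column c shares a column or a diagonal
        # with an earlier queen.
        if all(c != pc and abs(c - pc) != row - pr for pr, pc in enumerate(cols)):
            total += count_queens(n, cols + (c,))
    return total

print(count_queens(4))  # 2 non-attacking placements on a 4 x 4 board
print(count_queens(8))  # 92 on the standard 8 x 8 board
```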

Another Tree: 3-Coloring a Graph

Three colors: R, G, B.
[Figure: a graph with five vertices, labeled 1 through 5.]
A graph is legally colored with k colors if each vertex has a color (label) between 1 and k (inclusive), and no adjacent vertices have the same color.
Example application: scheduling classrooms.

Another Tree: 3-Coloring a Graph

Three colors: R, G, B.
[Figure: the five-vertex graph with vertices 1 and 2 pre-colored (1B, 2G), and the resulting tree of colorings, with branches such as 3R, 4G, 4B, 3B, 4R, 5R extending partial colorings.]
By symmetry, we can use any two colors for vertices 1 and 2.
A graph is legally colored with k colors if each vertex has a color (label) between 1 and k (inclusive), and no adjacent vertices have the same color.
Does the ordering of the vertices (1, 2, 3, 4, 5) matter?
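The same decision-tree pattern counts legal colorings. A sketch of mine follows; note that the slide's actual edge set is not recoverable from this transcription, so the 5-cycle below is only an illustrative stand-in:

```python
def count_colorings(adj, k, colors=()):
    # Decision-tree count of legal k-colorings; vertices colored in order 0,1,2,...
    v = len(colors)
    if v == len(adj):
        return 1  # a leaf: every vertex is legally colored
    total = 0
    for c in range(k):
        # Prune: c must differ from every already-colored neighbor of v.
        if all(colors[u] != c for u in adj[v] if u < v):
            total += count_colorings(adj, k, colors + (c,))
    return total

# Illustrative graph only (a 5-cycle 0-1-2-3-4-0), NOT the graph on the slide.
cycle5 = [[1, 4], [0, 2], [1, 3], [2, 4], [3, 0]]
print(count_colorings(cycle5, 3))  # 30, matching (3-1)^5 + (-1)^5 * (3-1) for C_5
```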

Hashing

Example: hashing application.
Definition: A hash function is any algorithm that maps data of variable length to data of a fixed length. The values returned by a hash function are called hash values, hash codes, hash sums, checksums, or simply hashes.
Open hashing (closed addressing): each item hashes to a particular location in an array, and locations can hold more than one item.
However, multiple input data values can have the same hash value. These are called collisions!

Hashing

What is the expected number of items per location?
Start with n items, hashed into a table of size k. Model this as n independent trials, where each trial's outcome is the location in the table (i.e., the hash value).
Let r.v. X = number of inputs that hash to location 1. Write X as a sum of indicators:
X_i = 1 if the i-th input hashes to location 1, and X_i = 0 otherwise.
X = X_1 + X_2 + ... + X_n
E(X) = E(X_1) + E(X_2) + ... + E(X_n) = n · (1/k) = n/k
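A quick simulation sketch (mine, assuming hash values are uniform over the k locations):

```python
import random

# Simulate hashing n items into a table of size k; estimate the expected
# number of items landing in one fixed location.
n, k, trials = 30, 10, 100_000
hits = sum(sum(1 for _ in range(n) if random.randrange(k) == 0)
           for _ in range(trials))
print(hits / trials, n / k)  # ~3.0 vs the exact answer n/k = 3.0
```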

Hashing (cont.)

What is the expected number of empty locations?
If I hash three keys into a table with 10 slots, what is the probability that all the keys go to different slots?
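To check your answers to both questions, a simulation sketch (my own; the closed forms in the comments follow from the same indicator argument as above and from direct counting):

```python
import random

n, k, trials = 3, 10, 200_000
empty, all_distinct = 0, 0
for _ in range(trials):
    slots = [random.randrange(k) for _ in range(n)]
    empty += k - len(set(slots))          # number of empty locations
    all_distinct += len(set(slots)) == n  # did all keys land in different slots?

# E(#empty) = k * (1 - 1/k)^n: each location is empty with prob. (1 - 1/k)^n.
print(empty / trials, k * (1 - 1 / k) ** n)       # ~7.29 vs 7.29
# P(all different) = (10 * 9 * 8) / 10^3 = 0.72.
print(all_distinct / trials, 10 * 9 * 8 / 10**3)  # ~0.72 vs 0.72
```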

Problems 10

P10.1 To receive credit, you must solve both (a) and (b).
(a) Consider a row of n seats, with a child sitting on each. Each child may move by at most one seat. Find the number of ways in which the children can rearrange themselves.
(b) Same as in (a), but the children sit in a circle of n seats.

P10.2 Two fishing-sinker manufacturers, A and B, together offer 2001 distinct sinkers (with all-distinct weights) in their catalogs. Manufacturer A has sinkers labeled a_1, ..., a_1000, with corresponding weights a_1 < a_2 < ... < a_1000. Similarly, manufacturer B has sinkers labeled b_1, ..., b_1001, with corresponding weights b_1 < b_2 < ... < b_1001. You have one copy of each of these 2001 sinkers. Explain how to find the sinker whose weight is ranked 1001, using 11 weighings. (Each weighing on a balance can tell you whether the contents of the left pan are heavier or lighter than the contents of the right pan.)

Problems 10 (cont.)

P10.3 Prove that you can choose at most 2^n / (n + 1) n-bit binary strings such that any two of these strings differ in at least three bit positions. (A motivation for this would be error-tolerant codes.)

P10.4 To receive credit, you must solve both (a) and (b). You have 128 objects with distinct weights.
(a) Find, with explanation, the minimum number of pairwise comparisons needed to identify the heaviest and second-heaviest of these objects.
(b) Find, with explanation, the minimum number of pairwise comparisons needed to identify the heaviest and the lightest of these objects.