ECEN 303: Homework 3 Solutions


Problem 1:

Let A be the event that the dog was lost in forest A and A^c be the event that the dog was lost in forest B. Let D_n be the event that the dog dies on the nth day, F_n the event that the dog is found on the nth day, S_n the event that Oscar searches forest A on the nth day, and S_n^c the event that he searches forest B on day n. We know that

Pr(A) = 0.4,  Pr(A^c) = 0.6,
Pr(D_{n+1} | D_n^c, F_n^c) = n/(n+2),
Pr(F_n | A, S_n, F_{n-1}^c) = 0.25,
Pr(F_n | B, S_n^c, F_{n-1}^c) = 0.15.

(a) We want to compare Pr(F_1 | S_1) and Pr(F_1 | S_1^c). By the total probability theorem,

Pr(F_1 | S_1) = Pr(F_1 | S_1, A) Pr(A) + Pr(F_1 | S_1, A^c) Pr(A^c) = 0.25 × 0.4 + 0 × 0.6 = 0.1.

Similarly,

Pr(F_1 | S_1^c) = Pr(F_1 | S_1^c, A) Pr(A) + Pr(F_1 | S_1^c, A^c) Pr(A^c) = 0 × 0.4 + 0.15 × 0.6 = 0.09.

Thus Pr(F_1 | S_1) > Pr(F_1 | S_1^c), so Oscar should search in forest A.

(b) Using Bayes' rule,

Pr(A | S_1, F_1^c) = Pr(A) Pr(F_1^c | A, S_1) / [Pr(A) Pr(F_1^c | A, S_1) + Pr(A^c) Pr(F_1^c | A^c, S_1)]
= (0.4 × 0.75) / (0.4 × 0.75 + 0.6 × 1) = 0.3 / 0.9 = 1/3.
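As a sanity check on parts (a) and (b), a short Monte Carlo simulation (not part of the original solution; the parameters 0.4, 0.25 and 0.15 are taken from the problem data above) estimates the same quantities:

```python
import random

def simulate_day_one(trials=1_000_000, seed=0):
    """Estimate Pr(F1|S1), Pr(F1|S1^c) and Pr(A|S1,F1^c) by simulation."""
    rng = random.Random(seed)
    found_A = found_B = 0            # F1 counts when searching A resp. B
    not_found = in_A_not_found = 0   # for Pr(A | S1, F1^c)
    for _ in range(trials):
        dog_in_A = rng.random() < 0.4
        # Search forest A on day 1 (event S1).
        found = dog_in_A and rng.random() < 0.25
        found_A += found
        if not found:
            not_found += 1
            in_A_not_found += dog_in_A
        # Search forest B instead (event S1^c), same dog location.
        found_B += (not dog_in_A) and rng.random() < 0.15
    print("Pr(F1|S1)     ~", found_A / trials)            # exact: 0.10
    print("Pr(F1|S1^c)   ~", found_B / trials)            # exact: 0.09
    print("Pr(A|S1,F1^c) ~", in_A_not_found / not_found)  # exact: 1/3

simulate_day_one()
```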

(c) Suppose Oscar decides where to search on the first day by flipping a fair coin, so that S_1 is independent of A with Pr(S_1) = Pr(S_1^c) = 1/2. Given that he found the dog on the first day, the probability that he searched in forest A is

Pr(S_1 | F_1) = Pr(S_1 ∩ F_1) / Pr(F_1).

Now

Pr(S_1 ∩ F_1) = Pr(S_1 ∩ F_1 ∩ A) = Pr(A) Pr(S_1) Pr(F_1 | S_1, A) = 0.4 × 0.5 × 0.25 = 0.05

and

Pr(S_1^c ∩ F_1) = Pr(S_1^c ∩ F_1 ∩ A^c) = Pr(A^c) Pr(S_1^c) Pr(F_1 | S_1^c, A^c) = 0.6 × 0.5 × 0.15 = 0.045.

Using Pr(F_1) = Pr(F_1 ∩ S_1) + Pr(F_1 ∩ S_1^c) = 0.05 + 0.045 = 0.095, we have

Pr(S_1 | F_1) = 0.05 / 0.095 = 10/19 ≈ 0.526.

(d) Oscar searches forest A on each of the first two days (events S_1 and S_2). The probability that he finds a live dog for the first time on the second day is

Pr(F_2 ∩ D_2^c ∩ F_1^c | S_1, S_2) = Pr(D_2^c | F_2, F_1^c, S_1, S_2) × Pr(F_2 | F_1^c, S_1, S_2) × Pr(F_1^c | S_1).

The dog's survival does not depend on the search, so Pr(D_2^c | F_2, F_1^c, S_1, S_2) = 1 − Pr(D_2 | F_1^c) = 1 − 1/3 = 2/3. Conditioning on the dog's location and using Pr(A | F_1^c, S_1) = 1/3 from part (b),

Pr(F_2 | F_1^c, S_1, S_2) = Pr(F_2 | F_1^c, S_2, A) Pr(A | F_1^c, S_1) + Pr(F_2 | F_1^c, S_2, A^c) Pr(A^c | F_1^c, S_1) = 0.25 × (1/3) + 0 × (2/3) = 1/12,

and

Pr(F_1^c | S_1) = Pr(F_1^c | S_1, A) Pr(A) + Pr(F_1^c | S_1, A^c) Pr(A^c) = 0.75 × 0.4 + 1 × 0.6 = 0.9.

Putting these together, the probability is (2/3) × (1/12) × 0.9 = 0.05.

(e) The probability that Oscar does not find a dead dog on the second day, given F_1^c, S_1 and S_2, is

Pr((F_2 ∩ D_2)^c | F_1^c, S_1, S_2) = 1 − Pr(F_2 ∩ D_2 | F_1^c, S_1, S_2) = 1 − Pr(D_2 | F_1^c) Pr(F_2 | F_1^c, S_1, S_2) = 1 − (1/3) × (1/12) = 35/36 ≈ 0.97.

(f) Oscar searched forest A on the first three days and forest B on the fourth day (S_1, S_2, S_3, S_4^c), and found the dog on the fourth day. The probability that he found a live dog is

Pr(D_4^c | F_4, F_3^c, F_2^c, F_1^c, S_1, S_2, S_3, S_4^c)
= Pr(D_4^c ∩ D_3^c ∩ D_2^c | F_4, F_3^c, F_2^c, F_1^c, S_1, S_2, S_3, S_4^c)
= Pr(D_4^c | D_3^c, F_3^c) × Pr(D_3^c | D_2^c, F_2^c) × Pr(D_2^c | F_1^c)
= (1 − 3/5)(1 − 2/4)(1 − 1/3) = (2/5)(1/2)(2/3) = 2/15.

(g) Let L be the event that he searches twice in forest A and then twice in forest B. Since the dog's survival does not depend on where Oscar searches, the same chain of conditional probabilities gives

Pr(D_4^c | F_4, F_3^c, F_2^c, F_1^c, L) = Pr(D_4^c | D_3^c, F_3^c) × Pr(D_3^c | D_2^c, F_2^c) × Pr(D_2^c | F_1^c) = 2/15,

the same answer as in part (f).
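The multi-day parts (d) and (f) can be checked the same way. The sketch below simulates the search day by day, with the dog dying on the evening of day n with probability n/(n+2) if it is still alive and unfound; it is a verification aid under those modeling assumptions, not part of the original solution.

```python
import random

rng = random.Random(1)

def run_search(plan):
    """Simulate one search following `plan`, a list of forests ('A' or 'B').
    Returns (day_found, dog_alive_then); day_found is None if never found."""
    dog_forest = 'A' if rng.random() < 0.4 else 'B'
    alive = True
    for day, forest in enumerate(plan, start=1):
        p_find = {'A': 0.25, 'B': 0.15}[forest] if forest == dog_forest else 0.0
        if rng.random() < p_find:
            return day, alive
        if alive and rng.random() < day / (day + 2):  # dies on night n w.p. n/(n+2)
            alive = False
    return None, alive

trials = 1_000_000
# Part (d): search A both days; a live dog is found for the first time on day 2.
hits = sum(run_search(['A', 'A']) == (2, True) for _ in range(trials))
print("part (d) ~", hits / trials)               # exact: 0.05

# Part (f): search A, A, A, B; among searches that succeed on day 4,
# the fraction in which the dog is still alive.
found4 = alive4 = 0
for _ in range(trials):
    day, alive = run_search(['A', 'A', 'A', 'B'])
    if day == 4:
        found4 += 1
        alive4 += alive
print("part (f) ~", alive4 / found4)             # exact: 2/15 ≈ 0.133
```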

Problem 2:

(a) Let P be the event that exactly one parent went on the stroll and R the event that Rover, the oldest dog, went. We want

Pr(R | P) = (number of ways exactly one parent and Rover can go) / (number of ways exactly one parent can go).

If exactly one parent and Rover are going, we can choose the parent in two ways, and the remaining three positions can be filled by any of the remaining two dogs or the four kids (no cats can go, since dogs and cats cannot go together). The number of such combinations is

2 × C(6,3) = 2 × 20 = 40.

If exactly one parent is going, we can choose the parent in two ways. Dogs and cats cannot go together, so we have two cases: either dogs go or cats go. Suppose the pets are dogs. Then we must fill the remaining four positions from the three dogs and four kids such that at least one dog is chosen, which can be done in

C(7,4) − C(4,4) = 35 − 1 = 34

ways (the total number of ways minus the number of ways of not choosing any dog). Suppose the pets are cats. Then the number of ways to fill the remaining four positions from the two cats and four kids, with at least one cat chosen, is

C(6,4) − C(4,4) = 15 − 1 = 14

(the total number of ways minus the number of ways of not choosing any cat). Thus, the total number of ways exactly one parent can go is 2 × (34 + 14) = 96. Hence,

Pr(R | P) = 40/96 = 5/12.
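The problem statement did not survive the transcription, so the enumeration below assumes the usual version of this problem: a household of two parents, four kids, three dogs (Rover the oldest) and two cats, with stroll parties of five that must include at least one parent and at least one pet, and never a dog and a cat together. Under those assumptions, brute-force enumeration reproduces the counts above.

```python
from itertools import combinations

# Assumed household (reconstructed from the solution text).
parents = ['mom', 'dad']
kids    = ['k1', 'k2', 'k3', 'k4']
dogs    = ['Rover', 'dog2', 'dog3']
cats    = ['cat1', 'cat2']
household = parents + kids + dogs + cats

def valid(group):
    """Party of five: at least one parent, at least one pet,
    and dogs and cats never together."""
    n_parent = sum(m in parents for m in group)
    n_dog = sum(m in dogs for m in group)
    n_cat = sum(m in cats for m in group)
    return n_parent >= 1 and n_dog + n_cat >= 1 and not (n_dog and n_cat)

one_parent = [g for g in combinations(household, 5)
              if valid(g) and sum(m in parents for m in g) == 1]
with_rover = [g for g in one_parent if 'Rover' in g]
print(len(with_rover), '/', len(one_parent))   # 40 / 96, i.e. 5/12
```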

Problem 3:

The number of ways of distributing 52 cards to 4 players, each getting 13 cards, is the multinomial coefficient

(52; 13, 13, 13, 13) = 52! / (13! 13! 13! 13!).

(a) The number of ways for one player to receive all 13 spades is

4 × (39; 13, 13, 13)

(choose the player in 4 ways, then distribute the remaining 39 cards among the other 3 players). Thus,

Pr(one of the players receives all the spades) = 4 (39; 13, 13, 13) / (52; 13, 13, 13, 13).

(b) The number of ways for two of the players to receive all the spades, such that each of them has at least one spade, is

C(4,2) × C(39,13) × [C(26,13) − 2] × C(26,13):

choose the two players in C(4,2) ways; choose 13 more cards from the 39 non-spades in C(39,13) ways; distribute the resulting 26 cards (the 13 spades and the 13 just chosen) between the two players so that each gets 13 cards including at least one spade, in C(26,13) − 2 ways; and distribute the remaining 26 cards between the remaining two players in C(26,13) ways. Thus,

Pr(two players receive all the spades, each with at least one spade) = C(4,2) C(39,13) [C(26,13) − 2] C(26,13) / (52; 13, 13, 13, 13).

(c) The number of ways in which each player gets exactly one ace is

4! × (48; 12, 12, 12, 12),

where 4! is the number of ways of distributing the four aces among the four players and (48; 12, 12, 12, 12) is the number of ways to distribute the remaining 48 cards, 12 to each player. Thus,

Pr(each player gets one ace) = 4! (48; 12, 12, 12, 12) / (52; 13, 13, 13, 13).

(d) For a single 13-card hand,

Pr(void in at least one suit) = Pr(void in exactly one suit) + Pr(void in exactly two suits) + Pr(void in exactly three suits),

since a 13-card hand cannot be void in all four suits. The number of hands that are void in exactly one suit is

C(4,1) × [C(39,13) − 3 C(26,13) + 3]:

choose the void suit in C(4,1) ways, then draw 13 cards from the remaining 39 so that the hand is not void in any of the other three suits (by inclusion-exclusion). The number of hands that are void in exactly two suits is

C(4,2) × [C(26,13) − 2],

and the number void in exactly three suits is C(4,3) × C(13,13) = 4 (all 13 cards from a single suit). Thus,

Pr(the hand drawn is void in at least one suit) = {4 [C(39,13) − 3 C(26,13) + 3] + 6 [C(26,13) − 2] + 4} / C(52,13) ≈ 0.051.
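The card-dealing answers are easy to evaluate numerically. Here is a minimal script using exact integer combinatorics (Python's math.comb; the multinom helper is just for readability):

```python
from math import comb, factorial

def multinom(n, *parts):
    """Multinomial coefficient (n; k1, ..., km)."""
    assert sum(parts) == n
    result = factorial(n)
    for k in parts:
        result //= factorial(k)
    return result

deals = multinom(52, 13, 13, 13, 13)   # all ways to deal four 13-card hands

# (a) some player holds all 13 spades
p_a = 4 * multinom(39, 13, 13, 13) / deals
# (b) two players hold all the spades, each with at least one
p_b = comb(4, 2) * comb(39, 13) * (comb(26, 13) - 2) * comb(26, 13) / deals
# (c) each player holds exactly one ace
p_c = factorial(4) * multinom(48, 12, 12, 12, 12) / deals
# (d) a single 13-card hand is void in at least one suit
void = 4 * (comb(39, 13) - 3 * comb(26, 13) + 3) \
     + 6 * (comb(26, 13) - 2) + 4
p_d = void / comb(52, 13)
print(p_a, p_b, p_c, p_d)   # p_d ~ 0.051
```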

Problem 4:

(a) Let A be the event that a 1 is transmitted and A^c the event that a 0 is transmitted, where Pr(A) = 1 − p, and let B_n be the event that the nth bit we receive is a 1; each received copy of the bit is flipped independently with crossover probability ε. Using the law of total probability and Bayes' rule,

p_1 = Pr(A | B_1) = Pr(B_1 | A) Pr(A) / [Pr(B_1 | A) Pr(A) + Pr(B_1 | A^c) Pr(A^c)] = (1−ε)(1−p) / [(1−ε)(1−p) + εp].

(b) Note that B_1 and B_2 are conditionally independent given A. Now,

p_2 = Pr(A | B_1, B_2) = Pr(B_2 | A, B_1) Pr(A | B_1) / [Pr(B_2 | A, B_1) Pr(A | B_1) + Pr(B_2 | A^c, B_1) Pr(A^c | B_1)]
= Pr(B_2 | A) Pr(A | B_1) / [Pr(B_2 | A) Pr(A | B_1) + Pr(B_2 | A^c) Pr(A^c | B_1)]
= (1−ε) p_1 / [(1−ε) p_1 + ε (1−p_1)].

(c) Using the same argument as in part (b), we get

p_n = (1−ε) p_{n−1} / [(1−ε) p_{n−1} + ε (1−p_{n−1})] = (1−ε)^n p_0 / [(1−ε)^n p_0 + ε^n (1−p_0)] = p_0 / [p_0 + (ε/(1−ε))^n (1−p_0)],

where p_0 = 1 − p. As n → ∞, we have

p_n → 1 if ε < 1/2,  p_n → p_0 if ε = 1/2,  p_n → 0 if ε > 1/2.

These results match intuition. When ε < 1/2, the observations are positively correlated with the transmitted bit, so each observation B_n, n = 1, 2, ..., increases the conditional probability of A given the observations, which converges to 1. When ε = 1/2, the observations are independent of the transmitted bit, so the observations do not change the conditional probability of A, which stays at the prior probability p_0. When ε > 1/2, the observations are negatively correlated with the transmitted bit, so each observation decreases the conditional probability of A, which converges to 0.
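A few lines of code confirm that iterating the one-step Bayes update from part (b) matches the closed form in part (c); the values of p_0 and ε below are arbitrary examples.

```python
def bayes_step(p_prev, eps):
    """One update of Pr(A | observations) after another received 1 (part (b))."""
    return (1 - eps) * p_prev / ((1 - eps) * p_prev + eps * (1 - p_prev))

def closed_form(n, p0, eps):
    """p_n from part (c)."""
    r = (eps / (1 - eps)) ** n
    return p0 / (p0 + r * (1 - p0))

p0, eps = 0.3, 0.2   # example values; p0 = 1 - p
p = p0
for n in range(1, 6):
    p = bayes_step(p, eps)
    assert abs(p - closed_form(n, p0, eps)) < 1e-12
    print(n, round(p, 6))   # increases toward 1, since eps < 1/2
```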

(d) By part (c), to get p_n ≥ 0.99 the following condition must hold:

p_n = p_0 / [p_0 + (ε/(1−ε))^n (1−p_0)] ≥ 0.99,

which simplifies to

log p_0 − log[99 (1−p_0)] ≥ n (log ε − log(1−ε)).    (1)

There are three cases.

When ε < 1/2, log ε − log(1−ε) < 0, so (1) gives

n ≥ (log p_0 − log[99 (1−p_0)]) / (log ε − log(1−ε)).

The intuition is the following: in this case the observations monotonically increase p_n toward 1, so we expect that once the number of observations n exceeds a threshold, p_n is at least 0.99.

When ε = 1/2, the right-hand side of (1) is 0, so for (1) to hold the left-hand side must be at least 0, which implies p_0 ≥ 0.99. If p_0 ≥ 0.99, then p_n ≥ 0.99 for every n; otherwise no p_n reaches 0.99. The intuition is that when ε = 1/2 the observations are independent of the transmitted bit, so the conditional probability of A does not change as we collect more observations; the only way to have p_n ≥ 0.99 is for the prior itself to satisfy p_0 ≥ 0.99.

When ε > 1/2, we need

n ≤ (log p_0 − log[99 (1−p_0)]) / (log ε − log(1−ε)).

Since n is at least 1, the above inequality further requires

log p_0 − log[99 (1−p_0)] ≥ log ε − log(1−ε),  i.e.  p_0 ≥ 99ε / (1 + 98ε).

The intuition is that when ε > 1/2 the observations monotonically decrease p_n toward 0, so the only way to have p_n ≥ 0.99 is for p_0 to be large enough that p_n stays at or above 0.99 for all n below a certain threshold.
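For the case ε < 1/2, inequality (1) translates directly into a threshold on n; a small check against the closed form (again with arbitrary example values):

```python
from math import ceil, log

def n_threshold(p0, eps):
    """Smallest n with p_n >= 0.99 when eps < 1/2, from inequality (1)."""
    bound = (log(p0) - log(99 * (1 - p0))) / (log(eps) - log(1 - eps))
    return max(1, ceil(bound))

def p_n(n, p0, eps):
    return p0 / (p0 + (eps / (1 - eps)) ** n * (1 - p0))

p0, eps = 0.3, 0.2
n = n_threshold(p0, eps)
print(n, p_n(n, p0, eps) >= 0.99, p_n(n - 1, p0, eps) >= 0.99)  # 4 True False
```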

Problem 5:

(a) Let X be the event that A can communicate with B and Y the event that exactly five links have failed. We are asked to find

Pr(X | Y) = Pr(X ∩ Y) / Pr(Y).

X ∩ Y is the event that exactly 5 links fail and A can still communicate with B. This is possible only if one of the following sets of links fails: {a, d, g, e, f} or {b, c, h, e, f}. Each of these possibilities has probability p^5 (1−p)^3. Thus,

Pr(X ∩ Y) = p^5 (1−p)^3 + p^5 (1−p)^3 = 2 p^5 (1−p)^3.

Furthermore, the probability that exactly 5 of the 8 links fail is

Pr(Y) = C(8,5) p^5 (1−p)^3 = 56 p^5 (1−p)^3.

Thus,

Pr(X | Y) = 2 p^5 (1−p)^3 / [56 p^5 (1−p)^3] = 1/28.

(b) Let C be the event that either g or h, but not both, is working. Then

Pr(C | Y) = Pr(C ∩ Y) / Pr(Y).

C ∩ Y consists of all the possibilities in which exactly one of g and h is working while the other has failed, and exactly four of the remaining six links (a, b, c, d, e, f) have failed while the other two are working. Thus,

Pr(C ∩ Y) = 2 C(6,4) p(1−p) p^4 (1−p)^2 = 30 p^5 (1−p)^3,

and

Pr(C | Y) = 30 p^5 (1−p)^3 / [56 p^5 (1−p)^3] = 15/28.

(c) Let D be the event that links a, d and h have failed. Given D, the only way A can communicate with B is if links b, c, f and g are all working; call this event E. Since the links fail independently and E and D involve disjoint sets of links, E and D are independent. Thus,

Pr(E | D) = Pr(E) = (1−p)^4.
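The network diagram did not survive the transcription; the ladder topology below (top path a–d–g, bottom path b–c–h, cross links e and f) is inferred from the failure sets listed in part (a) and should be treated as an assumption. With it, exhaustive enumeration of the C(8,5) = 56 equally likely failure sets reproduces parts (a) and (b).

```python
from itertools import combinations

# Inferred topology: A --a-- n1 --d-- n2 --g-- B  (top path)
#                    A --b-- m1 --c-- m2 --h-- B  (bottom path)
#                    e joins n1-m1, f joins n2-m2 (cross links)
edges = {'a': ('A', 'n1'), 'd': ('n1', 'n2'), 'g': ('n2', 'B'),
         'b': ('A', 'm1'), 'c': ('m1', 'm2'), 'h': ('m2', 'B'),
         'e': ('n1', 'm1'), 'f': ('n2', 'm2')}

def connected(working):
    """Graph search from A over the working links; True if B is reached."""
    seen, stack = {'A'}, ['A']
    while stack:
        node = stack.pop()
        for link in working:
            u, v = edges[link]
            for x, y in ((u, v), (v, u)):
                if x == node and y not in seen:
                    seen.add(y)
                    stack.append(y)
    return 'B' in seen

fail_sets = list(combinations(edges, 5))   # all 56 equally likely given Y
comm = [s for s in fail_sets if connected(set(edges) - set(s))]
print('part (a):', len(comm), '/', len(fail_sets))   # 2 / 56 = 1/28
gh = [s for s in fail_sets if ('g' in s) != ('h' in s)]
print('part (b):', len(gh), '/', len(fail_sets))     # 30 / 56 = 15/28
```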
