Problems for 2.6 and 2.7


UC Berkeley
Department of Electrical Engineering and Computer Science
EE 126: Probability and Random Processes

Practice Problems for Midterm: SOLUTIONS #1 (Fall 2007)
Issued: Thursday, September 27, 2007
Solutions posted: Tuesday, October 2, 2007
Reading: Bertsekas & Tsitsiklis, Chapters 1 and 2

Problems for 2.6 and 2.7

Problem 3.1. Let X and Y be independent random variables that take values in the set {1, 2, 3}. Let V = 2(X + Y) and W = X - Y.

(a) Assume that P(X = k) and P(Y = k) are positive for every k in {1, 2, 3}. Can V and W be independent? Explain. (No calculations needed.)

For the remaining parts of this problem, assume that X and Y are uniformly distributed on {1, 2, 3}.

(b) Find and plot p_V(v), and compute E[V] and var(V).

(c) Find and show in a diagram p_{V,W}(v, w).

(d) Find E[V | W > 0].

(e) Find the conditional variance of W given the event V = 8.

Solution:

1. V and W cannot be independent: knowledge of one random variable gives information about the other. For instance, if V = 4 we know that W = 0, since then X = Y = 1.

2. We begin by drawing the joint PMF of X and Y: each of the nine grid points (x, y) in {1, 2, 3} x {1, 2, 3} has probability 1/9, and the diagonal lines x + y = constant in the diagram correspond to the constant values v = 2(x + y) in {4, 6, 8, 10, 12}.

X and Y are uniformly distributed, so each of the nine grid points has probability 1/9. The lines on the graph represent regions of the sample space on which V is constant, with the constant value of V indicated on each line. The PMF of V is obtained by adding the probabilities of the grid points on the appropriate line:

    p_V(4) = 1/9,  p_V(6) = 2/9,  p_V(8) = 3/9,  p_V(10) = 2/9,  p_V(12) = 1/9.

By symmetry, E[V] = 8. The variance is:

    var(V) = (4 - 8)^2 (1/9) + (6 - 8)^2 (2/9) + (10 - 8)^2 (2/9) + (12 - 8)^2 (1/9) = 48/9 = 16/3.

Alternatively, note that V is twice the sum of two independent random variables, V = 2(X + Y), and hence

    var(V) = var(2(X + Y)) = 4 var(X + Y) = 4 (var(X) + var(Y)) = 8 var(X)

(note the use of independence in the third equality; in the fourth we use the fact that X and Y are identically distributed, and therefore have the same variance). From the distribution of X, we can easily calculate that

    var(X) = (1/3)(1 - 2)^2 + (1/3)(2 - 2)^2 + (1/3)(3 - 2)^2 = 2/3,

so that in total var(V) = 8 * (2/3) = 16/3, as before.

3. We start by adding lines corresponding to constant values of W = X - Y to the diagram from part (b).

In the (x, y) diagram, the lines of constant W run perpendicular to those of constant V, with w taking the values -2, -1, 0, 1, 2. Again, each grid point has probability 1/9. Reading off the diagram, the joint PMF p_{V,W}(v, w) puts mass 1/9 on each of the nine pairs

    (4, 0), (6, -1), (6, 1), (8, -2), (8, 0), (8, 2), (10, -1), (10, 1), (12, 0),

and zero elsewhere.

4. The event {W > 0} consists of the three grid points with x > y, which correspond to (v, w) = (6, 1), (8, 2), and (10, 1), each with conditional probability 1/3. By symmetry (or directly),

    E[V | W > 0] = (6 + 8 + 10)/3 = 8.

5. The event {V = 8} consists of the three grid points (x, y) = (1, 3), (2, 2), (3, 1).

When V = 8, W takes values in the set {-2, 0, 2} with equal probability. By symmetry, E[W | V = 8] = 0. The variance is:

    var(W | V = 8) = (-2)^2 (1/3) + 0^2 (1/3) + 2^2 (1/3) = 8/3.

Problem 3.2. Joe Lucky plays the lottery on any given week with probability p, independently of whether he played on any other week. Each time he plays, he has a probability q of winning, again independently of everything else. During a fixed time period of n weeks, let X be the number of weeks that he played the lottery and Y the number of weeks that he won.

(a) What is the probability that he played the lottery on any particular week, given that he did not win anything that week?

(b) Find the conditional PMF p_{Y|X}(y | x).

(c) Find the joint PMF p_{X,Y}(x, y).

(d) Find the marginal PMF p_Y(y). Hint: One possibility is to start with the answer in (c), but the algebra can be messy. If instead you think intuitively about the procedure that generates Y, you may be able to guess the answer.

(e) Find the conditional PMF p_{X|Y}(x | y).

In all parts of this problem, make sure to indicate the range of values to which your PMF formula applies.

Solution:

1. Let L_i be the event that Joe played the lottery on week i, and let W_i be the event that he won on week i. We are asked to find

    P(L_i | W_i^c) = P(W_i^c | L_i) P(L_i) / [P(W_i^c | L_i) P(L_i) + P(W_i^c | L_i^c) P(L_i^c)]
                   = (1 - q) p / [(1 - q) p + 1 * (1 - p)]
                   = (p - pq) / (1 - pq).
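The posterior above can be sanity-checked by enumerating the joint outcomes (played?, won?) of a single week. The parameter values p = 1/3 and q = 1/4 below are illustrative assumptions, not given in the problem.

```python
from fractions import Fraction

# Enumerate one week's outcomes and compare P(played | did not win)
# against the closed form (p - pq)/(1 - pq).
# p = 1/3 and q = 1/4 are assumed example values.
p, q = Fraction(1, 3), Fraction(1, 4)

outcomes = {
    (True, True): p * q,          # played and won
    (True, False): p * (1 - q),   # played, did not win
    (False, False): 1 - p,        # did not play, so certainly did not win
}

no_win = sum(pr for (played, won), pr in outcomes.items() if not won)
posterior = outcomes[(True, False)] / no_win

assert posterior == (p - p * q) / (1 - p * q)
print(posterior)  # 3/11
```

Exact rational arithmetic via `fractions.Fraction` avoids any floating-point tolerance questions in the comparison.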

2. Conditioned on X = x, Y is binomial with parameters x and q:

    p_{Y|X}(y | x) = C(x, y) q^y (1 - q)^(x - y)   for 0 <= y <= x,

and 0 otherwise.

3. Recognizing that X has a binomial PMF with parameters n and p, we have

    p_{X,Y}(x, y) = p_{Y|X}(y | x) p_X(x)
                  = C(x, y) q^y (1 - q)^(x - y) * C(n, x) p^x (1 - p)^(n - x)   for 0 <= y <= x <= n,

and 0 otherwise.

4. Using the result from (c), we could compute

    p_Y(y) = sum over x from y to n of p_{X,Y}(x, y),

but the algebra is messy. An easier method is to realize that Y is the sum of n independent Bernoulli random variables, each equal to 1 with probability pq (winning in a given week requires both playing and winning). Therefore Y has a binomial PMF:

    p_Y(y) = C(n, y) (pq)^y (1 - pq)^(n - y)   for 0 <= y <= n,

and 0 otherwise.

5. Dividing the joint PMF by the marginal,

    p_{X|Y}(x | y) = p_{X,Y}(x, y) / p_Y(y)
                   = [C(x, y) C(n, x) q^y (1 - q)^(x - y) p^x (1 - p)^(n - x)] / [C(n, y) (pq)^y (1 - pq)^(n - y)]

for y <= x <= n, and 0 otherwise.

Problem 3.3. Suppose you and your friend play a game in which each of you throws a 6-sided die, all throws being independent. Each time you play the game, if the larger of the two values obtained is greater than 4, then you win 1 dollar; otherwise, you lose 1 dollar. Suppose that you play the game n >= 3 times, each game being independent of the others.

(a) What is the amount of money you expect to win on the first and last games combined?

(b) How much do you expect to win in your last game given that you lost the first game?

(c) How much do you expect to have won in your last game given that you won the first game and you won a total of m dollars by the end?

(d) What is the probability that you won both the first and last games, given that you won a total of m dollars by the end?

Solution:

(a) Define the collection of random variables {X_i}, i in {1, 2, ..., n}, with X_i representing the amount won in game i. We want to calculate

    E[X_1 + X_n] = E[X_1] + E[X_n] = 2 E[X_1],

where we have used linearity of expectation and the fact that the X_i are identically distributed. To compute E[X_i], note that X_i takes the value 1 whenever max{roll #1, roll #2} > 4 and takes the value -1 otherwise. Naming the two dice rolls A and B and using the joint PMF p_{A,B}(a, b) = p_A(a) p_B(b) = 1/36, we can evaluate the expectation directly:

    E[X_i] = sum over {(a, b) : max(a, b) > 4} of (1/36) - sum over {(a, b) : max(a, b) <= 4} of (1/36)
           = 20/36 - 16/36 = 1/9.

So the solution to this question is 2 * (1/9) = 2/9 dollars.

(b) By independence, we can ignore the information about losing the first game, and simply calculate the expected value of the last game as above, obtaining 1/9 dollars.

(c) Easier solution: The given information says that the sum of the X_i equals m and that X_1 = 1, so the remaining n - 1 games account for m - 1 dollars. Given the total, those games are exchangeable (identically distributed), so

    (n - 1) E[X_n | info] = m - 1,   hence   E[X_n | info] = (m - 1)/(n - 1).

Longer solution: We want to compute E[X_n | C], where the conditioning event is C = {X_1 = 1} ∩ D, with D = {won m dollars total}. The expectation is given by:

    E[X_n | C] = sum over x_n in {-1, 1} of x_n p_{X_n | C}(x_n)
               = P({X_n = 1} ∩ {X_1 = 1} ∩ D) / P({X_1 = 1} ∩ D)
                 - P({X_n = -1} ∩ {X_1 = 1} ∩ D) / P({X_1 = 1} ∩ D).

To carry out the computation, we translate the event statements into familiar forms. First, consider the event {X_n = 1} ∩ {X_1 = 1} ∩ D. It can be restated as "we win the first game AND the last game AND m - 2 dollars in the other games." Since we have split the event across disjoint sets of games, the sub-events are independent, and the last sub-event is described by a binomial count: to win m dollars in n games, we must have won m + (n - m)/2 games in total and lost (n - m)/2 games in total. Therefore:

    P({X_n = 1} ∩ {X_1 = 1} ∩ D)
        = (5/9)^2 * C(n - 2, (n + m)/2 - 2) (5/9)^((n + m)/2 - 2) (4/9)^((n - m)/2).

The probability of the event {X_1 = 1} ∩ D is computed similarly: it translates into "won the first game AND won m - 1 dollars in the n - 1 other games," and independence together with a binomial count gives

    P({X_1 = 1} ∩ D)
        = (5/9) * C(n - 1, (n + m)/2 - 1) (5/9)^((n + m)/2 - 1) (4/9)^((n - m)/2).

Note that both expressions are valid only when n - m is even. When n - m is odd, the probability is zero, because it is then impossible to win m dollars in n games (the games other than the wins accounting for m would have to split evenly between wins and losses). Neglecting the trivial case of n - m odd, the powers of 5/9 and 4/9 cancel in the ratios, leaving

    P(X_n = 1 | C) = C(n - 2, (n + m)/2 - 2) / C(n - 1, (n + m)/2 - 1) = ((n + m)/2 - 1)/(n - 1),
    P(X_n = -1 | C) = C(n - 2, (n + m)/2 - 1) / C(n - 1, (n + m)/2 - 1) = ((n - m)/2)/(n - 1),

so that

    E[X_n | C] = [((n + m)/2 - 1) - (n - m)/2] / (n - 1) = (m - 1)/(n - 1),

in agreement with the easier solution.

(d) Now we want to compute P({X_1 = 1} ∩ {X_n = 1} | D). As before, we can expand the conditional by the definition of conditional probability, and carry out the computation as above:
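As a cross-check, the formulas of parts (c) and (d) can be verified exhaustively for a small case; n = 5 and m = 3 are assumed example values (chosen so that n - m is even).

```python
from fractions import Fraction
from itertools import product

# Exhaustive check of Problem 3.3(c) and (d) for a small case.  Each game is
# +1 w.p. 5/9 (the max of two fair dice exceeds 4 in 20 of 36 outcomes) and
# -1 otherwise.  n = 5 games and m = 3 dollars are assumed example values.
w, n, m = Fraction(5, 9), 5, 3

num_e = den_e = num_d = den_d = Fraction(0)
for seq in product((1, -1), repeat=n):
    pr = Fraction(1)
    for x in seq:
        pr *= w if x == 1 else 1 - w
    if sum(seq) != m:
        continue
    den_d += pr                   # P(total = m)
    if seq[0] == 1 and seq[-1] == 1:
        num_d += pr               # P(first and last won, total = m)
    if seq[0] == 1:
        den_e += pr               # P(first won, total = m)
        num_e += pr * seq[-1]     # numerator of E[X_n | first won, total = m]

half = (n + m) // 2
assert num_e / den_e == Fraction(m - 1, n - 1)                    # part (c)
assert num_d / den_d == Fraction(half * (half - 1), n * (n - 1))  # part (d)
print("Problem 3.3 formulas verified for n=5, m=3")
```

For n = 5 and m = 3 the conditional expectation is (m - 1)/(n - 1) = 1/2 and the conditional probability in (d) is 3/5, which the enumeration confirms exactly.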

    P({X_1 = 1} ∩ {X_n = 1} | D) = P({X_1 = 1} ∩ {X_n = 1} ∩ D) / P(D)
        = C(n - 2, (n + m)/2 - 2) / C(n, (n + m)/2)
        = [(n + m)/2] [(n + m)/2 - 1] / [n(n - 1)].

Sample midterm problems

Problem 3.4. Since there is no direct flight from San Diego (S) to New York (N), every time Alice wants to go to New York she has to stop in either Chicago (C) or Denver (D). Due to bad weather conditions, the flights from S to C and the flights from C to N each independently suffer a 1-hour delay with probability p. Similarly, at Denver airport, incoming and outgoing flights are each independently subject to a 1-hour delay with probability q. On any given occasion, Alice chooses randomly between the Chicago and Denver routes with equal probability.

(a) What is the average total delay (across both legs of the overall trip) that she experiences in going from S to N?

(b) Suppose Alice arrives at N with a delay of two hours. What is the probability that she flew through C?

(c) Suppose that Alice wants to maximize the probability that she arrives in New York with a total delay of less than 2 hours. Under what conditions on p and q is going via Chicago a better choice than going via Denver?

(d) Suppose now that Alice always flies through C. On average, how many trips does she make before experiencing a 2-hour delay?

(e) Suppose now that the flight between S and D is known to be delayed, but Alice still randomly flies via either C or D with equal probability. With what delay should she expect to arrive at N?

Solution: The problem can be modeled as a network having four nodes (S, C, D, N), where the pairs SC, SD, CN, and DN are linked. We can define four independent random variables indicating the delay on each link:

    X_SC is 1 w.p. p and 0 w.p. 1 - p;
    X_SD is 1 w.p. q and 0 w.p. 1 - q;
    X_CN is 1 w.p. p and 0 w.p. 1 - p;
    X_DN is 1 w.p. q and 0 w.p. 1 - q.

Also, let us define D as the event that Alice flies through Denver, and C as the event that Alice flies through Chicago.

(a) There are two possible routes from S to N, so using the total probability law we have

    E(delay) = E(delay | C) P(C) + E(delay | D) P(D)
             = E(X_SC + X_CN) P(C) + E(X_SD + X_DN) P(D)
             = (1/2)(E(X_SC) + E(X_CN) + E(X_SD) + E(X_DN))
             = (1/2)(2p + 2q) = p + q.

(b) Using the total probability law and Bayes' rule, and since P(D) = P(C) = 1/2, we have

    P(C | delay = 2) = P(delay = 2 | C) P(C) / [P(delay = 2 | D) P(D) + P(delay = 2 | C) P(C)]
                     = P(X_SC + X_CN = 2) / [P(X_SD + X_DN = 2) + P(X_SC + X_CN = 2)]
                     = p^2 / (q^2 + p^2).

(c) Flying via Denver,

    P(delay < 2 | D) = P(X_SD + X_DN < 2) = 1 - q^2,

and flying via Chicago,

    P(delay < 2 | C) = P(X_SC + X_CN < 2) = 1 - P(X_SC + X_CN = 2) = 1 - p^2.

Alice should fly via Chicago when 1 - q^2 < 1 - p^2, i.e., when p^2 < q^2 or, equivalently, p < q.

(d) A delay of two hours happens with probability p^2 on each trip. We are asked for the mean of a geometric random variable with parameter p^2. Thus, the average number of trips is 1/p^2.

(e) From the independence of the four random variables,

    E(delay | X_SD = 1) = E(delay | X_SD = 1, D) P(D | X_SD = 1) + E(delay | X_SD = 1, C) P(C | X_SD = 1)
                        = (1/2)(1 + E(X_DN)) + (1/2)(E(X_SC) + E(X_CN))
                        = (1/2)(1 + q) + (1/2)(2p) = (1 + q + 2p)/2.
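Parts (a) and (b) can be verified by enumerating the route choice and the two leg delays; the values p = 1/4 and q = 1/3 are assumed example parameters.

```python
from fractions import Fraction
from itertools import product

# Enumeration check of Problem 3.4(a) and (b).
# p = 1/4 and q = 1/3 are assumed example values.
p, q = Fraction(1, 4), Fraction(1, 3)

e_delay = Fraction(0)      # E(delay)
p_two = Fraction(0)        # P(delay = 2)
p_two_and_C = Fraction(0)  # P(delay = 2 and flew through Chicago)

for route in ("C", "D"):
    r = p if route == "C" else q          # per-leg delay probability
    for leg1, leg2 in product((0, 1), repeat=2):
        pr = Fraction(1, 2) * (r if leg1 else 1 - r) * (r if leg2 else 1 - r)
        delay = leg1 + leg2
        e_delay += pr * delay
        if delay == 2:
            p_two += pr
            if route == "C":
                p_two_and_C += pr

assert e_delay == p + q                               # part (a)
assert p_two_and_C / p_two == p**2 / (p**2 + q**2)    # part (b)
print("Problem 3.4 parts (a) and (b) verified")
```

With these values, E(delay) = 7/12 and P(C | delay = 2) = 9/25, matching the closed forms.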

Problem 3.5. We transmit a bit of information which is 1 with probability p and 0 with probability 1 - p. Because of noise on the channel, each transmitted bit is received correctly with probability 1 - ε.

(a) Suppose we observe a 1 at the output. Find the conditional probability p_1 that the transmitted bit is a 1.

(b) Suppose that we transmit the same information bit n times over the channel. Calculate the probability that the information bit is a 1 given that you have observed n 1s at the output. What happens as n grows?

(c) For this part of the problem, suppose that we transmit the symbol 1 a total of n times over the channel. At the output of the channel, we observe the symbol 1 three times in the n received bits, and we observe a 1 at the n-th transmission. Given these observations, what is the probability that the k-th received bit is a 1?

(d) Going back to the situation in part (a): some unknown bit is transmitted over the channel, and the received bit is a 1. Suppose in addition that the same information bit is transmitted a second time, and you again receive another 1. We want to find a recursive formula that updates p_1 to p_2, the conditional probability that the transmitted bit is a 1 given that we have observed two 1s at the output of the channel. Show that the update can be written as

    p_2 = (1 - ε) p_1 / [(1 - ε) p_1 + ε (1 - p_1)].

Solution:

(a) Let A be the event that a 1 is transmitted, A^c the event that a 0 is transmitted, and B_n the event that the n-th received bit is a 1. Then, using the total probability law and Bayes' rule:

    p_1 = P(A | B_1) = P(B_1 | A) P(A) / [P(B_1 | A) P(A) + P(B_1 | A^c) P(A^c)]
        = (1 - ε) p / [(1 - ε) p + ε (1 - p)].

(b) As in the previous part, and using the fact that the uses of the channel are independent given the transmitted bit, we have

    P(A | B_1, ..., B_n) = (1 - ε)^n p / [(1 - ε)^n p + ε^n (1 - p)]
                         = p / [p + (ε / (1 - ε))^n (1 - p)].

As n → ∞, this converges to 1 if ε < 1/2, to p if ε = 1/2, and to 0 if ε > 1/2.

These results match intuition. When ε < 1/2, the observations are positively correlated with the transmitted bit, so each additional observed 1 increases the conditional probability of A, which tends to 1. When ε = 1/2, the observations are independent of the transmitted bit, so the observations B_1, B_2, ... do not change the conditional probability of A, which remains equal to the prior probability p. When ε > 1/2, the observations are negatively correlated with the transmitted bit, so each additional observed 1 decreases the conditional probability of A, which tends to 0.

(c) Condition on A (a 1 is transmitted every time), so the received bits are independent, each equal to 1 with probability 1 - ε. Given that exactly three of the n received bits are 1s and that the n-th received bit is a 1, the remaining two 1s are equally likely to occupy any two of the first n - 1 positions. Hence, for j < n,

    P(B_j | sum of the B_i equals 3, B_n, A) = C(n - 2, 1) / C(n - 1, 2) = 2/(n - 1),

while if j = n,

    P(B_n | sum of the B_i equals 3, B_n, A) = 1.

(d) Assuming the uses of the channel are independent given the transmitted bit, we have

    p_2 = P(A | B_1, B_2) = P(B_1, B_2 | A) P(A) / P(B_1, B_2) = (1 - ε)^2 p / P(B_1, B_2).

Since (1 - ε)^2 p = (1 - ε) p_1 P(B_1), this can be written as

    p_2 = (1 - ε) p_1 * P(B_1) / P(B_1, B_2).

By the total probability law,

    P(B_1) = (1 - ε) p + ε (1 - p),
    P(B_1, B_2) = (1 - ε)^2 p + ε^2 (1 - p),

and using p_1 = (1 - ε) p / P(B_1), so that 1 - p_1 = ε (1 - p) / P(B_1), we get

    P(B_1, B_2) / P(B_1) = (1 - ε) p_1 + ε (1 - p_1).

Therefore

    p_2 = (1 - ε) p_1 / [(1 - ε) p_1 + ε (1 - p_1)].
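The recursion in (d) can be checked numerically: applying the one-step update twice must agree with the batch posterior computed directly from both observations. The values ε = 1/5 and p = 1/3 are assumed examples.

```python
from fractions import Fraction

# Check of Problem 3.5(d): two applications of the one-step Bayes update
# agree with the batch posterior after two received 1s.
# eps = 1/5 and p = 1/3 are assumed example values.
eps, p = Fraction(1, 5), Fraction(1, 3)

def update(prior):
    """One-step posterior that the transmitted bit is 1, given a received 1."""
    return (1 - eps) * prior / ((1 - eps) * prior + eps * (1 - prior))

p1 = update(p)    # posterior after the first received 1
p2 = update(p1)   # recursive update after the second received 1

# Batch computation: condition on both observations at once.
batch = (1 - eps)**2 * p / ((1 - eps)**2 * p + eps**2 * (1 - p))
assert p2 == batch
print(p1, p2)  # 2/3 8/9
```

Each received 1 pushes the posterior up (here from 1/3 to 2/3 to 8/9), consistent with ε < 1/2.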

One might have noticed that, if we let p_0 = p, the functions used to update p_0 to p_1 (part (a)) and p_1 to p_2 are the same. Intuitively, for the first observation, p_0 is the prior probability of A before the observation, and p_1 is the posterior probability of A after the observation. Similarly, for the second observation, p_1 is the prior probability and p_2 is the posterior probability. Thus the map taking prior to posterior should be the same in both steps.

Problem 3.6. You play the lottery by choosing a set of 6 numbers from {1, 2, ..., 49} without replacement. Let X be a random variable representing the number of matches between your set and the winning set. (The order of numbers in your set and the winning set does not matter.) You win the grand prize if all 6 numbers match (i.e., if X = 6).

(a) What is the probability of winning the grand prize? Compute the PMF p_X of X.

(b) Suppose that before playing the lottery, you (illegally) wiretap the phone of the lottery and learn that 2 of the winning numbers are between 1 and 20; another 2 are between 21 and 40; and the remaining 2 are between 41 and 49. If you use this information wisely in choosing your six numbers, how does your probability of winning the grand prize improve?

(c) Now suppose instead that you determine by illegal wiretapping that the maximum number in the winning sequence is some fixed number R (note that R must be 6 or larger). If you use this information wisely in choosing your 6 numbers, how does your probability of winning the grand prize improve?

(d) Use a counting argument to establish the identity

    C(n, k) = sum from r = k to n of C(r - 1, k - 1).

Solution:

(a) Among the 49 numbers used in the lottery, only 6 correspond to the winning sequence. If the number of matches between our set and the winning set is k, then we must have selected (without replacement and without ordering) exactly k elements from the winning set of size 6 and 6 - k elements from the remaining set of 49 - 6 = 43 available numbers. Then we have that

    P(X = k) = C(6, k) C(43, 6 - k) / C(49, 6),   k = 0, 1, ..., 6,

so the probability of winning the grand prize is

    P(X = 6) = C(6, 6) C(43, 0) / C(49, 6) = 1 / C(49, 6) = 1/13,983,816.
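The PMF in part (a) is hypergeometric, so it should sum to 1 over k = 0, ..., 6; this, together with the grand-prize probability, is easy to confirm exactly:

```python
from fractions import Fraction
from math import comb

# Check of Problem 3.6(a): the hypergeometric PMF of the number of matches
# sums to 1, and the grand-prize probability is 1/C(49,6).
def p_X(k):
    return Fraction(comb(6, k) * comb(43, 6 - k), comb(49, 6))

assert sum(p_X(k) for k in range(7)) == 1
assert p_X(6) == Fraction(1, comb(49, 6))
print(comb(49, 6))  # 13983816
```

The normalization is Vandermonde's identity: the C(6,k) C(43,6-k) terms partition the C(49,6) possible tickets by their number of matches.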

(b) In order to use the given information wisely, we select (without replacement and without ordering) two numbers from the set {1, ..., 20}, two numbers from the set {21, ..., 40}, and two numbers from the set {41, ..., 49}. Among the C(20, 2) C(20, 2) C(9, 2) ways of selecting the six numbers, only one corresponds to the winning sequence. So the probability of winning is

    1 / [C(20, 2) C(20, 2) C(9, 2)] = 1/1,299,600,

which corresponds to an improvement of roughly a factor of 10 with respect to the case when no information is available.

(c) Since we know that the maximum number in the winning sequence is some number R, we include R in our set of numbers. Next, we use the information that R is the maximum number, and select the remaining 5 numbers from the set {1, ..., R - 1}. The probability of winning the grand prize becomes

    1 / C(R - 1, 5).

(d) Imagine numbering in increasing order the n elements from which we are sampling k elements, and let r denote the maximum number that we select. For a fixed r, the argument of part (c) shows that the remaining k - 1 elements can be selected in C(r - 1, k - 1) ways. Summing over all possible values of r, from k to n, gives the result.
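The counting identity of part (d), and the improvement factor claimed in part (b), can both be confirmed directly:

```python
from math import comb

# Check of the identity in part (d): classifying the k-element subsets of
# {1,...,n} by their maximum element r gives C(n,k) = sum_{r=k}^{n} C(r-1,k-1).
for n in range(1, 13):
    for k in range(1, n + 1):
        assert comb(n, k) == sum(comb(r - 1, k - 1) for r in range(k, n + 1))

# Improvement factor in part (b): full ticket space vs. wiretap-restricted space.
factor = comb(49, 6) / (comb(20, 2) ** 2 * comb(9, 2))
print(round(factor, 2))  # 10.76
```

The factor 13,983,816 / 1,299,600 is about 10.76, consistent with the "roughly a factor of 10" improvement stated above.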


Chapter 1 Review of Equations and Inequalities Chapter 1 Review of Equations and Inequalities Part I Review of Basic Equations Recall that an equation is an expression with an equal sign in the middle. Also recall that, if a question asks you to solve

More information

Expectation, inequalities and laws of large numbers

Expectation, inequalities and laws of large numbers Chapter 3 Expectation, inequalities and laws of large numbers 3. Expectation and Variance Indicator random variable Let us suppose that the event A partitions the sample space S, i.e. A A S. The indicator

More information

Discrete Distributions

Discrete Distributions A simplest example of random experiment is a coin-tossing, formally called Bernoulli trial. It happens to be the case that many useful distributions are built upon this simplest form of experiment, whose

More information

Problems and results for the ninth week Mathematics A3 for Civil Engineering students

Problems and results for the ninth week Mathematics A3 for Civil Engineering students Problems and results for the ninth week Mathematics A3 for Civil Engineering students. Production line I of a factor works 0% of time, while production line II works 70% of time, independentl of each other.

More information

Notes 12 Autumn 2005

Notes 12 Autumn 2005 MAS 08 Probability I Notes Autumn 005 Conditional random variables Remember that the conditional probability of event A given event B is P(A B) P(A B)/P(B). Suppose that X is a discrete random variable.

More information

MAT 271E Probability and Statistics

MAT 271E Probability and Statistics MAT 271E Probability and Statistics Spring 2011 Instructor : Class Meets : Office Hours : Textbook : Supp. Text : İlker Bayram EEB 1103 ibayram@itu.edu.tr 13.30 16.30, Wednesday EEB? 10.00 12.00, Wednesday

More information

Section 9.1. Expected Values of Sums

Section 9.1. Expected Values of Sums Section 9.1 Expected Values of Sums Theorem 9.1 For any set of random variables X 1,..., X n, the sum W n = X 1 + + X n has expected value E [W n ] = E [X 1 ] + E [X 2 ] + + E [X n ]. Proof: Theorem 9.1

More information

UC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probablity and Random Processes. Problem Set 8 Fall 2007

UC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probablity and Random Processes. Problem Set 8 Fall 2007 UC Berkeley Department of Electrical Engineering and Computer Science EE 6: Probablity and Random Processes Problem Set 8 Fall 007 Issued: Thursday, October 5, 007 Due: Friday, November, 007 Reading: Bertsekas

More information

Discussion 03 Solutions

Discussion 03 Solutions STAT Discussion Solutions Spring 8. A new flavor of toothpaste has been developed. It was tested by a group of people. Nine of the group said they liked the new flavor, and the remaining indicated they

More information

FINAL EXAM: Monday 8-10am

FINAL EXAM: Monday 8-10am ECE 30: Probabilistic Methods in Electrical and Computer Engineering Fall 016 Instructor: Prof. A. R. Reibman FINAL EXAM: Monday 8-10am Fall 016, TTh 3-4:15pm (December 1, 016) This is a closed book exam.

More information

Review of Probabilities and Basic Statistics

Review of Probabilities and Basic Statistics Alex Smola Barnabas Poczos TA: Ina Fiterau 4 th year PhD student MLD Review of Probabilities and Basic Statistics 10-701 Recitations 1/25/2013 Recitation 1: Statistics Intro 1 Overview Introduction to

More information

n(1 p i ) n 1 p i = 1 3 i=1 E(X i p = p i )P(p = p i ) = 1 3 p i = n 3 (p 1 + p 2 + p 3 ). p i i=1 P(X i = 1 p = p i )P(p = p i ) = p1+p2+p3

n(1 p i ) n 1 p i = 1 3 i=1 E(X i p = p i )P(p = p i ) = 1 3 p i = n 3 (p 1 + p 2 + p 3 ). p i i=1 P(X i = 1 p = p i )P(p = p i ) = p1+p2+p3 Introduction to Probability Due:August 8th, 211 Solutions of Final Exam Solve all the problems 1. (15 points) You have three coins, showing Head with probabilities p 1, p 2 and p 3. You perform two different

More information

Probability Theory for Machine Learning. Chris Cremer September 2015

Probability Theory for Machine Learning. Chris Cremer September 2015 Probability Theory for Machine Learning Chris Cremer September 2015 Outline Motivation Probability Definitions and Rules Probability Distributions MLE for Gaussian Parameter Estimation MLE and Least Squares

More information

Class 8 Review Problems 18.05, Spring 2014

Class 8 Review Problems 18.05, Spring 2014 1 Counting and Probability Class 8 Review Problems 18.05, Spring 2014 1. (a) How many ways can you arrange the letters in the word STATISTICS? (e.g. SSSTTTIIAC counts as one arrangement.) (b) If all arrangements

More information

Outline Conditional Probability The Law of Total Probability and Bayes Theorem Independent Events. Week 4 Classical Probability, Part II

Outline Conditional Probability The Law of Total Probability and Bayes Theorem Independent Events. Week 4 Classical Probability, Part II Week 4 Classical Probability, Part II Week 4 Objectives This week we continue covering topics from classical probability. The notion of conditional probability is presented first. Important results/tools

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

Discrete Random Variables

Discrete Random Variables Discrete Random Variables An Undergraduate Introduction to Financial Mathematics J. Robert Buchanan 2014 Introduction The markets can be thought of as a complex interaction of a large number of random

More information

Basic Probability and Statistics

Basic Probability and Statistics Basic Probability and Statistics Yingyu Liang yliang@cs.wisc.edu Computer Sciences Department University of Wisconsin, Madison [based on slides from Jerry Zhu, Mark Craven] slide 1 Reasoning with Uncertainty

More information

ECE313 Summer Problem Set 7. Reading: Cond. Prob., Law of total prob., Hypothesis testinng Quiz Date: Tuesday, July 3

ECE313 Summer Problem Set 7. Reading: Cond. Prob., Law of total prob., Hypothesis testinng Quiz Date: Tuesday, July 3 ECE313 Summer 2012 Problem Set 7 Reading: Cond. Prob., Law of total prob., Hypothesis testinng Quiz Date: Tuesday, July 3 Note: It is very important that you solve the problems first and check the solutions

More information

Introduction to Stochastic Processes

Introduction to Stochastic Processes Stat251/551 (Spring 2017) Stochastic Processes Lecture: 1 Introduction to Stochastic Processes Lecturer: Sahand Negahban Scribe: Sahand Negahban 1 Organization Issues We will use canvas as the course webpage.

More information

Notes on probability : Exercise problems, sections (1-7)

Notes on probability : Exercise problems, sections (1-7) Notes on probability : Exercise problems, sections (1-7) 1 Random variables 1.1 A coin is tossed until for the first time the same result appears twice in succession. To every possible outcome requiring

More information

STAT2201. Analysis of Engineering & Scientific Data. Unit 3

STAT2201. Analysis of Engineering & Scientific Data. Unit 3 STAT2201 Analysis of Engineering & Scientific Data Unit 3 Slava Vaisman The University of Queensland School of Mathematics and Physics What we learned in Unit 2 (1) We defined a sample space of a random

More information

EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2018 Kannan Ramchandran February 14, 2018.

EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2018 Kannan Ramchandran February 14, 2018. EECS 6 Probability and Random Processes University of California, Berkeley: Spring 08 Kannan Ramchandran February 4, 08 Midterm Last Name First Name SID You have 0 minutes to read the exam and 90 minutes

More information

INF FALL NATURAL LANGUAGE PROCESSING. Jan Tore Lønning

INF FALL NATURAL LANGUAGE PROCESSING. Jan Tore Lønning 1 INF4080 2018 FALL NATURAL LANGUAGE PROCESSING Jan Tore Lønning 2 Probability distributions Lecture 5, 5 September Today 3 Recap: Bayes theorem Discrete random variable Probability distribution Discrete

More information

1,1 1,2 1,3 1,4 1,5 1,6 2,1 2,2 2,3 2,4 2,5 2,6 3,1 3,2 3,3 3,4 3,5 3,6 4,1 4,2 4,3 4,4 4,5 4,6 5,1 5,2 5,3 5,4 5,5 5,6 6,1 6,2 6,3 6,4 6,5 6,6

1,1 1,2 1,3 1,4 1,5 1,6 2,1 2,2 2,3 2,4 2,5 2,6 3,1 3,2 3,3 3,4 3,5 3,6 4,1 4,2 4,3 4,4 4,5 4,6 5,1 5,2 5,3 5,4 5,5 5,6 6,1 6,2 6,3 6,4 6,5 6,6 Name: Math 4 ctivity 9(Due by EOC Dec. 6) Dear Instructor or Tutor, These problems are designed to let my students show me what they have learned and what they are capable of doing on their own. Please

More information

Chapter 2 Random Variables

Chapter 2 Random Variables Stochastic Processes Chapter 2 Random Variables Prof. Jernan Juang Dept. of Engineering Science National Cheng Kung University Prof. Chun-Hung Liu Dept. of Electrical and Computer Eng. National Chiao Tung

More information

Notes on Probability

Notes on Probability Notes on Probability Mark Schmidt January 7, 2017 1 Probabilites Consider an event A that may or may not happen. For example, if we roll a dice then we may or may not roll a 6. We use the notation p(a)

More information

EE 302 Division 1. Homework 6 Solutions.

EE 302 Division 1. Homework 6 Solutions. EE 3 Division. Homework 6 Solutions. Problem. A random variable X has probability density { C f X () e λ,,, otherwise, where λ is a positive real number. Find (a) The constant C. Solution. Because of the

More information

STAT/MA 416 Answers Homework 4 September 27, 2007 Solutions by Mark Daniel Ward PROBLEMS

STAT/MA 416 Answers Homework 4 September 27, 2007 Solutions by Mark Daniel Ward PROBLEMS STAT/MA 416 Answers Homework 4 September 27, 2007 Solutions by Mark Daniel Ward PROBLEMS 2. We ust examine the 36 possible products of two dice. We see that 1/36 for i = 1, 9, 16, 25, 36 2/36 for i = 2,

More information

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1). Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent

More information

CS 188: Artificial Intelligence. Bayes Nets

CS 188: Artificial Intelligence. Bayes Nets CS 188: Artificial Intelligence Probabilistic Inference: Enumeration, Variable Elimination, Sampling Pieter Abbeel UC Berkeley Many slides over this course adapted from Dan Klein, Stuart Russell, Andrew

More information

Brief Review of Probability

Brief Review of Probability Brief Review of Probability Nuno Vasconcelos (Ken Kreutz-Delgado) ECE Department, UCSD Probability Probability theory is a mathematical language to deal with processes or experiments that are non-deterministic

More information

Econ 325: Introduction to Empirical Economics

Econ 325: Introduction to Empirical Economics Econ 325: Introduction to Empirical Economics Lecture 2 Probability Copyright 2010 Pearson Education, Inc. Publishing as Prentice Hall Ch. 3-1 3.1 Definition Random Experiment a process leading to an uncertain

More information

Midterm #1. Lecture 10: Joint Distributions and the Law of Large Numbers. Joint Distributions - Example, cont. Joint Distributions - Example

Midterm #1. Lecture 10: Joint Distributions and the Law of Large Numbers. Joint Distributions - Example, cont. Joint Distributions - Example Midterm #1 Midterm 1 Lecture 10: and the Law of Large Numbers Statistics 104 Colin Rundel February 0, 01 Exam will be passed back at the end of class Exam was hard, on the whole the class did well: Mean:

More information

The expected value E[X] of discrete random variable X is defined by. xp X (x), (6.1) E[X] =

The expected value E[X] of discrete random variable X is defined by. xp X (x), (6.1) E[X] = Chapter 6 Meeting Expectations When a large collection of data is gathered, one is typically interested not necessarily in every individual data point, but rather in certain descriptive quantities such

More information

Question Paper Code : AEC11T03

Question Paper Code : AEC11T03 Hall Ticket No Question Paper Code : AEC11T03 VARDHAMAN COLLEGE OF ENGINEERING (AUTONOMOUS) Affiliated to JNTUH, Hyderabad Four Year B Tech III Semester Tutorial Question Bank 2013-14 (Regulations: VCE-R11)

More information

Class 8 Review Problems solutions, 18.05, Spring 2014

Class 8 Review Problems solutions, 18.05, Spring 2014 Class 8 Review Problems solutions, 8.5, Spring 4 Counting and Probability. (a) Create an arrangement in stages and count the number of possibilities at each stage: ( ) Stage : Choose three of the slots

More information

Bayesian RL Seminar. Chris Mansley September 9, 2008

Bayesian RL Seminar. Chris Mansley September 9, 2008 Bayesian RL Seminar Chris Mansley September 9, 2008 Bayes Basic Probability One of the basic principles of probability theory, the chain rule, will allow us to derive most of the background material in

More information

UC Berkeley Department of Electrical Engineering and Computer Science Department of Statistics. EECS 281A / STAT 241A Statistical Learning Theory

UC Berkeley Department of Electrical Engineering and Computer Science Department of Statistics. EECS 281A / STAT 241A Statistical Learning Theory UC Berkeley Department of Electrical Engineering and Computer Science Department of Statistics EECS 281A / STAT 241A Statistical Learning Theory Solutions to Problem Set 1 Fall 2011 Issued: Thurs, September

More information

Review: Probability. BM1: Advanced Natural Language Processing. University of Potsdam. Tatjana Scheffler

Review: Probability. BM1: Advanced Natural Language Processing. University of Potsdam. Tatjana Scheffler Review: Probability BM1: Advanced Natural Language Processing University of Potsdam Tatjana Scheffler tatjana.scheffler@uni-potsdam.de October 21, 2016 Today probability random variables Bayes rule expectation

More information

UC Berkeley, CS 174: Combinatorics and Discrete Probability (Fall 2008) Midterm 1. October 7, 2008

UC Berkeley, CS 174: Combinatorics and Discrete Probability (Fall 2008) Midterm 1. October 7, 2008 UC Berkeley, CS 74: Combinatorics and Discrete Probability (Fall 2008) Midterm Instructor: Prof. Yun S. Song October 7, 2008 Your Name : Student ID# : Read these instructions carefully:. This is a closed-book

More information

Chapter 2.5 Random Variables and Probability The Modern View (cont.)

Chapter 2.5 Random Variables and Probability The Modern View (cont.) Chapter 2.5 Random Variables and Probability The Modern View (cont.) I. Statistical Independence A crucially important idea in probability and statistics is the concept of statistical independence. Suppose

More information

Part (A): Review of Probability [Statistics I revision]

Part (A): Review of Probability [Statistics I revision] Part (A): Review of Probability [Statistics I revision] 1 Definition of Probability 1.1 Experiment An experiment is any procedure whose outcome is uncertain ffl toss a coin ffl throw a die ffl buy a lottery

More information

Week 04 Discussion. a) What is the probability that of those selected for the in-depth interview 4 liked the new flavor and 1 did not?

Week 04 Discussion. a) What is the probability that of those selected for the in-depth interview 4 liked the new flavor and 1 did not? STAT Wee Discussion Fall 7. A new flavor of toothpaste has been developed. It was tested by a group of people. Nine of the group said they lied the new flavor, and the remaining 6 indicated they did not.

More information

Review of probability

Review of probability Review of probability Computer Sciences 760 Spring 2014 http://pages.cs.wisc.edu/~dpage/cs760/ Goals for the lecture you should understand the following concepts definition of probability random variables

More information

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY GRADUATE DIPLOMA, 2016 MODULE 1 : Probability distributions Time allowed: Three hours Candidates should answer FIVE questions. All questions carry equal marks.

More information

Midterm Exam 1 (Solutions)

Midterm Exam 1 (Solutions) EECS 6 Probability and Random Processes University of California, Berkeley: Spring 07 Kannan Ramchandran February 3, 07 Midterm Exam (Solutions) Last name First name SID Name of student on your left: Name

More information

Probability theory basics

Probability theory basics Probability theory basics Michael Franke Basics of probability theory: axiomatic definition, interpretation, joint distributions, marginalization, conditional probability & Bayes rule. Random variables:

More information

Introduction to Bayesian Learning. Machine Learning Fall 2018

Introduction to Bayesian Learning. Machine Learning Fall 2018 Introduction to Bayesian Learning Machine Learning Fall 2018 1 What we have seen so far What does it mean to learn? Mistake-driven learning Learning by counting (and bounding) number of mistakes PAC learnability

More information