ECE 313: Hour Exam I

University of Illinois, Fall 2017
ECE 313: Hour Exam I
Wednesday, October 2017, 8:45 p.m.–10:00 p.m.

1. Suppose that a fair coin is independently tossed twice. Define the events:
A = { The first toss is a head },
B = { The second toss is a head },
C = { Both tosses yield the same outcome }.
Are A, B, C independent? Please show your work.

The sample space is Ω = {HH, HT, TH, TT}. Also, A = {HH, HT}, B = {HH, TH} and C = {HH, TT}. Since the coin is fair, P(A) = P(B) = P(C) = 2/4 = 1/2. We have A ∩ B = {HH}, A ∩ C = {HH} and B ∩ C = {HH}. Therefore,
P(A ∩ B) = P(A ∩ C) = P(B ∩ C) = 1/4 = P(A)P(B) = P(A)P(C) = P(B)P(C),
i.e., A, B, C are pairwise independent. However, A ∩ B ∩ C = {HH}, so
P(A ∩ B ∩ C) = 1/4 ≠ P(A)P(B)P(C) = 1/8.
Therefore, A, B and C are not independent.

2. Suppose a coin is bent or fair with equal probability. In particular, let the probability of showing heads be p; then P(p = 0.5) = P(p = 0.8) = 0.5. We toss the coin twice.

(a) Find the probability of getting two heads.
P(2 heads) = 0.5 · 0.5 · 0.5 + 0.5 · 0.8 · 0.8 = 0.445.

(b) We will decide whether the coin is fair based on the number of heads out of two tosses. How many different decision rules are there? (Hint: the decision rule does not need to be MAP or ML.)
There are a total of three possible observations: 0, 1, or 2 heads. Hence there are 2^3 = 8 different decision rules.
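As a quick numerical cross-check of the pairwise-versus-mutual independence argument in Problem 1, the following short sketch (plain Python, standard library only; not part of the original solution) enumerates the four equally likely outcomes and recomputes the probabilities from the event definitions.

```python
from itertools import product

# Sample space of two fair coin tosses; each outcome has probability 1/4.
omega = list(product("HT", repeat=2))
prob = {w: 1/4 for w in omega}

A = {w for w in omega if w[0] == "H"}   # first toss is a head
B = {w for w in omega if w[1] == "H"}   # second toss is a head
C = {w for w in omega if w[0] == w[1]}  # both tosses agree

def P(event):
    return sum(prob[w] for w in event)

# Pairwise independence: P(X ∩ Y) equals P(X) P(Y) for every pair.
pairs = [(A, B), (A, C), (B, C)]
print(all(abs(P(x & y) - P(x) * P(y)) < 1e-12 for x, y in pairs))  # True

# Mutual independence fails: P(A ∩ B ∩ C) = 1/4 while P(A)P(B)P(C) = 1/8.
print(P(A & B & C), P(A) * P(B) * P(C))  # 0.25 0.125
```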

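The mixture probability and the decision-rule count in Problem 2 can be checked the same way; the sketch below conditions on the coin type for part (a) and, for part (b), counts the functions from the three possible head counts {0, 1, 2} to the two decisions.

```python
from itertools import product

# (a) Condition on the coin type: P(2 heads) = 0.5 * 0.5**2 + 0.5 * 0.8**2.
p_two_heads = 0.5 * 0.5**2 + 0.5 * 0.8**2
print(p_two_heads)  # 0.445

# (b) A decision rule maps each observation (0, 1, or 2 heads) to a decision
# ("fair" or "bent"); there are 2**3 = 8 such maps.
rules = list(product(["fair", "bent"], repeat=3))
print(len(rules))   # 8
```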
3. A machine produces computer chips that are either good (90%), slightly defective (2%), or obviously defective (8%). Produced chips get passed through quality control equipment, which is able to detect any chip that is obviously defective and discard it. What is the probability that a chip is good given that it made it through the quality control equipment?

Define the events:
G = { Good chip }, SD = { Slightly defective chip }, OD = { Obviously defective chip }.
By assumption, P(G) = 0.9, P(SD) = 0.02, P(OD) = 0.08. The probability that a chip is good, given that it passed the quality control (i.e., that it is not obviously defective), is
P(G | OD^c) = P(G ∩ OD^c) / P(OD^c) = P(G) / (1 − P(OD)) = 0.9 / 0.92 = 45/46.

4. Consider a computational creativity system like MasterProbo that can generate an infinite sequence of questions. A student uses the system and answers questions correctly with probability 4/5, independently of all other questions.

(a) What is the probability the student answers exactly two of the next five questions incorrectly?
The situation is well described by a Bernoulli process with parameter 4/5. The probability the student answers exactly two of the next five questions incorrectly equals the probability the student answers exactly three of the next five questions correctly, and is governed by the binomial pmf with n = 5 and p = 4/5, evaluated at k = 3:
C(5,3) (4/5)^3 (1/5)^2 = 10 · (64/125) · (1/25) = 640/3125 = 128/625.

(b) What is the probability that the number of questions the student needs to get three questions correct is five?
In a Bernoulli process, the cumulative number of trials needed for r = 3 successes is governed by the negative binomial distribution, here with success probability p = 4/5, evaluated at n = 5:
C(4,2) (4/5)^3 (1/5)^2 = 6 · (64/125) · (1/25) = 384/3125.

5. Ziziphus mauritiana, also known as Chinese date, ber, Chinee apple, jujube, Indian plum, Regi pandu, and Indian jujube, is a tropical fruit tree species belonging to the family Rhamnaceae. Some of its fruits are sweet and some are bitter (which we assume to be independent of one another); it is very difficult to tell without tasting. The probability of sweet is p and the probability of bitter is 1 − p. A forest dweller eats a very long sequence of fruits.

(a) What is the probability the time until the first sweet fruit is six?
The situation is well described by a Bernoulli process with parameter p. The time until the first success is governed by a geometric random variable, here evaluated at k = 6: (1 − p)^5 p.
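The fractions in Problems 3 and 4 can be reproduced exactly with rational arithmetic; the sketch below uses only the Python standard library (fractions and math.comb) and simply re-evaluates the same expressions.

```python
from fractions import Fraction
from math import comb

# Problem 3: P(G | OD^c) = P(G) / (1 - P(OD)).
print(Fraction(90, 100) / (1 - Fraction(8, 100)))  # 45/46

p = Fraction(4, 5)  # probability of a correct answer in Problem 4

# (a) Exactly two incorrect out of five = exactly three correct out of five:
# binomial pmf with n = 5 and k = 3.
part_a = comb(5, 3) * p**3 * (1 - p)**2
print(part_a)  # 128/625

# (b) Third correct answer arrives on the fifth question:
# negative binomial pmf with r = 3, evaluated at n = 5 total trials.
part_b = comb(4, 2) * p**3 * (1 - p)**2
print(part_b)  # 384/3125
```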

(b) What is the expected number of sweet ones the forest dweller will get before he/she has gotten three bitter ones?
Note that here we are only concerned with the mean, rather than evaluation of the pmf. Let us think of bitter fruits as successes, each occurring with probability 1 − p. In considering the cumulative number of trials until getting three bitter ones, we are in a situation described by a negative binomial random variable with r = 3 and success parameter 1 − p. Its mean is r/(1 − p), but we must subtract r from this to get the expected number of sweet ones:
r/(1 − p) − r = 3/(1 − p) − 3.

6. Consider two hypotheses, H1 and H0. If H1 is true, the observation X has the distribution P(X = 1) = 0.2 and P(X = 2) = 0.8. If H0 is true, P(X = 1) = P(X = 2) = 0.5.

(a) Specify the ML decision rule given the observation X. What is the miss probability?

      X = 1   X = 2
H1    0.2     0.8
H0    0.5     0.5
Λ     0.4     1.6

For the ML decision rule, the threshold on the likelihood ratio Λ is 1. Hence if X = 1, declare H0; otherwise, declare H1. Then
p_miss = P(declare H0 | H1 is true) = P(X = 1 | H1 is true) = 0.2.

(b) Given the prior distribution where H1 is true with probability π1 = 0.8, specify the MAP decision rule given the observation X. What is the miss probability?
The MAP threshold for the likelihood ratio is π0/π1 = 0.25. Since both values of Λ exceed 0.25, we always declare H1, and
p_miss = P(declare H0 | H1 is true) = 0.

(c) Given the same prior distribution as in part (b), suppose we have two independent observations of X. What is the new MAP decision rule based on the two observations?

      (1,1)   (1,2)   (2,1)   (2,2)
H1    0.04    0.16    0.16    0.64
H0    0.25    0.25    0.25    0.25
Λ     0.16    0.64    0.64    2.56

The MAP threshold remains unchanged at 0.25. Hence we declare H0 when we observe (1,1); otherwise we declare H1.

7. Three prisoners A, B and C are sentenced to death. The governor, however, has selected one of them to be pardoned, each with equal probability. The warden knows who is to be pardoned, but is not allowed to tell. Instead, the warden agrees to tell A one name that is not to be pardoned, as follows:
If B is to be pardoned, the warden gives C's name.
If C is to be pardoned, the warden gives B's name.
If A is to be pardoned, the warden gives B's or C's name with equal probability.
Find the conditional probability that A is to be pardoned given that the warden gives B's name.

By Bayes' rule,
P(A pardoned | B's name given) = P(B's name given | A pardoned) P(A pardoned) / P(B's name given).
We have P(B's name given | A pardoned) = 1/2 and P(A pardoned) = 1/3. By symmetry, P(B's name given) = 1/2. So
P(A pardoned | B's name given) = (1/2 · 1/3) / (1/2) = 1/3.

8. In basketball games, players make three-point field goal attempts, and some of the attempts result in successful three-point field goals. Assume each attempt by a player is successful independently with the same probability, called the three-pointer shooting percentage. Different players may have different three-pointer shooting percentages.

(a) Alice has a three-pointer shooting percentage of 0.4 and successfully made one three-point field goal in a game. The number of three-point field goal attempts by Alice in this game is known to be less than or equal to three, but is otherwise unknown. Give a maximum likelihood estimate of the number of her three-point field goal attempts.
The number of successfully made three-point field goals by Alice follows a binomial distribution with parameters n and p, where n is the number of attempts and p = 0.4 is the shooting percentage. The likelihood of successfully making one out of n attempts is
L(n) = C(n,1) p (1 − p)^(n−1) = n p (1 − p)^(n−1).
So the likelihoods for n = 1, 2, 3 are
L(1) = p = 0.4, L(2) = 2p(1 − p) = 0.48, L(3) = 3p(1 − p)^2 = 0.432.
Then n̂_ML = 2 is the ML estimate of the number of three-point field goal attempts by Alice in the game.

(b) Let Bob's three-pointer shooting percentage p be unknown. For practice, Bob attempts n three-point field goals and successfully makes X of them. Let his three-pointer shooting percentage be estimated by p̂ = X/n. If p is to be estimated within 0.05 (i.e., by an interval estimate of length 0.1) with 75% confidence, find the minimum value of n based on
P{ p ∈ ( p̂ − a/(2√n), p̂ + a/(2√n) ) } ≥ 1 − 1/a^2,
where a can be any positive number.
For 75% confidence, a should satisfy 1 − 1/a^2 = 0.75, i.e., a = 2. Then setting a/(2√n) = 0.05 gives √n = 20, i.e., n = 400.
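Both parts of Problem 8 reduce to a one-line computation each; the sketch below (standard-library Python, the variable names are my own) maximizes the likelihood over the admissible attempt counts and then solves the confidence-width condition for n.

```python
from math import comb

p = 0.4  # Alice's three-point shooting percentage

# (a) Likelihood of exactly one make in n attempts: L(n) = n * p * (1 - p)**(n - 1),
# evaluated over the admissible attempt counts n = 1, 2, 3.
likelihoods = {n: comb(n, 1) * p * (1 - p) ** (n - 1) for n in (1, 2, 3)}
print(likelihoods)                             # approx {1: 0.4, 2: 0.48, 3: 0.432}
print(max(likelihoods, key=likelihoods.get))   # 2 -> ML estimate of the attempts

# (b) Confidence 1 - 1/a**2 = 0.75 gives a = 2; the half-width a / (2 * sqrt(n))
# must equal 0.05, so sqrt(n) = 20 and n = 400.
a = 2.0
half_width = 0.05
n_min = (a / (2 * half_width)) ** 2
print(round(n_min))                            # 400
```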

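Looking back at Problem 6, the likelihood-ratio table and the ML/MAP decisions can also be recomputed directly from the two conditional pmfs; the sketch below treats the two-observation case of part (c) by multiplying the per-sample likelihood ratios.

```python
# Conditional pmfs of the observation X under the two hypotheses.
p1 = {1: 0.2, 2: 0.8}  # P(X = x | H1)
p0 = {1: 0.5, 2: 0.5}  # P(X = x | H0)

# (a) ML rule: declare H1 when the likelihood ratio Lambda(x) exceeds 1.
lam = {x: p1[x] / p0[x] for x in (1, 2)}
ml_rule = {x: "H1" if lam[x] > 1 else "H0" for x in (1, 2)}
print(lam, ml_rule)  # {1: 0.4, 2: 1.6} {1: 'H0', 2: 'H1'}
p_miss_ml = sum(p1[x] for x in (1, 2) if ml_rule[x] == "H0")
print(p_miss_ml)     # 0.2

# (b, c) MAP rule with pi1 = 0.8: threshold pi0 / pi1 = 0.25; with two independent
# observations the joint likelihood ratio is the product of the per-sample ratios.
threshold = 0.2 / 0.8
map_two = {(x, y): "H1" if lam[x] * lam[y] > threshold else "H0"
           for x in (1, 2) for y in (1, 2)}
print(map_two)       # only (1, 1) maps to 'H0'
```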
9. Suppose you want to divide a 52-card deck into four hands with 13 cards each. What is the probability that each hand has a king? (Hint: all 52 cards are distinct and there are only four kings among the cards.)

Let H_i be the event that the i-th hand has exactly one king. We have the conditional probabilities
P(H1) = C(4,1) C(48,12) / C(52,13),
P(H2 | H1) = C(3,1) C(36,12) / C(39,13),
P(H3 | H1 ∩ H2) = C(2,1) C(24,12) / C(26,13),
P(H4 | H1 ∩ H2 ∩ H3) = 1.
Therefore,
P(H1 ∩ H2 ∩ H3 ∩ H4) = P(H4 | H1 ∩ H2 ∩ H3) P(H3 | H1 ∩ H2) P(H2 | H1) P(H1)
= 4! · C(48,12) C(36,12) C(24,12) / [ C(52,13) C(39,13) C(26,13) ]
= 4! · 48! · 13^4 / 52!
= 13^3 / (17 · 25 · 49) = 2197/20825 ≈ 0.1055.
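The chain of conditional probabilities in Problem 9 is easy to verify with exact rational arithmetic; the sketch below (standard-library Python, with a small helper C for binomial coefficients) multiplies out the same four factors.

```python
from fractions import Fraction
from math import comb

def C(n, k):
    # Binomial coefficient as an exact Fraction, so the final product stays exact.
    return Fraction(comb(n, k))

# Hand i receives exactly one of the remaining kings, conditioned on the earlier hands.
p = (C(4, 1) * C(48, 12) / C(52, 13)
     * C(3, 1) * C(36, 12) / C(39, 13)
     * C(2, 1) * C(24, 12) / C(26, 13)
     * C(1, 1) * C(12, 12) / C(13, 13))
print(p, float(p))  # 2197/20825, approximately 0.1055
```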