180A MIDTERM 2


1. Recall the (somewhat strange) person from the first midterm who repeatedly flips a fair coin, taking a step forward when it lands head up and taking a step back when it lands tail up. Suppose this person flips the coin n times. Let X be a random variable indicating the position of the walker after n steps.

a. [5 points] What is E[X]?

Let H be the number of times the coin lands head up and let T be the number of times it lands tail up. Then X = H − T, so E[X] = E[H − T] = E[H] − E[T] = n/2 − n/2 = 0. Here E[H] = n/2 since H is a binomial(n, 1/2) random variable, and so is T.

b. [5 points] What is Var[X]?

Notice that since H + T = n, X = H − (n − H) = 2H − n. Thus Var[X] = Var[2H − n] = 4 Var[H] = 4 · n(1/2)(1/2) = n, again using the fact that H is a binomial(n, 1/2) random variable.

c. [5 points] What is the probability distribution of X?

P(X = x) = P(2H − n = x) = P(H = (x + n)/2) = C(n, (x + n)/2) / 2^n if x + n is even; 0 otherwise.

d. [10 points] For n = 10,000, estimate the probability that the walker ends up more than 100 steps from the starting position.

From part (b), SD[X] = √n = 100. Since n is large and the probabilities in (c) are given by a binomial distribution, they are well approximated by a normal distribution with mean 0 and standard deviation 100. The probability of being within 1 standard deviation of the mean is approximately 68% for a normal distribution, so the probability of being more than 100 steps from the starting position is approximately 32%.

2. The first three picks in the NBA draft are determined by a lottery: fourteen balls labeled 1 to 14 are placed in an urn and four balls are drawn, without replacement. 250 of the possible combinations are assigned to the team with the worst season record, 199 to the team with the second worst season record, etc. One combination, (11, 12, 13, 14), is not assigned to any team; if it is drawn, the balls are returned to the urn and the process is repeated. The team to which the combination of numbers drawn is assigned gets the first pick in the draft.

a. [5 points] How many different combinations of numbers can be drawn?

C(14, 4) = (14 · 13 · 12 · 11)/(1 · 2 · 3 · 4) = 7 · 13 · 11 = 1001.

b. [10 points] What is the probability that the team with the second worst season record gets the first pick in the draft?

For the second worst team to get the first pick, one of its 199 combinations must be drawn, out of the 1001 − 1 = 1000 combinations that are assigned to teams. So the probability is 199/1000 = 0.199.

c. [10 points] After the first pick is determined, the balls are returned to the urn, and the process is repeated. If a new team (i.e., not the one that won the first pick) wins, it gets the second pick; otherwise the balls are returned to the urn and the process is repeated. What is the probability that the team with the worst season record gets the second pick in the draft?

For this team to get the second pick, one of the other teams must win the first pick, and then this team must win the second pick. Let the number of combinations assigned to the team with the i-th worst record be c_i, and let n be the number of teams in the lottery. Since the events that the team with the i-th worst record wins the first pick are disjoint, the probabilities add, so the probability that the worst team gets the second pick is:

Σ_{i=2}^{n} (c_i / 1000) · (250 / (1000 − c_i)).
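Parts (a)–(c) can be checked numerically. Note an assumption: the problem only specifies c_1 = 250 and c_2 = 199, so the remaining weights in the list below are the historical 14-team NBA lottery values, used purely to produce a concrete number for part (c).

```python
from math import comb

# Part (a): total combinations of 4 balls drawn from 14.
total = comb(14, 4)  # 1001; one combination is unassigned, leaving 1000

# Combinations per team, worst record first. Only the first two values
# appear in the problem; the rest are the historical NBA weights
# (an assumption for illustration). They must sum to 1000.
c = [250, 199, 156, 119, 88, 63, 43, 28, 17, 11, 8, 7, 6, 5]
assert sum(c) == 1000

# Part (b): second worst team wins the first pick.
p_first = c[1] / 1000  # 0.199

# Part (c): some other team i wins the first pick (prob c_i/1000), then
# the worst team wins the redraw among the 1000 - c_i remaining
# assigned combinations.
p_second = sum(ci / 1000 * c[0] / (1000 - ci) for ci in c[1:])
```

With these weights the worst team's chance at the second pick comes out a bit above its 0.25 chance at the first pick divided among the other outcomes, roughly 0.215.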

3. A gambler offers to pay you 2^n dollars if you flip a fair coin until it lands head up, and it does so on the n-th flip.

a. [15 points] What is the number of dollars you expect to be paid if you play this game?

Let W = 2^N be the number of dollars the gambler pays you, where N is the number of times you flip the coin until it lands head up. Then E[W] = Σ_{n=1}^∞ 2^n P(N = n) = Σ_{n=1}^∞ 2^n · 2^(−n) = Σ_{n=1}^∞ 1 = ∞.

b. [10 points] How much would you pay to play this game? Explain your answer.

I accepted several different answers for this problem. The most conservative strategy is to pay no more than $2 to play, since then you are assured of not losing money, even if the first head appears on the first flip. Less risk-averse players could pay any finite amount of money to play this game and still have a positive expected net payoff.

The traditional analysis of this St. Petersburg paradox introduces the notion of utility, which is an increasing function u : R → R with u'' < 0, e.g., u(w) = log w. This is supposed to capture the idea that if you have only $1,000, another $1,000 has larger utility than if you start with $1,000,000. The argument then is that it is E[u(W)] that is relevant, not E[W]. Using the logarithm makes the infinite sum defining the expected utility finite in this problem. This is not completely satisfactory, since for any increasing utility function there is some set of winnings that makes the expected utility infinite.
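Both halves of the argument can be seen numerically: every term of the sum defining E[W] contributes exactly 1 dollar, so the partial sums grow without bound, while the corresponding sum for the log utility converges to 2 log 2.

```python
import math

def truncated_expected_payout(k):
    """Partial sum of E[W]: sum over n = 1..k of 2^n * P(N = n).
    Each term is 2^n * 2^-n = 1, so the partial sum is exactly k."""
    return sum(2**n * 2**-n for n in range(1, k + 1))

def expected_log_utility(k=200):
    """E[log W] = sum over n of log(2^n) * 2^-n = log(2) * sum n * 2^-n,
    which converges to 2 * log(2), a finite expected utility."""
    return sum(n * math.log(2) * 2**-n for n in range(1, k + 1))
```

Truncating the game at k flips caps the expected payout at k dollars, which is one way to see why no finite price is "obviously" too high under the raw expectation, yet the log-utility value of the game is only about $4 (since e^(2 log 2) = 4).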

4. There is a 500 m × 1000 m plot of land on Barro Colorado Island in Panama where a careful census has been made of the trees. The locations of trees of one common species, Alseis blackiana, are indicated by dots in the graphic below, where the size of each dot represents the size of the tree.

a. [10 points] Do you think the locations of these trees are a Poisson scatter? Why or why not?

Looking at the graphic, there appear to be regions with greater and lesser densities of trees, which suggests that their locations are not a Poisson scatter. I also accepted the observation that since a tree cannot be located within the trunk of another tree, there is not equal probability of a point being located in every region of the same area, so the locations do not satisfy the defining condition of a Poisson scatter. Or you could postpone answering this part until after part (b).

b. [15 points] There are 7599 dots in this graphic. If we partition the plot into four congruent rectangles, by dividing it in half vertically and horizontally, the number of dots in each rectangle is 2115, 1950, 1708 and 1826, moving counterclockwise from the northeast rectangle. What is the probability of observing this distribution if the tree locations are a Poisson scatter?

If this were a Poisson scatter, the intensity per area of size 250 m × 500 m would be approximately 7599/4 ≈ 1900, and the numbers of dots per rectangle would be i.i.d. Poisson random variables, so the probability of observing these numbers of trees would be approximately:

(1900^2115 / 2115!) e^(−1900) · (1900^1950 / 1950!) e^(−1900) · (1900^1708 / 1708!) e^(−1900) · (1900^1826 / 1826!) e^(−1900).

This is a really, really, really small number, since the numbers of dots in the rectangles are not close to 1900, measured in units of the standard deviation, √1900 ≈ 44.
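The product in (b) underflows ordinary floating point, so a numerical check has to work in log space. A sketch using log-factorials via the gamma function (it uses the exact intensity 7599/4 = 1899.75 rather than the rounded 1900):

```python
import math

counts = [2115, 1950, 1708, 1826]
lam = sum(counts) / 4  # 1899.75, the fitted Poisson intensity per rectangle

# log of the product of four independent Poisson(lam) pmfs:
# log P = sum over rectangles of [k*log(lam) - lam - log(k!)],
# with log(k!) = lgamma(k + 1).
log_p = sum(k * math.log(lam) - lam - math.lgamma(k + 1) for k in counts)
log10_p = log_p / math.log(10)  # around -18.5, i.e. P is near 10**-18.5
```

So "really, really, really small" is roughly 10^(−18): far too small to represent directly, but easy to quantify in log space.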

5. [Extra credit] In this problem "simulate" means "do an experiment with outcomes having the same probabilities as the outcomes of" the thing being simulated.

a. [5 points] Suppose you have a coin with unknown bias. How can you simulate the flip of a fair coin?

Flip the coin twice. If the sequence of outcomes is head-tail, call it "Head"; if it is tail-head, call it "Tail". If it is neither of these, repeat. If the coin lands head up with probability p and tail up with probability q = 1 − p, then the probability of head-tail is pq, as is the probability of tail-head, so the events we are calling "Head" and "Tail" occur equally often, and this simulates a fair coin flip.

b. [10 points] Suppose you have a fair coin. How can you simulate the flip of a biased coin that lands head up with probability 1/3? [Hint: Think about conditional probabilities.]

Flip the coin three times. If it never lands head up, or it lands head up more than once, repeat. If the single head occurs on the first of the three flips, call the outcome "Head"; call it "Tail" if the head occurs on the second or third flip. Since each of the three positions of the single head has the same probability, the probability of "Head" is 1/3.
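Both constructions are easy to simulate and check empirically. In the sketch below, the bias 0.7 for part (a) is an arbitrary choice for the demonstration; the construction works for any bias strictly between 0 and 1.

```python
import random

def fair_flip(biased_coin, rng):
    """Von Neumann's trick: flip twice, keep only HT ('Head') and
    TH ('Tail'), which occur with equal probability pq."""
    while True:
        a, b = biased_coin(rng), biased_coin(rng)
        if a != b:
            return a

def one_third_flip(rng):
    """Flip a fair coin three times and condition on exactly one head;
    'Head' iff the single head came first, which has probability 1/3."""
    while True:
        flips = [rng.random() < 0.5 for _ in range(3)]
        if sum(flips) == 1:
            return flips[0]

def biased_coin(r):
    return r.random() < 0.7  # arbitrary demo bias; any p in (0, 1) works

rng = random.Random(1)
n = 30_000
fair_rate = sum(fair_flip(biased_coin, rng) for _ in range(n)) / n
third_rate = sum(one_third_flip(rng) for _ in range(n)) / n
# fair_rate is near 1/2, third_rate near 1/3
```

The rejection loops terminate quickly in both cases: each round of part (a) succeeds with probability 2pq, and each round of part (b) with probability 3/8.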