Please simplify your answers to the extent reasonable without a calculator, show your work, and explain your answers, concisely. If you set up an integral or a sum that you cannot evaluate, leave it as it is; and if the result is needed for the next part, say how you would use the result if you had it.

1. Suppose Bob is trying to guess a specific natural number x* ∈ {1,...,N}. On his first guess he chooses a number X_0 uniformly at random. For t ∈ N, if X_t = x* the game ends; if X_t ≠ x*, he guesses X_{t+1} uniformly at random from among the numbers different from X_t.

a. [5 points] What is the expected number of guesses it takes Bob to find x*?

We can describe this as a Markov process on two states, x* and x ∈ {1,...,N}\{x*}, with transition probability matrix

          x*         x
    x* (  1          0            )
    x  (  1/(N-1)    (N-2)/(N-1)  ).

Let ν be the expected number of steps it takes to reach x* from x. Then from P we have

    ν = 1 + ((N-2)/(N-1)) ν,

which we can solve to find ν = N-1. Then the expected number of guesses it takes Bob to find x* is

    (1/N)(0+1) + ((N-1)/N)(N-1+1) = N - 1 + 1/N.

b. [5 points] Suppose Bob has a bad memory and can't remember the number he guessed previously, so that he guesses X_{t+1} uniformly at random from among {1,...,N}. In this case what is the expected number of guesses it takes him to find x*?

Bob's bad memory changes the transition probability matrix to

          x*      x
    x* (  1       0        )
    x  (  1/N     (N-1)/N  ).

Now

    ν = 1 + ((N-1)/N) ν,

so ν = N and the expected number of guesses is

    (1/N)(0+1) + ((N-1)/N)(N+1) = N.
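The expected values in parts (a) and (b) can be sanity-checked with a short Monte Carlo simulation. This is only an illustrative sketch, not part of the solution; the choices N = 10, the seed, and the trial count are arbitrary:

```python
import random

def guesses(N, remembers_last):
    """Simulate Bob guessing a fixed target in {1, ..., N}.

    remembers_last=True  -> part (a): never repeat the previous guess.
    remembers_last=False -> part (b): guess uniformly from all N numbers.
    """
    target = 1  # by symmetry the identity of the target does not matter
    g = random.randrange(1, N + 1)
    count = 1
    while g != target:
        if remembers_last:
            g = random.choice([k for k in range(1, N + 1) if k != g])
        else:
            g = random.randrange(1, N + 1)
        count += 1
    return count

random.seed(0)
N, trials = 10, 200_000
mean_a = sum(guesses(N, True) for _ in range(trials)) / trials
mean_b = sum(guesses(N, False) for _ in range(trials)) / trials
print(mean_a)  # theory: N - 1 + 1/N = 9.1
print(mean_b)  # theory: N = 10
```

With N = 10 the two sample means should land near 9.1 and 10, matching N - 1 + 1/N and N.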
c. [5 points] Suppose Bob has a good memory, and at each step guesses uniformly at random among the numbers he has not guessed at any previous step. Now what is the expected number of guesses it takes him to find x*?

Bob has equal probability 1/N of guessing correctly on each of guesses 1 through N, so the expected number of guesses in this case is

    Σ_{k=1}^{N} k (1/N) = (1/N) N(N+1)/2 = (N+1)/2.

2. [15 points] n ∈ N is a prime number if its only divisors are 1 and itself. The Prime Number Theorem says that the primes are distributed approximately as if they came from an inhomogeneous Poisson process P(x) with intensity λ(x) = 1/ln x. [We have to say "approximately" since (1) the primes are integers, not general real numbers, and (2) the primes take determined, not random, values.] Use this theorem to estimate the number of primes in the interval [2,N].

Since P(x) is an inhomogeneous Poisson process, the expected number of events in the interval [2,N] is

    E[P(N)] - E[P(2)] = ∫_2^N (1/ln x) dx.

This estimates the number of primes in this interval, and is approximately N/ln N.

3. Suppose we flip a fair coin repeatedly.

a. [5 points] What is the expected number of flips until we see Head followed by Tail?

We can think of this as a Markov process on three states: H, meaning the most recent flip was Head; HT, meaning the most recent two flips were Head then Tail; and 0, meaning anything else. Then the transition probability matrix is

          0      H      HT
    0  (  1/2    1/2    0    )
    H  (  0      1/2    1/2  )
    HT (  0      0      1    ).

Using ν_HT = E[#flips | HT] = 0, from P we have

    ν_0 = E[#flips | 0] = 1 + (1/2)ν_0 + (1/2)ν_H
    ν_H = E[#flips | H] = 1 + (1/2)ν_H + (1/2)ν_HT.

Solving this system of linear equations gives ν_0 = 4.
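The Poisson-process estimate in problem 2 can be compared against an actual prime count. A minimal sketch, in which the cutoff 10^6 and the step-1 midpoint quadrature are arbitrary choices:

```python
from math import log

N = 1_000_000

# Count the primes in [2, N] with a sieve of Eratosthenes.
is_prime = bytearray([1]) * (N + 1)
is_prime[0] = is_prime[1] = 0
for k in range(2, int(N ** 0.5) + 1):
    if is_prime[k]:
        # strike out the multiples k*k, k*k + k, ..., up to N
        is_prime[k * k::k] = bytearray((N - k * k) // k + 1)
count = sum(is_prime)

# Midpoint-rule estimate of the integral of 1/ln(x) over [2, N].
li = sum(1 / log(x + 0.5) for x in range(2, N))

print(count)              # true number of primes up to 10^6: 78498
print(round(li))          # logarithmic-integral estimate, about 78627
print(round(N / log(N)))  # cruder N/ln(N) approximation, about 72382
```

The integral estimate comes much closer to the true count than N/ln N does, which is why the solution leaves the answer as the integral.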
b. [5 points] What is the expected number of flips until we see Head followed by Head?

Again this is a Markov process on three states; now HH, meaning the most recent two flips were both Heads; H, meaning only the most recent flip was Head; and 0, meaning anything else. Then the transition probability matrix is

          0      H      HH
    0  (  1/2    1/2    0    )
    H  (  1/2    0      1/2  )
    HH (  0      0      1    ).

Using ν_HH = E[#flips | HH] = 0, from P we have

    ν_0 = E[#flips | 0] = 1 + (1/2)ν_0 + (1/2)ν_H
    ν_H = E[#flips | H] = 1 + (1/2)ν_0.

Solving this system of linear equations gives ν_0 = 6.

c. [5 points] Give an intuitive explanation for why your answers in (a) and (b) are the same or different.

In the Markov chain in case (a), once the chain reaches state H it cannot return to 0, but in case (b) it can. This makes the expected number of steps to reach the absorbing state longer in case (b).

4. [15 points] Here is a list of the 23 prime numbers between 3 and 100, together with their remainders when divided by 3: [table omitted]. In this list 1 follows 1 three times; 2 follows 1 seven times; 1 follows 2 eight times; and 2 follows 2 four times, so we might imagine that the sequence of remainders of prime numbers divided by 3 is the outcome of a Markov process with transition matrix

          1       2
    1  (  3/10    7/10  )
    2  (  2/3     1/3   ).

If this were true, what fraction of all prime numbers would we expect to have remainder 1 when divided by 3?

P is regular, so the limiting distribution for this Markov process is its stationary state, i.e., its left eigenvector with eigenvalue 1:

    (p  1-p) P = (p  1-p),

so

    (3/10)p + (2/3)(1-p) = p,

which implies p = 20/41; this is the fraction of all primes we would expect to have remainder 1 when divided by 3, if this were a good model for the prime numbers.
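The contrast between parts (a) and (b) of problem 3 is easy to check empirically. A minimal simulation sketch (seed and trial count are arbitrary):

```python
import random

def flips_until(pattern):
    """Flip a fair coin until the most recent flips spell out `pattern`."""
    recent = ""
    n = 0
    while not recent.endswith(pattern):
        # keep only as many recent flips as the pattern is long
        recent = (recent + random.choice("HT"))[-len(pattern):]
        n += 1
    return n

random.seed(1)
trials = 100_000
mean_ht = sum(flips_until("HT") for _ in range(trials)) / trials
mean_hh = sum(flips_until("HH") for _ in range(trials)) / trials
print(mean_ht)  # theory: 4
print(mean_hh)  # theory: 6
```

The sample means separate cleanly around 4 and 6, reflecting the fact that only the HH chain can fall back to state 0 from state H.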
In fact, a generalization of the theorem mentioned in problem 2, the Prime Number Theorem for arithmetic sequences, says that in the limit as N → ∞, half of the prime numbers less than N have remainder 1, and half have remainder 2, when divided by 3. But a recent paper by Robert J. Lemke Oliver and Kannan Soundararajan, "Unexpected biases in the distribution of consecutive primes", arXiv [math.NT], shows that even for very large N, successive primes more often have different than the same remainders when divided by 3. They show that a conjecture of Hardy and Littlewood implies that this bias goes away, slowly, as N → ∞, but that conjecture remains unproved.

5. Let X(t) be a Poisson process on R≥0, with intensity λ. Suppose Y_i ~ Poisson(µ) are independent of one another and of X(t). Let

    Y(t) = Σ_{i=1}^{X(t)} Y_i.

a. [6 points] What is E[Y(t)]? What is Var[Y(t)]?

    E[Y(t)] = λtµ.    Var[Y(t)] = λtµ + µ²λt.

b. [6 points] What is E[Y(t) | X(t) = n]? What is Var[Y(t) | X(t) = n]?

    E[Y(t) | X(t) = n] = nµ.    Var[Y(t) | X(t) = n] = nµ.

c. [4 points] Is Y(t) a Poisson process? Why or why not?

No. One way to see this is that E[Y(t)] ≠ Var[Y(t)], so Y(t) is not a Poisson random variable. Another way to see it is that Y(t) does not make jumps of size 1 the way a Poisson process does.

d. [4 points] Give a different probability function for Y_i that makes Y(t) a Poisson process.

The only possibility is Y_i ~ Bernoulli(p).
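The compound-Poisson mean and variance in problem 5(a) can be verified by simulation. A sketch under arbitrary parameter choices (λ = 3, µ = 2, t = 1); the hand-rolled inverse-transform Poisson sampler is just to keep the sketch dependency-free:

```python
import random
from math import exp

def poisson(rate):
    """Inverse-transform sample of a Poisson(rate) variate (rate modest)."""
    u = random.random()
    k, p = 0, exp(-rate)
    cum = p
    while u > cum:
        k += 1
        p *= rate / k
        cum += p
    return k

random.seed(2)
lam, mu, t, trials = 3.0, 2.0, 1.0, 100_000
ys = []
for _ in range(trials):
    n = poisson(lam * t)                           # X(t), the number of arrivals
    ys.append(sum(poisson(mu) for _ in range(n)))  # Y(t) = sum of the Y_i

mean = sum(ys) / trials
var = sum((y - mean) ** 2 for y in ys) / trials
print(mean)  # theory: lam*t*mu           = 6
print(var)   # theory: lam*t*(mu + mu^2)  = 18
```

The sample variance lands near 18, visibly larger than the mean near 6, which is the part (c) argument that Y(t) is not Poisson.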
6. There is a 500 m × 1000 m plot of land on Barro Colorado Island in Panama where a careful census has been made of the trees. The locations of trees of one common species, Alseis blackiana, are indicated by dots in the graphic below, where the size of each dot represents the size of the tree. [graphic omitted]

a. [10 points] Do you think the locations of these trees should be modeled as arising from a Poisson process? Why or why not?

I think not. At best it appears the Poisson process would be quite inhomogeneous. Also, since a tree cannot grow within another tree, the number of trees within one tree-radius of a given tree's center does not follow a pre-specified, even inhomogeneous, Poisson distribution. Finally, although it is maybe hard to see this without doing some calculations, the numbers of trees in disjoint regions seem not to be independent.

b. [10 points] There are 7599 dots in this graphic. If we partition the plot into four congruent rectangles, by dividing it in half vertically and horizontally, the number of dots in each rectangle is 2115, 1950, 1708 and 1826, moving counterclockwise from the northeast rectangle. What is the probability of observing this distribution if the tree locations arise from a homogeneous Poisson process?

Conditioning on there being 7599 dots, a homogeneous Poisson process restricted to this rectangle becomes 7599 samples of a uniform distribution on the rectangle. Since there is probability 1/4 for each sample to fall into each subrectangle, the probability of seeing this distribution is

    (7599! / (2115! 1950! 1708! 1826!)) (1/4)^7599.

Even without a calculator, we can see that this will be a very small number because the numbers of trees in each subrectangle are so different; if these were really 7599 samples from a uniform distribution it is much more probable that they all would be close to 1900.
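The multinomial probability in part (b) is far too small to evaluate directly with factorials, but its logarithm is easy to compute. A sketch using the log-gamma function (the quadrant counts are the ones given above):

```python
from math import lgamma, log

counts = [2115, 1950, 1708, 1826]  # dots in the four quadrants
n = sum(counts)                    # 7599 trees in total

# log of  n!/(c1! c2! c3! c4!) * (1/4)^n,  via lgamma to avoid huge factorials
log_p = lgamma(n + 1) - sum(lgamma(c + 1) for c in counts) + n * log(1 / 4)
print(log_p / log(10))  # log10 of the probability: a very small number
```

The computed log10 probability is on the order of -16, confirming the "very small number" claim without ever forming 7599!.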
7. Let Y_1, Y_2 ∈ {0,1} be Bernoulli random variables. Suppose

    Pr(Y_1 = 1) = p,
    Pr(Y_2 = 0 | Y_1 = 0) = r,
    Pr(Y_2 = 1 | Y_1 = 1) = s.

a. [4 points] What is the joint probability function for Y_1 and Y_2?

    Pr(Y_1 = 0 and Y_2 = 0) = Pr(Y_2 = 0 | Y_1 = 0) Pr(Y_1 = 0) = r(1-p)
    Pr(Y_1 = 0 and Y_2 = 1) = Pr(Y_2 = 1 | Y_1 = 0) Pr(Y_1 = 0) = (1-r)(1-p)
    Pr(Y_1 = 1 and Y_2 = 0) = Pr(Y_2 = 0 | Y_1 = 1) Pr(Y_1 = 1) = (1-s)p
    Pr(Y_1 = 1 and Y_2 = 1) = Pr(Y_2 = 1 | Y_1 = 1) Pr(Y_1 = 1) = sp

With the values of Y_1 and Y_2 labeling the rows and columns of a matrix, respectively, this joint probability function is

          0          1
    0  (  r(1-p)     (1-r)(1-p)  )
    1  (  (1-s)p     sp          ).

b. [4 points] For what values of r, s and p are Y_1 and Y_2 independent?

To be independent, the joint probability matrix above must be singular, i.e., have zero determinant, because one row is a multiple of the other when the distribution is a product distribution. Its determinant is

    rsp(1-p) - (1-r)(1-s)p(1-p) = (r+s-1)p(1-p),

which vanishes if p ∈ {0,1} or if r + s = 1.

c. [4 points] What is Pr(Y_2 = 1)?

    Pr(Y_2 = 1) = (1-r)(1-p) + sp.

d. [4 points] For what values of r, s and p is Pr(Y_2 = 1) = Pr(Y_1 = 1)?

Solving p = (1-r)(1-p) + sp for p gives

    p = (1-r)/(2-r-s),

as long as r + s ≠ 2.

8. Consider a sequence of customers entering a store. Let Y_i ∈ {0,1} denote the number of items the i-th customer buys, for 0 < i ≤ N. Suppose every customer, after the first, sees what the previous customer does; if the previous customer bought nothing, the current customer also buys nothing, with probability r, and if the previous customer bought an item, the current customer does too, with probability s.

a. [4 points] Suppose 0 < r = s < 1. Without doing any calculation, explain what is the fraction of customers who buy an item in the infinite number of customers limit.
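The determinant criterion in problem 7(b) can be checked numerically. A minimal sketch; the particular values of p, r, s below are arbitrary illustrations:

```python
def joint(p, r, s):
    """Joint pmf of (Y1, Y2) from Pr(Y1=1)=p, Pr(Y2=0|Y1=0)=r, Pr(Y2=1|Y1=1)=s."""
    return {(0, 0): r * (1 - p), (0, 1): (1 - r) * (1 - p),
            (1, 0): (1 - s) * p, (1, 1): s * p}

def determinant(p, r, s):
    """Determinant of the 2x2 joint probability matrix."""
    j = joint(p, r, s)
    return j[0, 0] * j[1, 1] - j[0, 1] * j[1, 0]

# r + s = 1 makes the matrix singular, hence Y1 and Y2 independent
print(abs(determinant(0.3, 0.4, 0.6)) < 1e-12)   # expect True
print(abs(determinant(0.3, 0.5, 0.9)) < 1e-12)   # expect False
# and the determinant matches the closed form (r + s - 1) p (1 - p)
print(abs(determinant(0.3, 0.5, 0.9) - (0.5 + 0.9 - 1) * 0.3 * 0.7) < 1e-9)
```

The closed form (r+s-1)p(1-p) makes the independence condition r + s = 1 (or degenerate p) immediate.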
In this case buying and not buying are completely symmetrical, so in the infinite number of customers limit, half of the customers buy an item.

b. [4 points] For general 0 < r, s < 1, what is lim_{n→∞} Pr(Y_n = 1)?

The transition probability matrix is

    (  r      1-r  )
    (  1-s    s    ),

and we want to find π = (1-π_1  π_1) that solves πP = π. Multiplying out gives us the equation

    r(1-π_1) + (1-s)π_1 = 1 - π_1,

which we solve to get

    lim_{n→∞} Pr(Y_n = 1) = π_1 = (1-r)/(2-r-s).

(Notice that this is the same as the answer to problem 7.d, i.e., the probability of each customer buying an item is the same.)

c. [8 points] Still for general 0 < r, s < 1, what is lim_{n→∞} Pr(Y_n = 1 and Y_{n+1} = 1)?

The probability transition matrix for the states (Y_n, Y_{n+1}) of consecutive customers, ordered 00, 01, 10, 11, is

          00     01     10     11
    00 (  r      1-r    0      0    )
    01 (  0      0      1-s    s    )
    10 (  r      1-r    0      0    )
    11 (  0      0      1-s    s    ).

Solving π = (π_0  π_1  π_2  π_3) = πP requires solving

    rπ_0 + rπ_2 = π_0
    (1-r)π_0 + (1-r)π_2 = π_1
    (1-s)π_1 + (1-s)π_3 = π_2
    sπ_1 + sπ_3 = π_3.

The first two of these equations imply π_0 = (r/(1-r))π_1 and the last two imply π_3 = (s/(1-s))π_2. Plugging these back into the first and last equations gives π_1 = π_2. Requiring that the sum of the π_i be 1 gives

    π_1 = (1-r)(1-s)/(2-r-s) = π_2,
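The stationary distribution of the consecutive-pair chain in problem 8(c) can be found numerically by power iteration and compared with the closed form. A sketch with arbitrary illustrative values of r and s:

```python
r, s = 0.3, 0.7  # arbitrary values with 0 < r, s < 1

# transition matrix on consecutive-pair states, ordered 00, 01, 10, 11
P = [[r, 1 - r, 0, 0],
     [0, 0, 1 - s, s],
     [r, 1 - r, 0, 0],
     [0, 0, 1 - s, s]]

pi = [0.25, 0.25, 0.25, 0.25]
for _ in range(200):  # power iteration: pi <- pi P converges to the stationary state
    pi = [sum(pi[i] * P[i][j] for i in range(4)) for j in range(4)]

print(pi[3])                      # limiting Pr(Y_n = 1 and Y_{n+1} = 1)
print(s * (1 - r) / (2 - r - s))  # closed form: 0.49 for these r, s
```

The iterated pi[3] agrees with s(1-r)/(2-r-s), and pi[1] = pi[2] as the hand derivation requires.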
which implies that

    lim_{n→∞} Pr(Y_n = 1 and Y_{n+1} = 1) = π_3 = s(1-r)/(2-r-s).

An alternate, cleverer, solution is to notice that

    lim_{n→∞} Pr(Y_n = 1 and Y_{n+1} = 1) = lim_{n→∞} Pr(Y_{n+1} = 1 | Y_n = 1) Pr(Y_n = 1)
                                          = s lim_{n→∞} Pr(Y_n = 1)
                                          = s(1-r)/(2-r-s),

where we have used the answer to 8.b for the last step.

9. [16 points] Now suppose people enter a store according to a Poisson process with rate λ. The store owner wants to know if someone buying something makes the next person more likely to buy something (as it did in problem 8 for s > 1/2). To help answer this question, suppose each person who enters the store buys something with probability p, independently of what anyone else does. Let C(t) be the number of people who buy something and are followed by the next customer also buying something, both before time t. What is E[C(t)]?

    E[C(t)] = Σ_{n=2}^∞ E[C(t) | X(t) = n] Pr(X(t) = n)
            = Σ_{n=2}^∞ Σ_{i=1}^{n-1} Pr(Y_i = 1 and Y_{i+1} = 1) Pr(X(t) = n)
            = Σ_{n=2}^∞ (n-1) p² (λt)^n e^{-λt} / n!
            = p² e^{-λt} ( Σ_{n=2}^∞ λt (λt)^{n-1}/(n-1)! - Σ_{n=2}^∞ (λt)^n/n! )
            = p² e^{-λt} ( λt(e^{λt} - 1) - (e^{λt} - 1 - λt) )
            = p² (e^{-λt} - 1 + λt).

Extra Credit. [5 points] Suppose instead of the people's purchasing decisions being independent, they are dependent as in problem 8. Now what is E[C(t)]?

Since we don't know what the first customer does, or even what the probability distribution for the first customer's action is, we have to make some assumption. If we assume that the customer probabilities are stationary, the only change in the calculation above is that p² is replaced by

    lim_{n→∞} Pr(Y_n = 1 and Y_{n+1} = 1) = s(1-r)/(2-r-s)
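The closed form for E[C(t)] in problem 9 can be checked by simulating the arrival process directly. A sketch with arbitrary parameters (λ = 2, p = 1/2, t = 3):

```python
import random
from math import exp

random.seed(3)
lam, p, t, trials = 2.0, 0.5, 3.0, 100_000

def sample_C():
    """One realization of C(t): consecutive-buyer pairs among arrivals in [0, t]."""
    arrivals = 0
    clock = random.expovariate(lam)   # exponential interarrival times
    while clock <= t:
        arrivals += 1
        clock += random.expovariate(lam)
    buys = [random.random() < p for _ in range(arrivals)]
    return sum(1 for i in range(arrivals - 1) if buys[i] and buys[i + 1])

est = sum(sample_C() for _ in range(trials)) / trials
theory = p * p * (exp(-lam * t) - 1 + lam * t)
print(est)     # Monte Carlo estimate
print(theory)  # p^2 (e^{-lam t} - 1 + lam t), about 1.2506 here
```

The Monte Carlo mean matches p²(e^{-λt} - 1 + λt) to within sampling error.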
from problem 8.c.

10. Let {X(t) | t ∈ R≥0} be a Poisson point process with rate λ on R≥0. For each point i, 0 < i ≤ N, of the process, let D_i be the distance to its nearest neighbor.

a. [3 points] Write D_i in terms of the sojourn times {S_j | j ∈ N}.

    D_1 = S_1, and for i > 1, D_i = min{S_{i-1}, S_i}.

b. [3 points] Are the {D_i | 0 < i ≤ N} independent?

No, since D_1 and D_2, for example, both depend upon S_1. For example, let 0 ≤ s < t; then

    Pr(D_1 > s, D_2 > t) = Pr(S_1 > s, S_1 > t, S_2 > t)
                         = Pr(S_1 > t, S_2 > t)
                         = Pr(S_1 > t) Pr(S_2 > t),

while

    Pr(D_1 > s) Pr(D_2 > t) = Pr(S_1 > s) Pr(S_1 > t, S_2 > t)
                            = Pr(S_1 > s) Pr(S_1 > t) Pr(S_2 > t).

c. [10 points] What is the probability density function for each D_i?

    f_{D_1}(d) = f_{S_1}(d) = λe^{-λd}.

For i > 1,

    Pr(d < D_i ≤ d+Δd) = Pr(d < distance to nearest point ≤ d+Δd)
        = Pr(d < distance to point on left ≤ d+Δd and d+Δd < distance to point on right)
        + Pr(d < distance to point on right ≤ d+Δd and d+Δd < distance to point on left)
        ≈ 2 e^{-λd} λΔd e^{-λd},

which implies f_{D_i}(d) = 2λe^{-2λd}, i.e., just the probability density for a sojourn time with double the rate. Alternatively,

    Pr(D_i > d) = Pr(S_{i-1} > d, S_i > d) = Pr(S_{i-1} > d) Pr(S_i > d) = e^{-λd} e^{-λd} = e^{-2λd}.

From this we can compute the probability density function:

    f_{D_i}(d) = -(d/dd) Pr(D_i > d) = -(d/dd) e^{-2λd} = 2λe^{-2λd},

for d ≥ 0.

11. [20 points] An ant starts at one vertex of a cube and at each time step walks along an edge to an adjacent vertex, choosing each possible edge with equal probability 1/3.
What is the probability the ant returns to its original vertex before reaching the opposite vertex?

Let the state of the ant be described by how many edges away it is from the corner at which it started, e.g., the opposite corner is state 3. Making states 0 and 3 absorbing states, the transition probability matrix is

         0      1      2      3
    0 (  1      0      0      0    )
    1 (  1/3    0      2/3    0    )
    2 (  0      2/3    0      1/3  )
    3 (  0      0      0      1    ).

Since after the first step the ant is at state 1, we must compute ν_i = Pr(absorbed at state 0 from state i) for i = 1. Considering the next step the ant takes, we get

    ν_1 = 1/3 + (2/3)ν_2
    ν_2 = (2/3)ν_1.

Solving these equations gives ν_1 = 3/5.

12. Let X, Y be random variables with a bivariate normal distribution such that E[Y] = 0 and Var[X] = 1 = Var[Y]. Suppose E[X | Y = y] = 2 - y/3.

a. [6 points] What is E[X]?

    E[X] = ∫ E[X | Y = y] f_Y(y) dy = ∫ (2 - y/3) f_Y(y) dy = E[2 - Y/3] = 2 - E[Y]/3 = 2.

b. [10 points] What is Cov[X,Y]?

    Cov[X,Y] = E[XY] - E[X]E[Y] = E[XY]
             = ∫ E[XY | Y = y] f_Y(y) dy
             = ∫ y E[X | Y = y] f_Y(y) dy
             = ∫ (2y - y²/3) f_Y(y) dy
             = E[2Y - Y²/3] = 2E[Y] - E[Y²]/3 = -1/3,

because E[Y] = 0 and thus E[Y²] = Var[Y] = 1. Notice that we did not need to use the fact that the joint distribution of X and Y is bivariate normal.
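The absorption probability 3/5 from problem 11 can be checked by simulating the ant's walk on the distance-from-start states. A minimal sketch (seed and trial count arbitrary):

```python
import random

random.seed(5)

def returns_home_first():
    """One walk on the cube, tracked by distance from the start vertex.

    After the forced first step the ant is at distance 1; from distance 1 it
    steps home with probability 1/3 and to distance 2 with probability 2/3,
    and symmetrically from distance 2. Absorb at 0 (home) or 3 (opposite).
    """
    state = 1
    while state not in (0, 3):
        if state == 1:
            state = 0 if random.random() < 1 / 3 else 2
        else:  # state == 2
            state = 3 if random.random() < 1 / 3 else 1
    return state == 0

trials = 200_000
frac = sum(returns_home_first() for _ in range(trials)) / trials
print(frac)  # theory: 3/5
```

The fraction of walks absorbed at the starting vertex settles near 0.6, matching ν_1 = 3/5.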
More information1 Random Variable: Topics
Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?
More informationRANDOM WALKS AND THE PROBABILITY OF RETURNING HOME
RANDOM WALKS AND THE PROBABILITY OF RETURNING HOME ELIZABETH G. OMBRELLARO Abstract. This paper is expository in nature. It intuitively explains, using a geometrical and measure theory perspective, why
More informationCMPSCI 240: Reasoning Under Uncertainty
CMPSCI 240: Reasoning Under Uncertainty Lecture 5 Prof. Hanna Wallach wallach@cs.umass.edu February 7, 2012 Reminders Pick up a copy of B&T Check the course website: http://www.cs.umass.edu/ ~wallach/courses/s12/cmpsci240/
More informationChapter 5. Chapter 5 sections
1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions
More informationReview of Probability Theory
Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving
More informationMATH 56A: STOCHASTIC PROCESSES CHAPTER 6
MATH 56A: STOCHASTIC PROCESSES CHAPTER 6 6. Renewal Mathematically, renewal refers to a continuous time stochastic process with states,, 2,. N t {,, 2, 3, } so that you only have jumps from x to x + and
More information1 What is the area model for multiplication?
for multiplication represents a lovely way to view the distribution property the real number exhibit. This property is the link between addition and multiplication. 1 1 What is the area model for multiplication?
More informationName: Firas Rassoul-Agha
Midterm 1 - Math 5010 - Spring 016 Name: Firas Rassoul-Agha Solve the following 4 problems. You have to clearly explain your solution. The answer carries no points. Only the work does. CALCULATORS ARE
More informationEECS 126 Probability and Random Processes University of California, Berkeley: Fall 2014 Kannan Ramchandran November 13, 2014.
EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2014 Kannan Ramchandran November 13, 2014 Midterm Exam 2 Last name First name SID Rules. DO NOT open the exam until instructed
More informationMidterm Exam 1 (Solutions)
EECS 6 Probability and Random Processes University of California, Berkeley: Spring 07 Kannan Ramchandran February 3, 07 Midterm Exam (Solutions) Last name First name SID Name of student on your left: Name
More informationLecture 4a: Continuous-Time Markov Chain Models
Lecture 4a: Continuous-Time Markov Chain Models Continuous-time Markov chains are stochastic processes whose time is continuous, t [0, ), but the random variables are discrete. Prominent examples of continuous-time
More informationExercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov
Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov Many of the exercises are taken from two books: R. Durrett, The Essentials of Probability, Duxbury
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is
More informationChapter 2: Random Variables
ECE54: Stochastic Signals and Systems Fall 28 Lecture 2 - September 3, 28 Dr. Salim El Rouayheb Scribe: Peiwen Tian, Lu Liu, Ghadir Ayache Chapter 2: Random Variables Example. Tossing a fair coin twice:
More informationMAT 271E Probability and Statistics
MAT 7E Probability and Statistics Spring 6 Instructor : Class Meets : Office Hours : Textbook : İlker Bayram EEB 3 ibayram@itu.edu.tr 3.3 6.3, Wednesday EEB 6.., Monday D. B. Bertsekas, J. N. Tsitsiklis,
More informationRecitation 6. Randomization. 6.1 Announcements. RandomLab has been released, and is due Monday, October 2. It s worth 100 points.
Recitation 6 Randomization 6.1 Announcements RandomLab has been released, and is due Monday, October 2. It s worth 100 points. FingerLab will be released after Exam I, which is going to be on Wednesday,
More informationThis does not cover everything on the final. Look at the posted practice problems for other topics.
Class 7: Review Problems for Final Exam 8.5 Spring 7 This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry
More informationLecture 2: Review of Probability
Lecture 2: Review of Probability Zheng Tian Contents 1 Random Variables and Probability Distributions 2 1.1 Defining probabilities and random variables..................... 2 1.2 Probability distributions................................
More informationNotes for Math 324, Part 20
7 Notes for Math 34, Part Chapter Conditional epectations, variances, etc.. Conditional probability Given two events, the conditional probability of A given B is defined by P[A B] = P[A B]. P[B] P[A B]
More information. Find E(V ) and var(v ).
Math 6382/6383: Probability Models and Mathematical Statistics Sample Preliminary Exam Questions 1. A person tosses a fair coin until she obtains 2 heads in a row. She then tosses a fair die the same number
More informationMarkov Chains and Pandemics
Markov Chains and Pandemics Caleb Dedmore and Brad Smith December 8, 2016 Page 1 of 16 Abstract Markov Chain Theory is a powerful tool used in statistical analysis to make predictions about future events
More informationProblems from Probability and Statistical Inference (9th ed.) by Hogg, Tanis and Zimmerman.
Math 224 Fall 2017 Homework 1 Drew Armstrong Problems from Probability and Statistical Inference (9th ed.) by Hogg, Tanis and Zimmerman. Section 1.1, Exercises 4,5,6,7,9,12. Solutions to Book Problems.
More informationDiscrete Random Variables
Discrete Random Variables We have a probability space (S, Pr). A random variable is a function X : S V (X ) for some set V (X ). In this discussion, we must have V (X ) is the real numbers X induces a
More informationRandom Variables. Will Perkins. January 11, 2013
Random Variables Will Perkins January 11, 2013 Random Variables If a probability model describes an experiment, a random variable is a measurement - a number associated with each outcome of the experiment.
More informationExam 1 Practice Questions I solutions, 18.05, Spring 2014
Exam Practice Questions I solutions, 8.5, Spring 4 Note: This is a set of practice problems for exam. The actual exam will be much shorter.. Sort the letters: A BB II L O P R T Y. There are letters in
More informationProbability Theory and Simulation Methods
Feb 28th, 2018 Lecture 10: Random variables Countdown to midterm (March 21st): 28 days Week 1 Chapter 1: Axioms of probability Week 2 Chapter 3: Conditional probability and independence Week 4 Chapters
More informationCS 124 Math Review Section January 29, 2018
CS 124 Math Review Section CS 124 is more math intensive than most of the introductory courses in the department. You re going to need to be able to do two things: 1. Perform some clever calculations to
More informationExample continued. Math 425 Intro to Probability Lecture 37. Example continued. Example
continued : Coin tossing Math 425 Intro to Probability Lecture 37 Kenneth Harris kaharri@umich.edu Department of Mathematics University of Michigan April 8, 2009 Consider a Bernoulli trials process with
More informationECE 302, Final 3:20-5:20pm Mon. May 1, WTHR 160 or WTHR 172.
ECE 302, Final 3:20-5:20pm Mon. May 1, WTHR 160 or WTHR 172. 1. Enter your name, student ID number, e-mail address, and signature in the space provided on this page, NOW! 2. This is a closed book exam.
More information3 PROBABILITY TOPICS
Chapter 3 Probability Topics 135 3 PROBABILITY TOPICS Figure 3.1 Meteor showers are rare, but the probability of them occurring can be calculated. (credit: Navicore/flickr) Introduction It is often necessary
More information