Math 180B Homework 4 Solutions
Note: We will make repeated use of the following result.

Lemma 1. Let $(X_n)$ be a time-homogeneous Markov chain with countable state space $S$, let $A \subseteq S$, and let $T = \inf\{n \ge 0 : X_n \in A\}$ be the first time that the Markov chain reaches a state in $A$. Then for each $i \in S \setminus A$, we have
$$E[T \mid X_0 = i] = 1 + \sum_{j \in S \setminus A} P(X_1 = j \mid X_0 = i)\, E[T \mid X_0 = j].$$

Proof. Suppose $i \in S \setminus A$ and $j \in S$. By the Markov property and time homogeneity, we have
$$\begin{aligned}
E[T \mid X_0 = i, X_1 = j] &= \sum_{n=0}^{\infty} P(T > n \mid X_0 = i, X_1 = j) \\
&= \sum_{n=0}^{\infty} P(X_0 \notin A, \ldots, X_n \notin A \mid X_0 = i, X_1 = j) \\
&= 1 + \sum_{n=1}^{\infty} P(X_1 \notin A, \ldots, X_n \notin A \mid X_0 = i, X_1 = j) \\
&= 1 + \sum_{n=1}^{\infty} P(X_1 \notin A, \ldots, X_n \notin A \mid X_1 = j) \\
&= 1 + \sum_{n=1}^{\infty} P(X_0 \notin A, \ldots, X_{n-1} \notin A \mid X_0 = j) \\
&= 1 + \sum_{n=1}^{\infty} P(T \ge n \mid X_0 = j) \\
&= 1 + E[T \mid X_0 = j].
\end{aligned}$$
Also, note that if $j \in A$, then $E[T \mid X_0 = j] = 0$. Therefore, by first-step analysis, if $i \notin A$, then
$$\begin{aligned}
E[T \mid X_0 = i] &= \sum_{n} n\, P(T = n \mid X_0 = i) \\
&= \sum_{n} \sum_{j \in S} n\, P(T = n, X_1 = j \mid X_0 = i) \\
&= \sum_{j \in S} \sum_{n} n\, P(T = n \mid X_0 = i, X_1 = j)\, P(X_1 = j \mid X_0 = i) \\
&= \sum_{j \in S} P(X_1 = j \mid X_0 = i)\, E[T \mid X_0 = i, X_1 = j] \\
&= \sum_{j \in S} P(X_1 = j \mid X_0 = i)\bigl(1 + E[T \mid X_0 = j]\bigr) \\
&= \sum_{j \in S} P(X_1 = j \mid X_0 = i) + \sum_{j \in S} P(X_1 = j \mid X_0 = i)\, E[T \mid X_0 = j] \\
&= 1 + \sum_{j \in S \setminus A} P(X_1 = j \mid X_0 = i)\, E[T \mid X_0 = j].
\end{aligned}$$

Problem 1 (Pinsky & Karlin, Exercise 3.4.). Find the mean time to reach state 3 starting from state 0 for the Markov chain whose transition probability matrix is
$$P = \begin{pmatrix} 0.4 & 0.3 & 0.2 & 0.1 \\ 0 & 0.7 & 0.2 & 0.1 \\ 0 & 0 & 0.9 & 0.1 \\ 0 & 0 & 0 & 1 \end{pmatrix}.$$

Solution. Let $T$ be the time it takes for the Markov chain to reach state 3. That is, if $(X_n)$ is the Markov chain in question, then $T = \inf\{n \ge 0 : X_n = 3\}$. We need to compute $E[T \mid X_0 = 0]$. Let $g_i = E[T \mid X_0 = i]$ for $i \in \{0, 1, 2, 3\}$. By Lemma 1, we have the following linear system of equations:
$$g_0 = 1 + 0.4 g_0 + 0.3 g_1 + 0.2 g_2, \qquad g_1 = 1 + 0.7 g_1 + 0.2 g_2, \qquad g_2 = 1 + 0.9 g_2.$$
Solving this system, we get $E[T \mid X_0 = 0] = g_0 = 10$.
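As a quick numerical check (not part of the original solution), the first-step system above can be solved directly. The sketch below assumes the transition matrix reconstructed above and uses Lemma 1 in the form $g = \mathbf{1} + Qg$, where $Q$ is $P$ restricted to the non-target states.

```python
import numpy as np

# Transition matrix for Problem 1 (states 0, 1, 2, 3; state 3 is the target).
P = np.array([
    [0.4, 0.3, 0.2, 0.1],
    [0.0, 0.7, 0.2, 0.1],
    [0.0, 0.0, 0.9, 0.1],
    [0.0, 0.0, 0.0, 1.0],
])

# First-step analysis: g = 1 + Q g, so g = (I - Q)^{-1} 1.
Q = P[:3, :3]
g = np.linalg.solve(np.eye(3) - Q, np.ones(3))
print(g)  # expect [10. 10. 10.], so E[T | X_0 = 0] = 10
```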
Problem 2 (Pinsky & Karlin, Exercise 3.4.4). A coin is tossed repeatedly until two successive heads appear. Find the mean number of tosses required. Hint: Let $X_n$ be the cumulative number of successive heads. The state space is $\{0, 1, 2\}$, and the transition probability matrix is
$$P = \begin{pmatrix} 1/2 & 1/2 & 0 \\ 1/2 & 0 & 1/2 \\ 0 & 0 & 1 \end{pmatrix}.$$
Determine the mean time to reach state 2 starting from state 0 by invoking a first step analysis.

Solution. Let $T = \inf\{n \ge 0 : X_n = 2\}$. We wish to compute $E[T \mid X_0 = 0]$. Let $g_i = E[T \mid X_0 = i]$ for $i \in \{0, 1, 2\}$. By Lemma 1, we get the following linear system of equations:
$$g_0 = 1 + \tfrac{1}{2} g_0 + \tfrac{1}{2} g_1, \qquad g_1 = 1 + \tfrac{1}{2} g_0.$$
Therefore, $E[T \mid X_0 = 0] = g_0 = 6$.
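A small Monte Carlo check (not part of the original solution): simulate fair-coin tosses until two successive heads appear and average the number of tosses, which should be close to the value 6 obtained above.

```python
import random

def tosses_until_hh(rng: random.Random) -> int:
    """Number of fair-coin tosses until two successive heads appear."""
    run, n = 0, 0
    while run < 2:
        n += 1
        run = run + 1 if rng.random() < 0.5 else 0
    return n

rng = random.Random(0)
trials = 200_000
print(sum(tosses_until_hh(rng) for _ in range(trials)) / trials)  # close to 6.0
```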
Problem 3 (Pinsky & Karlin, Problem 3.4.). Which will take fewer flips, on average: successively flipping a quarter until the pattern HHT appears, i.e., until you observe two successive heads followed by a tail; or successively flipping a quarter until the pattern HTH appears? Can you explain why these are different?

Solution. Let $(X_n)$ be the Markov chain whose states are the possible patterns of three consecutive coin tosses. Then $(X_n)$ has the following transition probability matrix (blank entries represent 0):
$$P = \begin{array}{c|cccccccc}
 & TTT & TTH & THT & THH & HTT & HTH & HHT & HHH \\ \hline
TTT & 1/2 & 1/2 & & & & & & \\
TTH & & & 1/2 & 1/2 & & & & \\
THT & & & & & 1/2 & 1/2 & & \\
THH & & & & & & & 1/2 & 1/2 \\
HTT & 1/2 & 1/2 & & & & & & \\
HTH & & & 1/2 & 1/2 & & & & \\
HHT & & & & & 1/2 & 1/2 & & \\
HHH & & & & & & & 1/2 & 1/2
\end{array}$$
Define
$$T_{HHT} = \inf\{n \ge 0 : X_n = HHT\}, \qquad T_{HTH} = \inf\{n \ge 0 : X_n = HTH\}.$$
We wish to compare $E[T_{HHT}]$ and $E[T_{HTH}]$. Let $g_i = E[T_{HHT} \mid X_0 = i]$ for each $i$. Then by Lemma 1, we have the following linear system of equations:
$$\begin{aligned}
g_{TTT} &= 1 + \tfrac{1}{2} g_{TTT} + \tfrac{1}{2} g_{TTH}, &
g_{HTT} &= 1 + \tfrac{1}{2} g_{TTT} + \tfrac{1}{2} g_{TTH}, \\
g_{TTH} &= 1 + \tfrac{1}{2} g_{THT} + \tfrac{1}{2} g_{THH}, &
g_{HTH} &= 1 + \tfrac{1}{2} g_{THT} + \tfrac{1}{2} g_{THH}, \\
g_{THT} &= 1 + \tfrac{1}{2} g_{HTT} + \tfrac{1}{2} g_{HTH}, &
g_{THH} &= 1 + \tfrac{1}{2} g_{HHH}, \\
g_{HHH} &= 1 + \tfrac{1}{2} g_{HHH}. & &
\end{aligned}$$
Solving this system yields $g_{HHH} = g_{THH} = 2$, $g_{HTH} = g_{TTH} = 6$, $g_{TTT} = g_{HTT} = g_{THT} = 8$, and hence
$$E[T_{HHT}] = \sum_{j} P(X_0 = j)\, g_j = \tfrac{1}{8}(2 + 2 + 6 + 6 + 8 + 8 + 8) = 5.$$
Next, let $h_i = E[T_{HTH} \mid X_0 = i]$ for each $i$. Then by Lemma 1, we have the following linear system of equations:
$$\begin{aligned}
h_{TTT} &= 1 + \tfrac{1}{2} h_{TTT} + \tfrac{1}{2} h_{TTH}, &
h_{HTT} &= 1 + \tfrac{1}{2} h_{TTT} + \tfrac{1}{2} h_{TTH}, \\
h_{TTH} &= 1 + \tfrac{1}{2} h_{THT} + \tfrac{1}{2} h_{THH}, &
h_{THT} &= 1 + \tfrac{1}{2} h_{HTT}, \\
h_{HHT} &= 1 + \tfrac{1}{2} h_{HTT}, &
h_{THH} &= 1 + \tfrac{1}{2} h_{HHT} + \tfrac{1}{2} h_{HHH}, \\
h_{HHH} &= 1 + \tfrac{1}{2} h_{HHT} + \tfrac{1}{2} h_{HHH}. & &
\end{aligned}$$
Solving this system yields $h_{THT} = h_{HHT} = 6$, $h_{TTH} = h_{THH} = h_{HHH} = 8$, $h_{HTT} = h_{TTT} = 10$, and hence
$$E[T_{HTH}] = \sum_{j} P(X_0 = j)\, h_j = \tfrac{1}{8}(6 + 6 + 8 + 8 + 8 + 10 + 10) = 7.$$
Thus, on average, it will take fewer flips to observe HHT than HTH.
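A numerical check (not part of the original solution) of the two hitting times: build the transition matrix of the pattern chain described above and solve $g = \mathbf{1} + Qg$ for each target pattern, averaging over a uniformly random initial pattern.

```python
import numpy as np
from itertools import product

patterns = ["".join(p) for p in product("TH", repeat=3)]

def mean_hitting_time(target: str) -> float:
    """Mean time for the three-toss pattern chain to hit `target`, with X_0 uniform."""
    others = [p for p in patterns if p != target]
    Q = np.zeros((7, 7))
    for r, p in enumerate(others):
        for nxt in (p[1:] + "T", p[1:] + "H"):  # append the next toss, drop the oldest
            if nxt != target:
                Q[r, others.index(nxt)] += 0.5
    g = np.linalg.solve(np.eye(7) - Q, np.ones(7))
    return g.sum() / 8  # the target pattern itself contributes 0

print(mean_hitting_time("HHT"), mean_hitting_time("HTH"))  # expect 5.0 and 7.0
```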
Problem 4 (Pinsky & Karlin, Problem 3.4.). A zero-seeking device operates as follows: If it is in state $m$ at time $n$, then at time $n + 1$, its position is uniformly distributed over the states $0, 1, \ldots, m - 1$. Find the expected time until the device first hits zero starting from state $m$. Note: This is a highly simplified model for an algorithm that seeks a maximum over a finite set of points.

Solution. This device can be modeled by a Markov chain $(X_n)$ taking values in the nonnegative integers with transition probabilities given by
$$P(X_{n+1} = j \mid X_n = m) = \begin{cases} \dfrac{1}{m}, & \text{if } m > 0 \text{ and } 0 \le j \le m - 1, \\ 1, & \text{if } m = j = 0, \\ 0, & \text{otherwise.} \end{cases}$$
Let $T = \inf\{n \ge 0 : X_n = 0\}$. We wish to compute $E[T \mid X_0 = m]$ for all $m \ge 0$. If $m = 0$, then clearly $E[T \mid X_0 = m] = 0$. Suppose $m > 0$. By Lemma 1, we have
$$E[T \mid X_0 = m] = 1 + \sum_{j=1}^{m-1} P(X_1 = j \mid X_0 = m)\, E[T \mid X_0 = j] = 1 + \frac{1}{m}\sum_{j=1}^{m-1} E[T \mid X_0 = j].$$
We will prove by strong induction that $E[T \mid X_0 = m] = \sum_{k=1}^{m} \frac{1}{k}$. The base case $m = 1$ is clear. Now suppose $m > 1$ and that $E[T \mid X_0 = j] = \sum_{k=1}^{j} \frac{1}{k}$ when $1 \le j < m$. Then
$$\begin{aligned}
E[T \mid X_0 = m] &= 1 + \frac{1}{m}\sum_{j=1}^{m-1} E[T \mid X_0 = j]
= 1 + \frac{1}{m}\sum_{j=1}^{m-1}\sum_{k=1}^{j}\frac{1}{k} \\
&= 1 + \frac{1}{m}\sum_{k=1}^{m-1}\sum_{j=k}^{m-1}\frac{1}{k}
= 1 + \frac{1}{m}\sum_{k=1}^{m-1}\frac{m-k}{k} \\
&= 1 + \sum_{k=1}^{m-1}\frac{1}{k} - \frac{m-1}{m}
= \sum_{k=1}^{m}\frac{1}{k}.
\end{aligned}$$
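A quick check of the harmonic-number formula above (not part of the original solution): iterate the first-step recursion exactly with rational arithmetic and compare it with the partial sums of the harmonic series.

```python
from fractions import Fraction

# Recursion: E[T | X_0 = m] = 1 + (1/m) * sum of E[T | X_0 = j] for 1 <= j < m.
g = {0: Fraction(0)}
for m in range(1, 11):
    g[m] = 1 + sum((g[j] for j in range(1, m)), Fraction(0)) / m
    assert g[m] == sum(Fraction(1, k) for k in range(1, m + 1))  # harmonic number H_m
print(g[10])  # 7381/2520, i.e. the 10th harmonic number
```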
Problem 5 (Pinsky & Karlin, Problem 3.4.7). Let $X_n$ be a Markov chain with transition probabilities $P_{i,j}$. We are given a discount factor $\beta$ with $0 < \beta < 1$ and a cost function $c(i)$, and we wish to determine the total expected discounted cost starting from state $i$, defined by
$$h_i = E\Bigl[\sum_{n=0}^{\infty} \beta^n c(X_n) \Bigm| X_0 = i\Bigr].$$
Using a first step analysis show that $h_i$ satisfies the system of linear equations
$$h_i = c(i) + \beta \sum_{j} P_{i,j}\, h_j.$$

Solution. Assuming $c(i) \ge 0$ for all $i$ (so that we may interchange summation and expectation), we have
$$\begin{aligned}
h_i &= E\Bigl[\sum_{n=0}^{\infty}\beta^n c(X_n)\Bigm| X_0 = i\Bigr]
= \sum_{n=0}^{\infty}\beta^n E[c(X_n)\mid X_0 = i] \\
&= E[c(X_0)\mid X_0 = i] + \beta\sum_{n=1}^{\infty}\beta^{n-1}E[c(X_n)\mid X_0 = i] \\
&= c(i) + \beta\sum_{n=0}^{\infty}\beta^{n}E[c(X_{n+1})\mid X_0 = i]
= c(i) + \beta\, E\Bigl[\sum_{n=0}^{\infty}\beta^n c(X_{n+1})\Bigm| X_0 = i\Bigr].
\end{aligned}$$
Thus, it suffices to show that
$$E\Bigl[\sum_{n=0}^{\infty}\beta^n c(X_{n+1})\Bigm| X_0 = i\Bigr] = \sum_{j} P_{i,j}\, h_j. \tag{5.1}$$
To see this, we use time-homogeneity and the Markov property:
$$\begin{aligned}
\sum_{j} P(X_1 = j \mid X_0 = i)\, E[c(X_n)\mid X_0 = j]
&= \sum_{j} P(X_1 = j\mid X_0 = i)\sum_{k} c(k)\, P(X_n = k \mid X_0 = j) \\
&= \sum_{j,k} c(k)\, P(X_1 = j\mid X_0 = i)\, P(X_{n+1} = k\mid X_1 = j) \\
&= \sum_{j,k} c(k)\, P(X_1 = j\mid X_0 = i)\, P(X_{n+1} = k\mid X_0 = i, X_1 = j) \\
&= \sum_{j,k} c(k)\, P(X_{n+1} = k, X_1 = j\mid X_0 = i) \\
&= \sum_{k} c(k)\, P(X_{n+1} = k\mid X_0 = i) = E[c(X_{n+1})\mid X_0 = i],
\end{aligned}$$
and so
$$\begin{aligned}
\sum_{j} P_{i,j}\, h_j &= \sum_{j} P(X_1 = j\mid X_0 = i)\, E\Bigl[\sum_{n=0}^{\infty}\beta^n c(X_n)\Bigm| X_0 = j\Bigr] \\
&= \sum_{n=0}^{\infty}\beta^n\sum_{j} P(X_1 = j\mid X_0 = i)\, E[c(X_n)\mid X_0 = j] \\
&= \sum_{n=0}^{\infty}\beta^n E[c(X_{n+1})\mid X_0 = i]
= E\Bigl[\sum_{n=0}^{\infty}\beta^n c(X_{n+1})\Bigm| X_0 = i\Bigr].
\end{aligned}$$
Therefore, (5.1) holds.
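To illustrate the identity $h = c + \beta P h$ (this example is not from the text; the chain $P$, cost $c$, and $\beta$ below are made-up values), one can compare the exact solution $h = (I - \beta P)^{-1} c$ with a Monte Carlo estimate of the expected discounted cost.

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.4, 0.0, 0.6]])   # example transition matrix (assumption)
c = np.array([1.0, 2.0, 0.5])    # example cost function (assumption)
beta = 0.9

h = np.linalg.solve(np.eye(3) - beta * P, c)  # exact: h = c + beta * P @ h

def discounted_cost(i: int, steps: int = 200) -> float:
    """One sample path's discounted cost; beta**200 is negligible here."""
    total, state = 0.0, i
    for n in range(steps):
        total += beta**n * c[state]
        state = rng.choice(3, p=P[state])
    return total

estimates = [np.mean([discounted_cost(i) for _ in range(1000)]) for i in range(3)]
print(h, estimates)  # should agree to within Monte Carlo error
```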
Problem 6 (Pinsky & Karlin, Problem 3.4.8). An urn contains five red and three green balls. The balls are chosen at random, one by one, from the urn. If a red ball is chosen, it is removed. Any green ball that is chosen is returned to the urn. The selection process continues until all of the red balls have been removed from the urn. What is the mean duration of the game?

Solution. Let $X_0$ be the initial number of red balls in the urn, and for $n \ge 1$, let $X_n$ denote the number of red balls in the urn after the $n$th draw. Then $(X_n)$ is a Markov chain on the states $0, 1, 2, 3, 4, 5$ with transition matrix
$$P = \begin{pmatrix}
1 & 0 & 0 & 0 & 0 & 0 \\
1/4 & 3/4 & 0 & 0 & 0 & 0 \\
0 & 2/5 & 3/5 & 0 & 0 & 0 \\
0 & 0 & 3/6 & 3/6 & 0 & 0 \\
0 & 0 & 0 & 4/7 & 3/7 & 0 \\
0 & 0 & 0 & 0 & 5/8 & 3/8
\end{pmatrix}.$$
Let $T = \inf\{n \ge 0 : X_n = 0\}$ be the time until all red balls are removed from the urn and the game ends. We wish to compute $E[T \mid X_0 = 5]$. Let $g_i = E[T \mid X_0 = i]$. By Lemma 1, we have the following linear system of equations:
$$\begin{aligned}
g_1 &= 1 + \tfrac{3}{4} g_1, \\
g_2 &= 1 + \tfrac{2}{5} g_1 + \tfrac{3}{5} g_2, \\
g_3 &= 1 + \tfrac{3}{6} g_2 + \tfrac{3}{6} g_3, \\
g_4 &= 1 + \tfrac{4}{7} g_3 + \tfrac{3}{7} g_4, \\
g_5 &= 1 + \tfrac{5}{8} g_4 + \tfrac{3}{8} g_5.
\end{aligned}$$
Solving this system, we get $E[T \mid X_0 = 5] = g_5 = 237/20$.
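A numerical check (not part of the original solution) of the urn answer: solve $g = \mathbf{1} + Qg$, where $Q$ is the transition matrix above with the absorbing state 0 removed.

```python
import numpy as np

Q = np.array([
    [3/4, 0,   0,   0,   0  ],   # 1 red ball left
    [2/5, 3/5, 0,   0,   0  ],   # 2 red balls left
    [0,   3/6, 3/6, 0,   0  ],   # 3 red balls left
    [0,   0,   4/7, 3/7, 0  ],   # 4 red balls left
    [0,   0,   0,   5/8, 3/8],   # 5 red balls left
])
g = np.linalg.solve(np.eye(5) - Q, np.ones(5))
print(g[-1])  # 11.85 = 237/20, the mean duration starting from five red balls
```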
Problem 7 (Pinsky & Karlin, Problem 3.4.5). A simplified model for the spread of a rumor goes this way: There are $N = 5$ people in a group of friends, of which some have heard the rumor and the others have not. During any single period of time, two people are selected at random from the group and assumed to interact. The selection is such that an encounter between any pair of friends is just as likely as between any other pair. If one of these persons has heard the rumor and the other has not, then with probability $\alpha = 0.1$ the rumor is transmitted. Let $X_n$ denote the number of friends who have heard the rumor at the end of the $n$th period. Assuming that the process begins at time 0 with a single person knowing the rumor, what is the mean time that it takes for everyone to hear it?

Solution. The stochastic process $(X_n)$ is a Markov chain with state space $\{1, 2, 3, 4, 5\}$ and transition probabilities given by $P(X_{n+1} = 5 \mid X_n = 5) = 1$, and, if $i \in \{1, 2, 3, 4\}$,
$$P(X_{n+1} = i + 1 \mid X_n = i) = \alpha\,\frac{\binom{i}{1}\binom{5-i}{1}}{\binom{5}{2}} = \frac{i(5-i)}{100}
\qquad\text{and}\qquad
P(X_{n+1} = i \mid X_n = i) = 1 - \frac{i(5-i)}{100}.$$
All other transition probabilities are zero. Let
$$T = \inf\{n \ge 0 : X_n = 5\}$$
be the time until all five people have heard the rumor. We wish to compute $E[T \mid X_0 = 1]$. Let $g_i = E[T \mid X_0 = i]$ for $i = 1, 2, 3, 4, 5$. By Lemma 1, we have the following linear system of equations:
$$\begin{aligned}
g_1 &= 1 + \Bigl(1 - \tfrac{4}{100}\Bigr) g_1 + \tfrac{4}{100}\, g_2, &
g_2 &= 1 + \Bigl(1 - \tfrac{6}{100}\Bigr) g_2 + \tfrac{6}{100}\, g_3, \\
g_3 &= 1 + \Bigl(1 - \tfrac{6}{100}\Bigr) g_3 + \tfrac{6}{100}\, g_4, &
g_4 &= 1 + \Bigl(1 - \tfrac{4}{100}\Bigr) g_4.
\end{aligned}$$
Solving this system, we get $E[T \mid X_0 = 1] = g_1 = 250/3$.
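A numerical check (not part of the original solution) of the rumor answer: the mean time to reach state 5 from state 1, using the transition probabilities $i(5-i)/100$ of moving up derived above.

```python
import numpy as np

p = {i: i * (5 - i) / 100 for i in range(1, 5)}  # P(i -> i + 1)
Q = np.diag([1 - p[i] for i in range(1, 5)])     # probabilities of staying put
for i in range(1, 4):
    Q[i - 1, i] = p[i]                           # moving up within {1,...,4}; 4 -> 5 exits
g = np.linalg.solve(np.eye(4) - Q, np.ones(4))
print(g[0])  # 83.33... = 250/3
```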
Ex. The transition probability matrix of the Markov chain (on states $0, 1, 2, 3$) is of the form of a success runs Markov chain with all probabilities $p_i = q_i = 1/2$.

Ex. The state space of the Markov chain is the nonnegative integers. The transition probabilities are as follows:
$$P(X_{n+1} = k \mid X_n = j) = \begin{cases}
1 - \alpha, & j = 0,\ k = 0, \\
\alpha, & j = 0,\ k = 1, \\
(1 - \alpha)\beta, & j > 0,\ k = j - 1, \\
\alpha\beta + (1 - \alpha)(1 - \beta), & j > 0,\ k = j, \\
\alpha(1 - \beta), & j > 0,\ k = j + 1, \\
0, & \text{otherwise.}
\end{cases}$$
So, corresponding to (3.38), $r_0 = 1 - \alpha$, $p_0 = \alpha$, $q_i = (1 - \alpha)\beta$, $r_i = \alpha\beta + (1 - \alpha)(1 - \beta)$, and $p_i = \alpha(1 - \beta)$ for $i \ge 1$.

Pr 3.5. We are taking $M = 3$ in equation (3.35) in Section 3.5, with $P(\xi = i) = a_i$ for $i = 0, 1, 2, 3$ and $a_0 + a_1 + a_2 + a_3 = 1$; all the $a_i = 0$ for $i \ge 4$. Putting the corresponding $a_3, a_4, \ldots$ into equation (3.35), we have $\mu = 1/a_3$. Since in this case $T = \min\{n : X_n \ge 3\} = \min\{n : X_n = 3\}$, $\mu = \nu_0$, the mean time until absorption starting from 0.

Pr. We let $\xi_i$, $i = 1, 2, \ldots$, be i.i.d. discrete random variables with the uniform distribution on the integers $1, 2, \ldots, 6$. Then $X_0 = 0$ and $X_{n+1} = X_n + \xi_{n+1}$ is the Markov chain we are interested in. Set $T = \min\{n \ge 0 : X_n > 10\}$. We will solve for $u_i = P(X_T = 13 \mid X_0 = i)$ for $0 \le i \le 10$. Clearly,
$$u_i = \sum_{k=1}^{6} P(X_T = 13 \mid X_1 = i + k)\, P(\xi_1 = k) = \sum_{k=1}^{6} \frac{u_{i+k}}{6},$$
where we set $u_{13} = 1$ and $u_j = 0$ for $j \in \{11, 12, 14, 15, 16\}$.
So we have
$$\begin{aligned}
u_{10} &= \tfrac{1}{6}, \\
u_9 &= \tfrac{1}{6} u_{10} + \tfrac{1}{6}, \\
u_8 &= \tfrac{1}{6}\bigl(u_9 + u_{10}\bigr) + \tfrac{1}{6}, \\
u_7 &= \tfrac{1}{6}\bigl(u_8 + u_9 + u_{10}\bigr) + \tfrac{1}{6}, \\
u_6 &= \tfrac{1}{6}\bigl(u_7 + u_8 + u_9 + u_{10}\bigr), \\
u_5 &= \tfrac{1}{6}\bigl(u_6 + u_7 + u_8 + u_9 + u_{10}\bigr), \\
u_4 &= \tfrac{1}{6}\bigl(u_5 + u_6 + u_7 + u_8 + u_9 + u_{10}\bigr), \\
u_3 &= \tfrac{1}{6}\bigl(u_4 + u_5 + u_6 + u_7 + u_8 + u_9\bigr), \\
u_2 &= \tfrac{1}{6}\bigl(u_3 + u_4 + u_5 + u_6 + u_7 + u_8\bigr), \\
u_1 &= \tfrac{1}{6}\bigl(u_2 + u_3 + u_4 + u_5 + u_6 + u_7\bigr), \\
u_0 &= \tfrac{1}{6}\bigl(u_1 + u_2 + u_3 + u_4 + u_5 + u_6\bigr).
\end{aligned}$$
Solving the system of linear equations yields $u_0 \approx 0.1819$ (see the numerical check at the end of these solutions).

Pr. (a) $X_n$ is clearly nonnegative. We observe that for $k \ge 1$,
$$E[X_{n+1} \mid X_n = k] = \tfrac{1}{2}(k - 1) + \tfrac{1}{2}(k + 1) = k,$$
and $E[X_{n+1} \mid X_n = 0] = 0$. Thus $E[X_{n+1} \mid X_n] = X_n$, and since
$$E[X_{n+1}] = \sum_{k=0}^{\infty} E[X_{n+1} \mid X_n = k]\, P(X_n = k) = \sum_{k=0}^{\infty} k\, P(X_n = k) = E[X_n],$$
we have $E[\,|X_n|\,] < \infty$ for all $n$.

(b) Using the maximal inequality,
$$P\bigl(\max_n X_n \ge N\bigr) \le \frac{E[X_0]}{N} \quad \text{if } X_0 < N, \qquad\text{and}\qquad P\bigl(\max_n X_n \ge N\bigr) = 1 \quad \text{if } X_0 \ge N.$$
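The linear system for the $u_i$ in the dice-sum exercise above can be solved by backward substitution; a short sketch (not part of the original text) using exact rational arithmetic:

```python
from fractions import Fraction

# u_j for the values the chain can jump to above 10: only 13 counts as success.
u = {j: Fraction(0) for j in range(11, 17)}
u[13] = Fraction(1)

# Backward substitution: u_i = (1/6) * (u_{i+1} + ... + u_{i+6}) for i = 10, ..., 0.
for i in range(10, -1, -1):
    u[i] = sum(u[i + k] for k in range(1, 7)) / 6

print(u[0], float(u[0]))  # approximately 0.1819
```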