1 Rare Event Simulation using Importance Sampling and Cross-Entropy (p. 1)
Caitlin James
Supervisor: Phil Pollett
2 Simulation of a population
[Figure: sample path of the population size $X(t)$ against time $t$ (years).]
3 Outline
- A rare event problem
- Stochastic SIS Logistic Epidemic (SIS)
- Crude Monte Carlo Method (CMCM)
- Importance Sampling (IS)
- Cross-Entropy Method (CE)
- Comparison of simulation estimates
4 A rare event problem
Denote by $X(t)$ the size of the population at time $t$, and let $N$ denote the maximum population size.
We wish to estimate $\alpha = \mathbb{P}(X(t) \text{ hits } 0 \text{ before } N)$.
5 Notation
Suppose $(X(t) : t \ge 0)$ is a birth-death process on the finite state space $\mathcal{X} = \{0, 1, \dots, N\}$.
[Figure 1: Transition diagram of a finite-state birth-death process, with birth rates $\lambda_0, \lambda_1, \dots, \lambda_{N-1}$ and death rates $\mu_1, \mu_2, \dots, \mu_N$ linking states $0, 1, \dots, N$.]
Let $(X_m, m = 0, 1, \dots)$ denote the corresponding jump chain.
Let $\mathcal{A}$ be the collection of all sample paths of $(X_m)$, and let $\mathcal{A}_0$ be the collection of all paths that hit 0 before $N$.
Let $X = (X_0, X_1, \dots)$ be a random sample path in $\mathcal{A}$, and let $A$ be the event $\{X \in \mathcal{A}_0\}$.
9 Equation
Then we can write
$$\alpha = \mathbb{P}(A) = \mathbb{E}_f H(X) = \int_{\mathcal{A}} H(x)\, f(x; u, P)\, \mu(dx),$$
where $H$ is the indicator function of the rare event $A$, defined by
$$H(x) = \begin{cases} 1 & \text{if } x \in \mathcal{A}_0 \\ 0 & \text{if } x \notin \mathcal{A}_0, \end{cases}$$
and
$$f(x; u, P) = u(x_0) \prod_{k=0}^{m-1} P(x_k, x_{k+1}).$$
Here $u = (u(i) : i \in \mathcal{X})$ is the initial distribution, and $P = (P(i, j) : i, j \in \mathcal{X})$ is the one-step transition matrix of $(X_m)$.
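The path density above is simple to evaluate directly. A minimal sketch (Python; the function name and toy chain are mine, not from the talk):

```python
import numpy as np

def path_density(path, u, P):
    """f(x; u, P) = u(x_0) * prod_k P(x_k, x_{k+1}): the initial
    probability times the one-step transition probabilities."""
    dens = u[path[0]]
    for a, b in zip(path[:-1], path[1:]):
        dens *= P[a, b]
    return dens

# Toy two-state chain (not the SIS model).
u = np.array([0.5, 0.5])
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(path_density([0, 1, 1], u, P))  # = 0.5 * 0.1 * 0.8
```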
13 Stochastic SIS Logistic Epidemic
The SIS model is a finite-state birth-death process for which the origin is an absorbing state. The rate of infection per contact is denoted by $\lambda$, and $\mu$ denotes the per-capita death rate.
[Figure 2: Transition diagram of the SIS model, with birth rates $\lambda_1, \dots, \lambda_{N-1}$ and death rates $\mu_1, \dots, \mu_N$; there are no transitions out of state 0.]
15 Stochastic SIS Logistic Epidemic
The non-zero transition rates are defined as
$$\lambda_i = \lambda \tfrac{1}{N}\, i (N - i) \quad \text{and} \quad \mu_i = \mu i.$$
The jump chain $(X_m)$ therefore has jump probabilities
$$p_i = \frac{\lambda (N - i)}{\mu N + \lambda (N - i)} \quad \text{and} \quad q_i = 1 - p_i = \frac{\mu N}{\mu N + \lambda (N - i)}.$$
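These jump probabilities are easy to compute; note that the common factor $i$ in $\lambda_i$ and $\mu_i$ cancels. A short sketch (Python; names are mine):

```python
def sis_jump_probs(i, N, lam, mu):
    """Jump probabilities of the SIS jump chain at 0 < i < N.
    The common factor i in the rates lambda_i and mu_i cancels:
    p_i = lam*(N - i) / (mu*N + lam*(N - i)), q_i = 1 - p_i."""
    p = lam * (N - i) / (mu * N + lam * (N - i))
    return p, 1.0 - p

# Parameters used later in the talk: N = 10, lambda = 0.9, mu = 0.1.
p8, q8 = sis_jump_probs(8, 10, 0.9, 0.1)
print(p8, q8)  # p_8 = 1.8 / 2.8
```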
17 Crude Monte Carlo Method
We can estimate our probability using the CMCM. This involves simulating $n$ replicates, $X_1, \dots, X_n$, from $f(\,\cdot\,; u, P)$ and setting
$$\hat{\alpha} = \frac{1}{n} \sum_{k=1}^{n} H(X_k).$$
We simulate the model until the process hits $N$ or 0, and count 1 whenever the process hits 0.
The number of trials needed to get one successful trial has a geometric distribution with expected value $1/p$, where $p$ is the probability of a successful trial.
19 CMCM example
If we start in state 8 and wish to estimate the probability of reaching 0 before 10, then we require about $1/\alpha$ runs to see one successful path hit 0 before 10. (Failure is hitting 10 before 0.)
The exact value of $\alpha$ starting in state 8 is $1.18 \times 10^{-5}$. Thus we need around $1/(1.18 \times 10^{-5}) \approx 85{,}000$ runs to see our first path hit 0 before 10.
If we use the CMCM, then we would require many more than 85,000 runs to obtain a reasonable estimate of $\alpha$.
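A crude Monte Carlo sketch illustrating this (Python; function name and seed are mine). With the talk's parameters, a moderate number of runs will almost always report an estimate of 0:

```python
import random

def crude_mc(n, N, lam, mu, x0, seed=1):
    """CMCM: simulate n SIS jump-chain paths from x0 until absorption
    at 0 or N, and return the fraction that hit 0 first."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = x0
        while 0 < x < N:
            p = lam * (N - x) / (mu * N + lam * (N - x))  # up-jump prob
            x += 1 if rng.random() < p else -1
        hits += (x == 0)
    return hits / n

# With alpha about 1.18e-5, 10,000 runs from state 8 almost surely give 0.
print(crude_mc(10_000, 10, 0.9, 0.1, 8))
```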
22 Importance Sampling
Take:
- $\tilde{u}$ to be an alternative initial distribution;
- $\tilde{P}$ to be an alternative transition matrix;
- $g(x; \tilde{u}, \tilde{P}) = \tilde{u}(x_0) \prod_{k=0}^{m-1} \tilde{P}(x_k, x_{k+1})$.
The following conditions must also hold: $u(i) > 0$ implies $\tilde{u}(i) > 0$, and $P(i, j) > 0$ implies $\tilde{P}(i, j) > 0$.
26 Importance Sampling
Under the alternative measure $g$ we can estimate $\alpha$ via
$$\alpha = \mathbb{E}_f H(X) = \int_{\mathcal{A}} H(x)\, \frac{f(x; u, P)}{g(x; \tilde{u}, \tilde{P})}\, g(x; \tilde{u}, \tilde{P})\, \mu(dx) = \mathbb{E}_g \big( H(X)\, L_T(X; u, P, \tilde{u}, \tilde{P});\; T < \infty \big),$$
where
$$L_m(x; u, P, \tilde{u}, \tilde{P}) = \frac{f(x; u, P)}{g(x; \tilde{u}, \tilde{P})} = \frac{u(x_0) \prod_{k=0}^{m-1} P(x_k, x_{k+1})}{\tilde{u}(x_0) \prod_{k=0}^{m-1} \tilde{P}(x_k, x_{k+1})},$$
with
$$T = \inf\{m : X_m = 0 \text{ or } X_m = N\} \quad (\text{the stopping time}).$$
29 Importance Sampling estimator
We can estimate $\alpha$ using the importance sampling (IS) estimator, defined by
$$\hat{\alpha} = \frac{1}{n} \sum_{i=1}^{n} H(X_i)\, L_T(X_i; u, P, \tilde{u}, \tilde{P}),$$
where (recall)
$$L_m(x; u, P, \tilde{u}, \tilde{P}) = \frac{f(x; u, P)}{g(x; \tilde{u}, \tilde{P})} = \frac{u(x_0) \prod_{k=0}^{m-1} P(x_k, x_{k+1})}{\tilde{u}(x_0) \prod_{k=0}^{m-1} \tilde{P}(x_k, x_{k+1})}.$$
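A sketch of the IS estimator for the SIS example (Python; names and the fixed seed are mine, and this is not the talk's code). It simulates under a tilted chain with the roles of $\lambda$ and $\mu$ swapped (the initial change of measure used later in the talk) and weights each path that hits 0 by its likelihood ratio:

```python
import random

def is_estimate(n, N, lam, mu, x0, seed=1):
    """IS estimator: simulate under a tilted jump chain (lambda and mu
    swapped) and average H(X) * L_T(X) over n replicates."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x, L = x0, 1.0
        while 0 < x < N:
            p = lam * (N - x) / (mu * N + lam * (N - x))   # original p_i
            pt = mu * (N - x) / (lam * N + mu * (N - x))   # tilted p_i
            if rng.random() < pt:                          # up-jump
                L *= p / pt
                x += 1
            else:                                          # down-jump
                L *= (1 - p) / (1 - pt)
                x -= 1
        if x == 0:                  # H(X) = 1 only on the rare event
            total += L
    return total / n

# Unbiased for alpha (exact value 1.18e-5 from state 8), though noisy.
print(is_estimate(50_000, 10, 0.9, 0.1, 8))
```

The estimator is unbiased for any admissible tilt, but this naive swap can produce heavy-tailed weights; it serves only as a starting point, which the CE iterations then refine.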
30 Optimal IS density
Which change of measure gives the IS estimator with smallest variance?
The smallest variance is obtained when $g = g^*$, the optimal importance sampling density, given by
$$g^*(x) = \frac{H(x)\, f(x; u, P)}{\int_{\mathcal{A}} H(x)\, f(x; u, P)\, \mu(dx)}.$$
If $H(x) \ge 0$, then
$$g^*(x) = \frac{H(x)\, f(x; u, P)}{\alpha}.$$
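To see why $g^*$ is optimal, note that under $g^*$ every sampled path contributes the same value, so the estimator has zero variance (a one-line check using the expression for $g^*$; of course $g^*$ cannot be used directly, since it involves the unknown $\alpha$):

```latex
H(x)\, \frac{f(x; u, P)}{g^{*}(x)}
  = H(x)\, \frac{f(x; u, P)\,\alpha}{H(x)\, f(x; u, P)}
  = \alpha \qquad \text{whenever } H(x) \neq 0,
```

and paths with $H(x) = 0$ receive no mass under $g^*$, so they are never sampled.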
33 Cross-Entropy Method
The Kullback-Leibler Cross-Entropy (CE) measure defines a "distance" between two densities $g$ and $h$:
$$D(g, h) = \int g(x) \ln \frac{g(x)}{h(x)}\, \mu(dx) = \int g(x) \ln g(x)\, \mu(dx) - \int g(x) \ln h(x)\, \mu(dx).$$
The purpose of CE is to choose the IS density $h$ such that the "distance" between the optimal IS density $g^*$ and the density $h$ is as small as possible.
34 Properties of CE
1. $D(\cdot, \cdot)$ is non-symmetric, i.e. $D(g, h) \ne D(h, g)$ in general, so $D(g, h)$ is not a true distance between $g$ and $h$ in the formal sense, although
2. $D(g, h) \ge D(g, g) = 0$.
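A tiny numerical illustration of both properties for discrete densities (Python; the function name and example densities are mine):

```python
import math

def kl(g, h):
    """Kullback-Leibler cross-entropy D(g, h) = sum_x g(x) ln(g(x)/h(x))
    for discrete densities on a common support."""
    return sum(gi * math.log(gi / hi) for gi, hi in zip(g, h) if gi > 0)

g = [0.5, 0.5]
h = [0.9, 0.1]
print(kl(g, g))            # D(g, g) = 0
print(kl(g, h), kl(h, g))  # both positive, but not equal
```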
35 Restrict the IS density
If we restrict the IS density to belong to some family $\mathcal{F}$ which contains both the original density
$$f(x; u, P) = u(x_0) \prod_{k=0}^{m-1} P(x_k, x_{k+1})$$
and the alternative density
$$f(x; \tilde{u}, \tilde{P}) = \tilde{u}(x_0) \prod_{k=0}^{m-1} \tilde{P}(x_k, x_{k+1}),$$
36 CE optimisation problem
then the CE method aims to solve the parametric optimisation problem
$$\min_{(\tilde{u}, \tilde{P})} D\big(g^*, f(\,\cdot\,; \tilde{u}, \tilde{P})\big),$$
where (recall)
$$g^*(x) = \frac{H(x)\, f(x; u, P)}{\alpha}.$$
37 CE reference parameter
Since $f(\,\cdot\,; u, P)$ does not depend on $(\tilde{u}, \tilde{P})$, minimising the CE distance between $g^*$ and $f(\,\cdot\,; \tilde{u}, \tilde{P})$ is equivalent to maximising, with respect to $(\tilde{u}, \tilde{P})$,
$$\int H(x)\, f(x; u, P) \ln f(x; \tilde{u}, \tilde{P})\, \mu(dx) = \mathbb{E}_{(u,P)} H(X) \ln f(X; \tilde{u}, \tilde{P}).$$
Assuming $H(X) \ge 0$, the optimal $(\tilde{u}, \tilde{P})$ (with respect to CE) is the solution to
$$(\tilde{u}^*, \tilde{P}^*) = \arg\max_{(\tilde{u}, \tilde{P})} \mathbb{E}_{(u,P)} H(X) \ln f(X; \tilde{u}, \tilde{P}).$$
Since the initial distribution is fixed, only the transition matrix needs to be optimised:
$$\tilde{P}^* = \arg\max_{\tilde{P}} \mathbb{E}_{(u,P)} H(X) \ln f(X; u, \tilde{P}).$$
40 CE reference parameter
The optimal transition probability matrix is given by
$$\tilde{P}^*(i, j) = \frac{\mathbb{E}_{(u,P)}\big[ H(X) \sum_{k : X_k = i} 1_{\{X_{k+1} = j\}} \big]}{\mathbb{E}_{(u,P)}\big[ H(X) \sum_{k : X_k = i} 1 \big]}.$$
We can approximate this optimal transition probability matrix by implementing IS to obtain
$$\tilde{P}_{l+1}(i, j) = \frac{\mathbb{E}_{(u,\tilde{P}_l)}\big[ H(X)\, L_T(X; u, P, u, \tilde{P}_l) \sum_{k : X_k = i} 1_{\{X_{k+1} = j\}} \big]}{\mathbb{E}_{(u,\tilde{P}_l)}\big[ H(X)\, L_T(X; u, P, u, \tilde{P}_l) \sum_{k : X_k = i} 1 \big]},$$
and in turn the sample-average estimate over replicates $X^{(1)}, \dots, X^{(n)}$:
$$\tilde{P}_{l+1}(i, j) \approx \frac{\sum_{r=1}^{n} H(X^{(r)})\, L_T(X^{(r)}; u, P, u, \tilde{P}_l) \sum_{k : X^{(r)}_k = i} 1_{\{X^{(r)}_{k+1} = j\}}}{\sum_{r=1}^{n} H(X^{(r)})\, L_T(X^{(r)}; u, P, u, \tilde{P}_l) \sum_{k : X^{(r)}_k = i} 1}.$$
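One CE iteration can be sketched as follows (Python; since the chain is birth-death, I represent each transition matrix by its vector of up-jump probabilities, and all names are mine, not the talk's code). Transition counts along each path are weighted by $H(X)\, L_T(X)$, as in the sample-average update:

```python
import numpy as np

def ce_update(p, pt, n, N, x0, seed=1):
    """One CE iteration (sample-average form): simulate n paths under
    the tilted up-probabilities pt, weight the transition counts of
    each successful path by H(X) * L_T(X), and re-estimate."""
    rng = np.random.default_rng(seed)
    w_up = np.zeros(N + 1)    # sum over paths of H*L*(up-jumps from i)
    w_vis = np.zeros(N + 1)   # sum over paths of H*L*(visits to i)
    for _ in range(n):
        x, L = x0, 1.0
        ups, vis = np.zeros(N + 1), np.zeros(N + 1)
        while 0 < x < N:
            vis[x] += 1
            if rng.random() < pt[x]:            # up-jump under pt
                L *= p[x] / pt[x]
                ups[x] += 1
                x += 1
            else:                               # down-jump under pt
                L *= (1 - p[x]) / (1 - pt[x])
                x -= 1
        if x == 0:                              # H(X) = 1: path hit 0
            w_up += L * ups
            w_vis += L * vis
    new = pt.copy()
    seen = w_vis > 0                            # only update visited states
    new[seen] = w_up[seen] / w_vis[seen]
    return new

# SIS example, starting from the swapped-rate tilt of the talk.
N, lam, mu = 10, 0.9, 0.1
i = np.arange(N + 1)
p = lam * (N - i) / (mu * N + lam * (N - i))    # original up-probs
pt = mu * (N - i) / (lam * N + mu * (N - i))    # initial tilt
pt1 = ce_update(p, pt, 2_000, N, x0=8)
print(np.round(pt1[1:10], 4))
```

Iterating `ce_update` a few times (the talk uses 4 CE runs) drives the tilted chain toward the conditional law of paths that hit 0 before $N$.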
43 Initial CE parameter
Original parameters:
$$p_i = \frac{\lambda (N - i)}{\mu N + \lambda (N - i)} \quad \text{and} \quad q_i = 1 - p_i = \frac{\mu N}{\mu N + \lambda (N - i)}.$$
Initial change of measure (the roles of $\lambda$ and $\mu$ swapped):
$$\tilde{p}_i = \frac{\mu (N - i)}{\lambda N + \mu (N - i)} \quad \text{and} \quad \tilde{q}_i = 1 - \tilde{p}_i = \frac{\lambda N}{\lambda N + \mu (N - i)}.$$
44 Estimation of the probability of reaching 0 before N
[Figures (two plots): estimated probability of reaching 0 before $N$ plotted against the initial state, showing the exact value, the CMC estimate, the IS estimate and the IS 95% confidence interval.]
46 Comparison of simulation estimates
$N = 10$, $\lambda = 0.9$, $\mu = 0.1$, $\rho = 0.11$, $X_0 = 8$, number of CE runs $= 4$.
[Table: IS and CMCM estimates against the exact value for a range of sample sizes; the numerical entries did not survive transcription.]
47 Comparison of CE runs
$N = 10$, $\lambda = 0.9$, $\mu = 0.1$, $\rho = 0.11$, $X_0 = 8$, sample size $= 1000$, exact value $= 1.18 \times 10^{-5}$.
[Table: for each CE run, the IS estimate, twice its standard deviation, and selected tilted jump probabilities $\tilde{p}_1$, $\tilde{p}_5$, …; the numerical entries did not survive transcription.]
48 Acknowledgements
- Phil Pollett
- Joshua Ross
- David Sirl
- Ben Gladwin
More informationSUFFICIENT STATISTICS
SUFFICIENT STATISTICS. Introduction Let X (X,..., X n ) be a random sample from f θ, where θ Θ is unknown. We are interested using X to estimate θ. In the simple case where X i Bern(p), we found that the
More informationVariational sampling approaches to word confusability
Variational sampling approaches to word confusability John R. Hershey, Peder A. Olsen and Ramesh A. Gopinath {jrhershe,pederao,rameshg}@us.ibm.com IBM, T. J. Watson Research Center Information Theory and
More informationDEEP LEARNING CHAPTER 3 PROBABILITY & INFORMATION THEORY
DEEP LEARNING CHAPTER 3 PROBABILITY & INFORMATION THEORY OUTLINE 3.1 Why Probability? 3.2 Random Variables 3.3 Probability Distributions 3.4 Marginal Probability 3.5 Conditional Probability 3.6 The Chain
More information1 Lecture 18: The chain rule
1 Lecture 18: The chain rule 1.1 Outline Comparing the graphs of sin(x) an sin(2x). The chain rule. The erivative of a x. Some examples. 1.2 Comparing the graphs of sin(x) an sin(2x) We graph f(x) = sin(x)
More informationP (A G) dp G P (A G)
First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume
More informationf(x) f(z) c x z > 0 1
INVERSE AND IMPLICIT FUNCTION THEOREMS I use df x for the linear transformation that is the differential of f at x.. INVERSE FUNCTION THEOREM Definition. Suppose S R n is open, a S, and f : S R n is a
More informationChapter 6. Estimation of Confidence Intervals for Nodal Maximum Power Consumption per Customer
Chapter 6 Estimation of Confidence Intervals for Nodal Maximum Power Consumption per Customer The aim of this chapter is to calculate confidence intervals for the maximum power consumption per customer
More informationMonte Carlo Simulation. CWR 6536 Stochastic Subsurface Hydrology
Monte Carlo Simulation CWR 6536 Stochastic Subsurface Hydrology Steps in Monte Carlo Simulation Create input sample space with known distribution, e.g. ensemble of all possible combinations of v, D, q,
More informationAverage-cost temporal difference learning and adaptive control variates
Average-cost temporal difference learning and adaptive control variates Sean Meyn Department of ECE and the Coordinated Science Laboratory Joint work with S. Mannor, McGill V. Tadic, Sheffield S. Henderson,
More informationTo get horizontal and slant asymptotes algebraically we need to know about end behaviour for rational functions.
Concepts: Horizontal Asymptotes, Vertical Asymptotes, Slant (Oblique) Asymptotes, Transforming Reciprocal Function, Sketching Rational Functions, Solving Inequalities using Sign Charts. Rational Function
More information016A Homework 10 Solution
016A Homework 10 Solution Jae-young Park November 2, 2008 4.1 #14 Write each expression in the form of 2 kx or 3 kx, for a suitable constant k; (3 x 3 x/5 ) 5, (16 1/4 16 3/4 ) 3x Solution (3 x 3 x/5 )
More informationProblem Set 8
Eli H. Ross eross@mit.edu Alberto De Sole November, 8.5 Problem Set 8 Exercise 36 Let X t and Y t be two independent Poisson processes with rate parameters λ and µ respectively, measuring the number of
More informationBrownian Motion. An Undergraduate Introduction to Financial Mathematics. J. Robert Buchanan. J. Robert Buchanan Brownian Motion
Brownian Motion An Undergraduate Introduction to Financial Mathematics J. Robert Buchanan 2010 Background We have already seen that the limiting behavior of a discrete random walk yields a derivation of
More informationTD Learning. Sean Meyn. Department of Electrical and Computer Engineering University of Illinois & the Coordinated Science Laboratory
TD Learning Sean Meyn Department of Electrical and Computer Engineering University of Illinois & the Coordinated Science Laboratory Joint work with I.-K. Cho, Illinois S. Mannor, McGill V. Tadic, Sheffield
More informationAn Overview of Methods for Applying Semi-Markov Processes in Biostatistics.
An Overview of Methods for Applying Semi-Markov Processes in Biostatistics. Charles J. Mode Department of Mathematics and Computer Science Drexel University Philadelphia, PA 19104 Overview of Topics. I.
More informationMa 530. Special Methods for First Order Equations. Separation of Variables. Consider the equation. M x,y N x,y y 0
Ma 530 Consider the equation Special Methods for First Order Equations Mx, Nx, 0 1 This equation is first order and first degree. The functions Mx, and Nx, are given. Often we write this as Mx, Nx,d 0
More informationStochastic Modelling Unit 1: Markov chain models
Stochastic Modelling Unit 1: Markov chain models Russell Gerrard and Douglas Wright Cass Business School, City University, London June 2004 Contents of Unit 1 1 Stochastic Processes 2 Markov Chains 3 Poisson
More informationRobustesse des techniques de Monte Carlo dans l analyse d événements rares
Institut National de Recherche en Informatique et Automatique Institut de Recherche en Informatique et Systèmes Aléatoires Robustesse des techniques de Monte Carlo dans l analyse d événements rares H.
More informationApril 9, 2009 Name The problems count as marked. The total number of points available is 160. Throughout this test, show your work.
April 9, 009 Name The problems count as marked The total number of points available is 160 Throughout this test, show your work 1 (15 points) Consider the cubic curve f(x) = x 3 + 3x 36x + 17 (a) Build
More informationIntroduction to Probability and Statistics Slides 3 Chapter 3
Introduction to Probability and Statistics Slides 3 Chapter 3 Ammar M. Sarhan, asarhan@mathstat.dal.ca Department of Mathematics and Statistics, Dalhousie University Fall Semester 2008 Dr. Ammar M. Sarhan
More informationInformation geometry for bivariate distribution control
Information geometry for bivariate distribution control C.T.J.Dodson + Hong Wang Mathematics + Control Systems Centre, University of Manchester Institute of Science and Technology Optimal control of stochastic
More informationSMSTC (2007/08) Probability.
SMSTC (27/8) Probability www.smstc.ac.uk Contents 12 Markov chains in continuous time 12 1 12.1 Markov property and the Kolmogorov equations.................... 12 2 12.1.1 Finite state space.................................
More informationDifferentiation Review, Part 1 (Part 2 follows; there are answers at the end of each part.)
Differentiation Review 1 Name Differentiation Review, Part 1 (Part 2 follows; there are answers at the end of each part.) Derivatives Review: Summary of Rules Each derivative rule is summarized for you
More informationStat 516, Homework 1
Stat 516, Homework 1 Due date: October 7 1. Consider an urn with n distinct balls numbered 1,..., n. We sample balls from the urn with replacement. Let N be the number of draws until we encounter a ball
More informationRandom Numbers and Simulation
Random Numbers and Simulation Generating random numbers: Typically impossible/unfeasible to obtain truly random numbers Programs have been developed to generate pseudo-random numbers: Values generated
More informationSubject CT4 Models. October 2015 Examination INDICATIVE SOLUTION
Institute of Actuaries of India Subject CT4 Models October 2015 Examination INDICATIVE SOLUTION Introduction The indicative solution has been written by the Examiners with the aim of helping candidates.
More informationStochastic Processes and Monte-Carlo Methods. University of Massachusetts: Spring Luc Rey-Bellet
Stochastic Processes and Monte-Carlo Methods University of Massachusetts: Spring 2010 Luc Rey-Bellet Contents 1 Random variables and Monte-Carlo method 3 1.1 Review of probability............................
More informationThe Monte Carlo Computation Error of Transition Probabilities
Zuse Institute Berlin Takustr. 7 495 Berlin Germany ADAM NILSN The Monte Carlo Computation rror of Transition Probabilities ZIB Report 6-37 (July 206) Zuse Institute Berlin Takustr. 7 D-495 Berlin Telefon:
More informationA random variable is a quantity whose value is determined by the outcome of an experiment.
Random Variables A random variable is a quantity whose value is determined by the outcome of an experiment. Before the experiment is carried out, all we know is the range of possible values. Birthday example
More informationA COMPARISON OF OUTPUT-ANALYSIS METHODS FOR SIMULATIONS OF PROCESSES WITH MULTIPLE REGENERATION SEQUENCES. James M. Calvin Marvin K.
Proceedings of the 00 Winter Simulation Conference E. Yücesan, C.-H. Chen, J. L. Snowdon, and J. M. Charnes, eds. A COMPARISON OF OUTPUT-ANALYSIS METHODS FOR SIMULATIONS OF PROCESSES WITH MULTIPLE REGENERATION
More information