Lecture 2 : CS6205 Advanced Modeling and Simulation
Lee Hwee Kuan, 21 Aug

For the purpose of learning stochastic simulations for the first time, we shall only consider probabilities on finite discrete events. For example, a coin flip or a dice throw generates finite discrete events, like head-tail or the numbers 1 to 6 respectively. On the other hand, an example of a continuous event would be the height of a person.

Let Ω be a finite discrete set. For the outcome of a dice throw, the members of Ω will be the numbers 1, 2, ..., 6. The number of elements is 6, which is of course finite. Let x ∈ Ω be a random variable; then we can define a probability distribution P(x) such that

    0 ≤ P(x) ≤ 1    (1)

    Σ_{x ∈ Ω} P(x) = 1    (2)

For the dice throw, we can assign the probabilities P(x = 1) = 1/6, P(x = 2) = 1/6, ..., P(x = 6) = 1/6.

1 Pseudo-random number generators

Pseudo-random number generators are routines that return pseudo-random bit patterns. The set of all possible bit patterns is clearly a finite discrete set. However, these bit patterns are often converted into floating point numbers, normalized to lie between zero and one. It is enough to generate random bits with a uniform distribution; it will be shown later that non-uniform random numbers can be derived from uniform random bits.

What do uniform random bits mean? Consider a bit string of length n, and let Ω be the set of all possible bit strings of this length. There are 2^n possible bit strings. Let U() be a routine that returns a random bit string, and let x = U() be a random variable. Then the probability distribution is P(x) = 1/2^n for all x ∈ Ω.

1.1 General implementation of uniform random number routines

As discussed in the first lecture, a computer cannot generate true randomness. Most routines generate a sequence of numbers X_0, X_1, X_2, ... and claim that they are random and uncorrelated. In fact, by design, the number X_n depends on some of the numbers X_q, q < n.
Hence all pseudo-random number generators that I know of are not strictly uncorrelated as claimed.

1.2 Congruential random number generator

Given a number X_{n-1}, to generate the next number, multiply X_{n-1} by some constant c, add another constant a_0, and finally take the modulus:

    X_n = (c X_{n-1} + a_0) mod N_max    (3)
Figure 1: Random pattern generated by a congruential random number generator.

In particular, when we choose a_0 = 0, c = 16807, X_0 = 5 and a suitable N_max, we get the sequence (4), which looks pretty random. If we plot consecutive points in the xy plane, we get a random pattern as shown in Fig. 1. This random number generator is called the congruential random number generator.
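As a minimal sketch (not part of the original notes), Eq. (3) takes only a few lines; since the value of N_max is not stated above, the common Park-Miller modulus N_max = 2^31 − 1 is assumed here:

```python
# Sketch of the congruential generator of Eq. (3).
# Assumption: N_max = 2**31 - 1 (the Park-Miller modulus); the notes do not state it.
def congruential(x_prev, c=16807, a0=0, n_max=2**31 - 1):
    """Return X_n = (c * X_{n-1} + a0) mod N_max."""
    return (c * x_prev + a0) % n_max

# Generate the first few numbers of the sequence, starting from X_0 = 5,
# and normalize them into [0, 1) as described in Section 1.
seq = [5]
for _ in range(5):
    seq.append(congruential(seq[-1]))
uniform = [x / (2**31 - 1) for x in seq]
```

Plotting consecutive pairs (uniform[i], uniform[i+1]) reproduces the kind of scatter shown in Fig. 1.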
1.3 Shift Register Random Number Generator

Another random number generator, called the shift register generator, is as follows:

    X_n = X_{n-p} XOR X_{n-q}    (5)

Assuming a random sequence has been generated up to X_{n-1}, to generate the nth random number, take two earlier numbers in the sequence and perform a bitwise XOR operation. One should choose the values of p and q carefully, and generate the initial random sequence carefully, to get a long sequence of random numbers. Good values of p and q are

    (p, q) = (98, 27), (250, 103), (1279, 216), (1279, 418),
             (9689, 84), (9689, 471), (9689, 1836), (9689, 2444), (9689, 4187)    (6)

1.4 Lagged Fibonacci Generators

A variant of the shift register generator is

    X_n = X_{n-p} ⊙ X_{n-q}    (7)

where ⊙ denotes some binary operation, for example addition modulo N_max. Again, the values of p and q must be chosen carefully.

1.5 What does random mean?

I have shown you how we can generate a sequence of random numbers without actually defining what we mean by random. If we flip an unbiased coin, we say the outcome is random. But it is more than just random; it is random with equal probability of landing on head or tail. Suppose we stick a little piece of chewing gum on one side of the coin so that it is not balanced. Now flip the coin. The outcome will still be random, but this time with a higher probability of landing on one side (say head) than the other. Now you see the difference between simply saying random and saying random with equal probability. Hence when we claim randomness, we must state randomness with respect to what probability distribution. The congruential, shift register and lagged Fibonacci generators produce a sequence of random numbers in which each number occurs with (almost) equal probability. We call this the uniform distribution.

2 Sampling From a Distribution

Suppose we have random number generators that can generate a sequence of uniformly distributed numbers between zero and one, U[0, 1].
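One such routine can be sketched from the shift-register update of Eq. (5), using (p, q) = (250, 103) from Eq. (6); seeding the initial p words from Python's own generator is an illustrative shortcut, not part of the notes:

```python
import random

p, q, bits = 250, 103, 32
random.seed(12345)
# The initial sequence X_0, ..., X_{p-1}; `state` always holds the last p words.
state = [random.getrandbits(bits) for _ in range(p)]

def shift_register():
    """Return the next word X_n = X_{n-p} XOR X_{n-q}, normalized to [0, 1)."""
    x = state[-p] ^ state[-q]   # bitwise XOR of two earlier numbers in the sequence
    state.append(x)
    del state[0]                # discard the oldest word, keeping the last p
    return x / 2**bits          # normalize into [0, 1) as described in Section 1

sample = [shift_register() for _ in range(1000)]
```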
We denote a sample from a uniform distribution as

    x ∼ U[0, 1]    (8)

How can we make use of the values of x to generate another sequence of random numbers drawn from a different distribution? For example, we may want to generate random numbers from the Gaussian distribution,

    y ∼ (1/√(2π)) exp(−y²/2)    (9)
Using the Box-Muller method, we first calculate the cumulative distribution,

    F(x) = ∫_{−∞}^{x} (1/√(2π)) exp(−y²/2) dy    (10)

Then set y as the inverse function of F(x), y = F⁻¹(x), where x ∼ U[0, 1]. Performing the integration, we get

    y_1 = √(−2 ln x_1) cos(2π x_2)    (11)

    y_2 = √(−2 ln x_1) sin(2π x_2)    (12)

3 Importance Sampling

We do not always know how to sample from a distribution effectively, for example when the distribution is not discrete, or when we cannot enumerate all the probabilities due to a large probability space. In this case, importance sampling can be an effective way to perform sampling and calculate statistics.

I have shown you how to sample from a Gaussian distribution. But sampling from a general distribution h(x) can be hard, even when h(x) can be evaluated precisely. Suppose we want to calculate the mean of x over the distribution h(x),

    ⟨x⟩_{h(x)} = ∫ x h(x) dx    (13)

If we can sample x_i from h(x), then the mean is simply

    ⟨x⟩_{h(x)} ≈ (1/n) Σ_{i=1}^{n} x_i    (14)

for some large value of n. Now suppose we can only sample from a Gaussian g instead, x_i ∼ g(0, 1). We can still estimate the mean of x over h(x) using

    ⟨x⟩_{h(x)} ≈ (1/n) Σ_{i=1}^{n} x_i h(x_i)/g(x_i) ≈ ⟨x h(x)/g(x)⟩_{g(x)}    (15)

Eq. (15) is in principle correct, except for cases when g goes to zero: a point where g goes to zero will never be sampled. Hence the accuracy of the mean estimate may suffer when h is far from g. In the ideal case, when h = g, the left-hand side and right-hand side of Eq. (15) are equal. If h ≈ g then the sampling will be accurate.

4 Markov Chain Monte Carlo

Consider a game of gambling between two parties, A and B. Each is given $2; a coin is flipped, and A passes $1 to B if the coin lands on head, otherwise B passes $1 to A. The game is played until one party wins all the money. There are five discrete outcomes (we often call an outcome a state). Let x be the amount of money A owns (so B owns 4 − x dollars). All possible outcomes (states) are x = 0, 1, 2, 3, 4.
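As an illustrative sketch (the function name play_game is ours, not from the notes), the game can be simulated directly by flipping a fair coin until one party is ruined:

```python
import random

def play_game(rng, start=2, total=4):
    """Simulate one game; return A's final money, either 0 or `total`."""
    x = start                      # the state: amount of money A owns
    while x not in (0, total):     # play until one party has all the money
        if rng.random() < 0.5:     # head: A passes $1 to B
            x -= 1
        else:                      # tail: B passes $1 to A
            x += 1
    return x

rng = random.Random(0)
n_games = 10000
a_wins = sum(play_game(rng) == 4 for _ in range(n_games))
# By symmetry, A should win roughly half of the games.
```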
We can also discretize time into the number of times the coin is flipped; we denote it by t = 0, 1, 2, 3, ..., where t = 0 is the beginning of the game, and denote the amount of money A owns at time t as x_t. Note that x_t is a random variable; its value depends on
the side on which the coin lands. However, we know for sure that x_0 = 2. In other words, we can say P_0(x = 2) = 1, where the subscript 0 is used to denote the probability distribution of x at t = 0. Also P_0(x ≠ 2) = 0. The probability distribution of x after the coin is flipped once is P_1(x = 0) = 0, P_1(x = 1) = 0.5, P_1(x = 2) = 0, P_1(x = 3) = 0.5 and P_1(x = 4) = 0. For simplicity of notation, we shall drop the x = symbol and just write

    P_1(0) = 0, P_1(1) = 0.5, P_1(2) = 0, P_1(3) = 0.5, P_1(4) = 0    (16)

The notation can be simplified further by writing P as a vector,

    P_1 = (P_1(0), P_1(1), P_1(2), P_1(3), P_1(4))^T = (0, 0.5, 0, 0.5, 0)^T    (17)

Similarly P_0 = (0, 0, 1, 0, 0)^T, where T denotes the transpose. What will be the probability distribution of x after two coin flips?

    P_2 = (0.25, 0, 0.5, 0, 0.25)^T    (18)

What is the probability distribution after n flips? We also notice the relationship

    P_1 = M P_0    (19)

where M is called the transition matrix. Explicitly, Eq. (19) is given by

    ( 0 )   ( 1  0.5  0    0    0 ) ( 0 )
    (0.5)   ( 0  0    0.5  0    0 ) ( 0 )
    ( 0 ) = ( 0  0.5  0    0.5  0 ) ( 1 )    (20)
    (0.5)   ( 0  0    0.5  0    0 ) ( 0 )
    ( 0 )   ( 0  0    0    0.5  1 ) ( 0 )

A similar relationship can be found using the same M:

    P_2 = M P_1    (21)

Using Eq. (19), P_2 = M² P_0. In general,

    P_n = Mⁿ P_0    (22)

A plot of P_n versus n is shown in Fig. 2. Note that P_n → (0.5, 0, 0, 0, 0.5)^T, which is a very intuitive result.

4.1 Transition Matrices

How did I arrive at the explicit transition matrix in Eq. (20)? The basic principle is Bayes' theorem,

    P(x_t, x_{t−1}) = M(x_t | x_{t−1}) P_{t−1}(x_{t−1})    (23)
Figure 2: Plot of P_n versus n, showing P(x=0), P(x=4), P(x=1), P(x=3) and P(x=2).

P(x_t, x_{t−1}) is the joint distribution of observing x_t and x_{t−1}. M(x_t | x_{t−1}) is the conditional probability, and P_{t−1}(x_{t−1}) is the probability of observing x_{t−1}. Then P_t(x_t) is

    P_t(x_t) = Σ_{x_{t−1}} P(x_t, x_{t−1}) = Σ_{x_{t−1}} M(x_t | x_{t−1}) P_{t−1}(x_{t−1})    (24)

If we write P_t, P_{t−1} as vectors and M as a matrix, then Eq. (24) is in the form of Eq. (19),

    P_t = M P_{t−1}    (25)

More examples of transition matrices follow the same principle.

4.2 Markov Chains

The transition probability M(x_t | x_{t−1}) is conditioned only on the previous time point. This is in fact a special case; a more general case is when the probability is conditioned on many previous time points, for example M(x_t | x_{t−1}, x_{t−2}, ..., x_0). In a sense, our gambling game only remembers the previous outcome or previous state. This forgetting-the-past property is called the Markovian property. These transition rules generate a chain of random variables x_0, x_1, .... Note that in actual computation, the random variables x_0, x_1, ... take specific values x_i ∈ Ω.

Homogeneous Markov Chain

Note that in our gambling game the rules do not change with time, therefore the transition matrix does not depend on time. We call this a homogeneous Markov Chain.

4.3 Properties of Transition Matrices (see Olle Haggstrom for details)

Some universal properties of transition matrices are:

    0 ≤ M(x_t | x_{t−1}) ≤ 1    (26)
    Σ_{x_t} M(x_t | x_{t−1}) = 1    (27)

Given a state space Ω (the set of all possible outcomes) and a Markov Chain x_0, x_1, ..., we say that two states s_i, s_j ∈ Ω are connected if, within the Markov Chain, there is a non-zero probability of reaching s_j from s_i. That is, if x_0 = s_i then there exists k ∈ [1, ∞) such that x_k = s_j.

Irreducibility and ergodicity

Definition 4.1: A Markov Chain x_0, x_1, ... with state space Ω and transition matrix M is said to be irreducible if for all s_i, s_j ∈ Ω we have that s_i and s_j are connected. Otherwise the Markov Chain is called reducible. (Refer to Olle Haggstrom for the original definition.) Another term for irreducibility is ergodicity; if you see these two words in the context of Markov Chains, they mean the same thing.

Aperiodicity

Consider a state s_i ∈ Ω. We can read off the matrix elements of the transition matrix: if M(s_i | s_i) > 0 then there is a chance that two consecutive elements of the Markov Chain remain at the state s_i. That is, x_k = s_i and x_{k+1} = s_i (assuming x_k = s_i happens with non-zero probability). However, for some systems M(s_i | s_i) = 0 but Mⁿ(s_i | s_i) > 0 for some n. Define a set of numbers a_0, a_1, ... such that M^{a_k}(s_i | s_i) > 0. The period of s_i is defined as gcd(a_0, a_1, ...). The state s_i is aperiodic if its period is one.

Definition 4.2: A Markov Chain is said to be aperiodic if all its states are aperiodic. Otherwise the chain is said to be periodic.

Corollary 4.1: Let x_0, x_1, ... be an irreducible and aperiodic Markov Chain with state space Ω and transition matrix M. Then there exists an N < ∞ such that (Mⁿ)_{i,j} > 0 for all i, j ∈ Ω and all n > N.

5 Homework

1. Implement the congruential random number generator with 5 different sets of numbers c, a_0, X_0 and with a fixed N_max.
   (a) Plot the 2D plot as shown in Fig. 1. Clearly label in the plot what numbers c, a_0 and X_0 are used.
   (b) Derive ways to measure which of your 5 sequences gives a more uniform random number distribution. Support your conclusion with data.
2. (a) Suppose the following sequence of 6 random numbers is drawn from a uniform distribution. How do you use these values to draw 6 random numbers from the triangle distribution p(x) = 4x if 0 ≤ x < 0.5, p(x) = −4x + 4 for 0.5 ≤ x ≤ 1, and p(x) = 0 otherwise? Calculate the sampled mean value of this triangle distribution.
   (b) Suppose another sequence of random numbers is drawn from the triangle distribution. Use these numbers to estimate the sampled mean of a stepped distribution given by s(x) = 0.1 for 0 ≤ x < 0.4, s(x) = 4.4 for 0.4 ≤ x < 0.6, s(x) = 0.2 for 0.6 ≤ x ≤ 1, and s(x) = 0 otherwise.
3. Given a random number generator that draws x ∼ U[0, 1], develop an algorithm to draw sequences of random numbers from a discrete distribution with x = 0, 1, 2, 3, 4, y = 0, 1, 2, 3, 4 and probability of sampling (x, y) as P(x, y) given in the table (28).

4. Given a coin with head (H) and tail (T), represent the probabilities of being H (head) and T (tail) in the form of a column vector (P(H), P(T))^T, with T representing the transpose. Apply the following rules to flip the coin:
   (i) Throw an unbiased dice with six sides; flip the coin if it is head and the dice shows either 1 or 4.
   (ii) Throw an unbiased dice with six sides; flip the coin if it is tail and the dice shows either 1, 3 or 5.
   (a) What is the transition matrix of this operation?
   (b) Given that the coin shows H (head) now, what is the probability distribution of the coin after one operation as described? What is the probability distribution after three operations?
   (c) What is the probability distribution of the coin after many operations (number of operations tends to infinity)?

6 References

1. Olle Haggstrom, Finite Markov Chains and Algorithmic Applications, London Mathematical Society.
2. D. P. Landau, K. Binder, A Guide to Monte Carlo Simulations in Statistical Physics, Cambridge University Press.
3. Jun S. Liu, Monte Carlo Strategies in Scientific Computing, Springer.
Lecture 20 Randomness and Monte Carlo J. Chaudhry Department of Mathematics and Statistics University of New Mexico J. Chaudhry (UNM) CS 357 1 / 40 What we ll do: Random number generators Monte-Carlo integration
More informationLecture 1: Basics of Probability
Lecture 1: Basics of Probability (Luise-Vitetta, Chapter 8) Why probability in data science? Data acquisition is noisy Sampling/quantization external factors: If you record your voice saying machine learning
More informationToday: Fundamentals of Monte Carlo
Today: Fundamentals of Monte Carlo What is Monte Carlo? Named at Los Alamos in 940 s after the casino. Any method which uses (pseudo)random numbers as an essential part of the algorithm. Stochastic - not
More informationISyE 3044 Fall 2017 Test #1a Solutions
1 NAME ISyE 344 Fall 217 Test #1a Solutions This test is 75 minutes. You re allowed one cheat sheet. Good luck! 1. Suppose X has p.d.f. f(x) = 4x 3, < x < 1. Find E[ 2 X 2 3]. Solution: By LOTUS, we have
More informationRecursive Estimation
Recursive Estimation Raffaello D Andrea Spring 08 Problem Set : Bayes Theorem and Bayesian Tracking Last updated: March, 08 Notes: Notation: Unless otherwise noted, x, y, and z denote random variables,
More informationMAS275 Probability Modelling Exercises
MAS75 Probability Modelling Exercises Note: these questions are intended to be of variable difficulty. In particular: Questions or part questions labelled (*) are intended to be a bit more challenging.
More informationTheoretical Cryptography, Lecture 10
Theoretical Cryptography, Lecture 0 Instructor: Manuel Blum Scribe: Ryan Williams Feb 20, 2006 Introduction Today we will look at: The String Equality problem, revisited What does a random permutation
More informationSTAT 414: Introduction to Probability Theory
STAT 414: Introduction to Probability Theory Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical Exercises
More informationThe PAC Learning Framework -II
The PAC Learning Framework -II Prof. Dan A. Simovici UMB 1 / 1 Outline 1 Finite Hypothesis Space - The Inconsistent Case 2 Deterministic versus stochastic scenario 3 Bayes Error and Noise 2 / 1 Outline
More informationA brief review of basics of probabilities
brief review of basics of probabilities Milos Hauskrecht milos@pitt.edu 5329 Sennott Square robability theory Studies and describes random processes and their outcomes Random processes may result in multiple
More informationCME 106: Review Probability theory
: Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:
More informationLecture #5. Dependencies along the genome
Markov Chains Lecture #5 Background Readings: Durbin et. al. Section 3., Polanski&Kimmel Section 2.8. Prepared by Shlomo Moran, based on Danny Geiger s and Nir Friedman s. Dependencies along the genome
More informationThe problems that follow illustrate the methods covered in class. They are typical of the types of problems that will be on the tests.
NUMERICAL ANALYSIS PRACTICE PROBLEMS JAMES KEESLING The problems that follow illustrate the methods covered in class. They are typical of the types of problems that will be on the tests.. Solving Equations
More informationStochastic Processes
qmc082.tex. Version of 30 September 2010. Lecture Notes on Quantum Mechanics No. 8 R. B. Griffiths References: Stochastic Processes CQT = R. B. Griffiths, Consistent Quantum Theory (Cambridge, 2002) DeGroot
More informationWorkshop on Heterogeneous Computing, 16-20, July No Monte Carlo is safe Monte Carlo - more so parallel Monte Carlo
Workshop on Heterogeneous Computing, 16-20, July 2012 No Monte Carlo is safe Monte Carlo - more so parallel Monte Carlo K. P. N. Murthy School of Physics, University of Hyderabad July 19, 2012 K P N Murthy
More informationToday: Fundamentals of Monte Carlo
Today: Fundamentals of Monte Carlo What is Monte Carlo? Named at Los Alamos in 1940 s after the casino. Any method which uses (pseudo)random numbers as an essential part of the algorithm. Stochastic -
More informationMAKING MONEY FROM FAIR GAMES: EXAMINING THE BOREL-CANTELLI LEMMA
MAKING MONEY FROM FAIR GAMES: EXAMINING THE BOREL-CANTELLI LEMMA SAM CANNON Abstract. In this paper we discuss and prove the Borel-Cantelli Lemma. We then show two interesting applications of the Borel-
More informationThis exam contains 13 pages (including this cover page) and 10 questions. A Formulae sheet is provided with the exam.
Probability and Statistics FS 2017 Session Exam 22.08.2017 Time Limit: 180 Minutes Name: Student ID: This exam contains 13 pages (including this cover page) and 10 questions. A Formulae sheet is provided
More informationRecitation 2: Probability
Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions
More informationChapter 10 Markov Chains and Transition Matrices
Finite Mathematics (Mat 119) Lecture week 3 Dr. Firozzaman Department of Mathematics and Statistics Arizona State University Chapter 10 Markov Chains and Transition Matrices A Markov Chain is a sequence
More informationCONTENTS. Preface List of Symbols and Notation
CONTENTS Preface List of Symbols and Notation xi xv 1 Introduction and Review 1 1.1 Deterministic and Stochastic Models 1 1.2 What is a Stochastic Process? 5 1.3 Monte Carlo Simulation 10 1.4 Conditional
More informationDiscrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10
EECS 70 Discrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10 Introduction to Basic Discrete Probability In the last note we considered the probabilistic experiment where we flipped
More informationCS 70 Discrete Mathematics and Probability Theory Spring 2018 Ayazifar and Rao Final
CS 70 Discrete Mathematics and Probability Theory Spring 2018 Ayazifar and Rao Final PRINT Your Name:, (Last) (First) READ AND SIGN The Honor Code: As a member of the UC Berkeley community, I act with
More informationThe Theory behind PageRank
The Theory behind PageRank Mauro Sozio Telecom ParisTech May 21, 2014 Mauro Sozio (LTCI TPT) The Theory behind PageRank May 21, 2014 1 / 19 A Crash Course on Discrete Probability Events and Probability
More informationMARKOV PROCESSES. Valerio Di Valerio
MARKOV PROCESSES Valerio Di Valerio Stochastic Process Definition: a stochastic process is a collection of random variables {X(t)} indexed by time t T Each X(t) X is a random variable that satisfy some
More informationMath Stochastic Processes & Simulation. Davar Khoshnevisan University of Utah
Math 5040 1 Stochastic Processes & Simulation Davar Khoshnevisan University of Utah Module 1 Generation of Discrete Random Variables Just about every programming language and environment has a randomnumber
More informationRandom number generators and random processes. Statistics and probability intro. Peg board example. Peg board example. Notes. Eugeniy E.
Random number generators and random processes Eugeniy E. Mikhailov The College of William & Mary Lecture 11 Eugeniy Mikhailov (W&M) Practical Computing Lecture 11 1 / 11 Statistics and probability intro
More informationMarkov chain Monte Carlo Lecture 9
Markov chain Monte Carlo Lecture 9 David Sontag New York University Slides adapted from Eric Xing and Qirong Ho (CMU) Limitations of Monte Carlo Direct (unconditional) sampling Hard to get rare events
More information