Special Mathematics. Tutorial 13. Markov chains
Tutorial 13. Markov chains

"The future starts today, not tomorrow." (Pope John Paul II)

A sequence of trials of an experiment is a finite Markov chain if:

- the outcome of each experiment is one of a finite set of states Ω = {i_1, i_2, ..., i_n};
- the outcome of an experiment depends only on the present state, and not on any past states:

    P(X_{k+1} = j_{k+1} | X_k = j_k, X_{k-1} = j_{k-1}, ..., X_0 = j_0) = P(X_{k+1} = j_{k+1} | X_k = j_k)

  for any states j_0, j_1, ..., j_{k+1} from Ω.

We will work with time-homogeneous Markov chains, i.e. chains whose transition probability matrix P does not depend on k:

    P = [ P_{11} P_{12} ... P_{1n}
          ...
          P_{n1} P_{n2} ... P_{nn} ],   where P_{ij} = P(X_{k+1} = j | X_k = i).

The probability that the system is in state i after k steps is denoted by p_i(k) = P(X_k = i), which gives the following distribution for the random variable X_k:

    X_k : ( i_1          i_2          ...  i_n
            p_{i_1}(k)   p_{i_2}(k)   ...  p_{i_n}(k) )

One has the property

    p_i(k) = Σ_{j=1}^{n} p_j(k-1) P_{ji},

or, written in vectorial form, p(k) = p(k-1) P.

Suppose a Markov chain has initial probability vector p(0) = (p_{i_1}(0), p_{i_2}(0), ..., p_{i_n}(0))
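The one-step update p(k) = p(k-1) P can be checked numerically. The sketch below is an added illustration (not part of the original tutorial); the 2-state matrix is a made-up example:

```python
def step(p, P):
    """One step of the chain: p(k) = p(k-1) P (row vector times matrix)."""
    n = len(p)
    return [sum(p[j] * P[j][i] for j in range(n)) for i in range(n)]

# Hypothetical 2-state chain (illustrative values only).
P = [[0.9, 0.1],
     [0.5, 0.5]]
p = [1.0, 0.0]          # start in state 1 with certainty
for _ in range(3):      # propagate three steps
    p = step(p, P)
# p is now p(3); its entries still sum to 1
```

After three steps p ≈ (0.844, 0.156), and each update preserves the total probability.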
and transition matrix P; then the probability vector after n repetitions (steps) of the experiment is

    p(n) = p(0) P^n.

The following identity holds for arbitrary states j_0, ..., j_k:

    P(X_0 = j_0, X_1 = j_1, X_2 = j_2, ..., X_k = j_k) = p_{j_0}(0) P_{j_0 j_1} P_{j_1 j_2} ... P_{j_{k-1} j_k}.

Absorbing Markov chains

- A state i is absorbing if P_{ii} = 1.
- A Markov chain is an absorbing Markov chain if it has at least one absorbing state and it is possible to go from any nonabsorbing state to an absorbing state.
- Let P be the transition matrix of an absorbing Markov chain. Rearrange the rows and columns so that the absorbing states come first. Then P has the block form

      P = [ I 0
            R Q ].

- The fundamental matrix is defined as F = (I - Q)^{-1}, and it can be shown that

      P^n → [ I  0
              FR 0 ]   as n → ∞.

- The matrix FR gives the probabilities that a particular initial nonabsorbing state will lead to a particular absorbing state.

Regular Markov chains

- A Markov chain is a regular Markov chain if its transition matrix is regular, i.e. some power of it has all entries positive.
- For a regular Markov chain there exists a unique probability vector v such that for every probability vector v_0: v_0 P^n → v as n → ∞.
- The vector v is called the equilibrium vector, and it gives the long-range trend of the Markov chain.
- The vector v = (v_1, v_2, ..., v_n) is found using the identities vP = v and v_1 + v_2 + ... + v_n = 1.
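The convergence v_0 P^n → v for a regular chain can be observed directly by repeated multiplication. The following sketch is an added illustration with a hypothetical 2-state matrix (entries invented for the example):

```python
# Power iteration toward the equilibrium vector of a regular chain.
# The 2-state matrix below is a made-up example, not from the tutorial.
P = [[0.9, 0.1],
     [0.5, 0.5]]
v = [0.5, 0.5]          # any initial probability vector works
for _ in range(200):    # v_0 P^n converges geometrically to v
    v = [sum(v[j] * P[j][i] for j in range(2)) for i in range(2)]
# v now satisfies vP = v and v_1 + v_2 = 1 (up to rounding)
```

For this matrix the exact equilibrium is v = (5/6, 1/6), which the iteration reaches to machine precision; solving vP = v with v_1 + v_2 = 1 by hand gives the same answer.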
Solved problems

Problem 1. At the end of June 40% of the voters were registered as liberal, 45% as conservative, and 15% as independent. Over a one-month period, the liberals retained 80% of their constituency, while 15% switched to conservative and 5% to independent. The conservatives retained 70% and lost 30% to the liberals. The independents retained 60% and lost 20% each to the liberals and conservatives. Assume that these trends continue.
a. Write a transition matrix using this information.
b. Find the percent of each type of voter at the end of August.
c. If the elections are in October 2018, which party has the best chance to win?

Solution:

- The transition matrix, using the states L (liberal), C (conservative) and I (independent), is

      P = [ 0.80 0.15 0.05
            0.30 0.70 0.00
            0.20 0.20 0.60 ]   (rows and columns in the order L, C, I).

- Identify the initial probability vector: p(0) = (0.40, 0.45, 0.15).
- After two months the probability vector is computed using the formula p(2) = p(0) P^2.
- Observe that P^2 has all entries positive, so P is regular and we have a regular Markov chain.
- Apply the main property of a regular Markov chain: there exists a unique probability vector v such that for every probability vector v_0, v_0 P^n → v as n → ∞.
- Find this equilibrium vector v = (v_1, v_2, v_3), which gives the long-range trend of the Markov chain, from the equations vP = v and v_1 + v_2 + v_3 = 1.
- The vector v gives the situation in October 2018.
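As a sanity check on parts (b) and (c), the matrix from the problem statement can be iterated in a few lines of Python. This code is an added illustration, not part of the original solution:

```python
# Transition matrix from Problem 1 (rows/columns ordered L, C, I).
P = [[0.80, 0.15, 0.05],
     [0.30, 0.70, 0.00],
     [0.20, 0.20, 0.60]]
p = [0.40, 0.45, 0.15]          # end-of-June distribution

def step(p, P):
    """One step of the chain: p(k) = p(k-1) P."""
    n = len(p)
    return [sum(p[j] * P[j][i] for j in range(n)) for i in range(n)]

p2 = step(step(p, P), P)        # end of August: p(2) = p(0) P^2

v = p                           # long-range trend: iterate until vP = v
for _ in range(500):
    v = step(v, P)
```

The iteration gives p(2) ≈ (0.5315, 0.378, 0.090) and equilibrium v = (24/41, 14/41, 3/41) ≈ (0.585, 0.341, 0.073), so the liberals lead in October.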
Problem 2. A large group of mice is kept in a cage having connected compartments A, B, and C. Mice in compartment A move to B with probability 0.3 and to C with probability 0.4. Mice in B move to A or C with probability 0.2 and 0.25, respectively. The door of compartment C cannot be opened from the inside. Find the probability that a mouse from compartment A will eventually end up in compartment C.

Solution:

- The transition matrix of the associated Markov chain is

          A    B    C
      A [ 0.30 0.30 0.40
      B   0.20 0.55 0.25
      C   0    0    1    ].

  Thus C is an absorbing state.
- Recall: a state i is absorbing if P_{ii} = 1, and a Markov chain is an absorbing Markov chain if it has at least one absorbing state and it is possible to go from any nonabsorbing state to an absorbing state.
- Rearrange the rows and columns of P so that the absorbing states come first; P then takes the block form [ I 0 ; R Q ]. The fundamental matrix is F = (I - Q)^{-1}, and P^n → [ I 0 ; FR 0 ] as n → ∞, where the matrix FR gives the probabilities that a particular initial nonabsorbing state leads to a particular absorbing state.
- Rearranging the states one gets

          C    A    B
      C [ 1    0    0
      A   0.40 0.30 0.30
      B   0.25 0.20 0.55 ],

  thus R = [ 0.40 ; 0.25 ] and Q = [ 0.30 0.30 ; 0.20 0.55 ].
- Finally, F = (I - Q)^{-1} = (1/0.255) [ 0.45 0.30 ; 0.20 0.70 ] and FR = [ 1 ; 1 ].

Thus a mouse from compartment A will eventually end up trapped in compartment C with probability 1: since C is the only absorbing state of an absorbing chain, eventual absorption there is certain.
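The fundamental-matrix computation can be verified with the explicit 2×2 inverse formula. This short sketch is an added illustration using the R and Q obtained in the solution:

```python
# F = (I - Q)^{-1} via the 2x2 inverse formula; Q and R from Problem 2.
Q = [[0.30, 0.30],
     [0.20, 0.55]]
R = [0.40, 0.25]        # one-step probabilities A -> C and B -> C

# Entries of I - Q, written as [[a, b], [c, d]].
a, b = 1 - Q[0][0], -Q[0][1]
c, d = -Q[1][0], 1 - Q[1][1]
det = a * d - b * c     # determinant of I - Q (= 0.255 here)

# Inverse of [[a, b], [c, d]] is [[d, -b], [-c, a]] / det.
F = [[ d / det, -b / det],
     [-c / det,  a / det]]

# FR: absorption probabilities into C from A and from B.
FR = [F[0][0] * R[0] + F[0][1] * R[1],
      F[1][0] * R[0] + F[1][1] * R[1]]
```

Both entries of FR evaluate to 1, confirming that absorption in C is certain whether the mouse starts in A or in B.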
Proposed problems

Problem 1. Write the transition diagram corresponding to the transition matrix:

and, conversely, write the transition matrix corresponding to the diagram:

Problem 2. At "Politehnica" University a student has a 15% chance of flunking out during a given year, a 25% chance of repeating the year, and a 60% chance of finishing the year. For a 3rd-year student the possible states are: 3rd-year student, 4th-year student, has flunked out, has graduated. Find a transition matrix. Find the probability that a 3rd-year student will graduate.

Problem 3. A market analyst is interested in whether consumers prefer Dell or Gateway computers. Two market surveys taken one year apart reveal the following: 10% of Dell owners had switched to Gateway and the rest continued with Dell; 35% of Gateway owners had switched to Dell and the rest continued with Gateway. Find the distribution of the market after a long period of time.

Problem 4. A security guard can stand in front of any one of the three doors of a building, and every minute he decides whether to move to another door chosen at random. If he is at the middle door, he is equally likely to stay where he is, move to the door on the left, or move to the door on the right. If he is at the door on either end, he is equally likely to stay where he is or to move to the middle door. Write the transition probability matrix and prove that it corresponds to a regular Markov chain. Find the long-range trend for the fraction of time the guard spends in front of each door.
Problem 5. Let Ω = {C, R, S, G} denote the space of weather conditions, where C = cloudy, R = rainy, S = snowy and G = good. Suppose we have a transition probability matrix P as follows:

If on Monday the weather is good, what is the weather forecast for Wednesday (i.e. the chances that it will be cloudy, rainy, snowy or good)? Find the chance that it will be rainy on Tuesday, cloudy on Wednesday, and rainy again on Thursday.

Problem 6. We simplify the previous problem by assuming only three possible weather conditions, C, R and G, with the probability matrix:

If it is rainy on Monday, what is the weather forecast for Christmas Day?

Problem 7. A computer system can operate in two different modes. Every hour, it remains in the same mode or switches to a different mode according to the transition probability matrix:

If the system is in Mode I at 5:30 pm, what is the probability that it will be in Mode I at 7:30 pm on the same day?

Draw the state transition diagrams for the corresponding Markov chains of these two problems.
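In Problem 7 the required quantity is the (Mode I, Mode I) entry of P², since the two hours from 5:30 pm to 7:30 pm correspond to two steps. The tutorial's actual matrix is not reproduced above, so the sketch below uses placeholder values purely to show the computation:

```python
# Two-step transition probability for a two-mode system.
# The entries of P are hypothetical placeholders, not the tutorial's values.
P = [[0.4, 0.6],
     [0.8, 0.2]]        # rows/columns: Mode I, Mode II

def mat_mul(A, B):
    """Multiply two square matrices of the same size."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = mat_mul(P, P)
# P2[0][0] is the probability of being in Mode I two hours
# after starting in Mode I.
```

With these placeholder values P2[0][0] = 0.4·0.4 + 0.6·0.8 = 0.64; substituting the matrix from the problem gives the requested answer.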
More informationMATH 118 FINAL EXAM STUDY GUIDE
MATH 118 FINAL EXAM STUDY GUIDE Recommendations: 1. Take the Final Practice Exam and take note of questions 2. Use this study guide as you take the tests and cross off what you know well 3. Take the Practice
More information8. Statistical Equilibrium and Classification of States: Discrete Time Markov Chains
8. Statistical Equilibrium and Classification of States: Discrete Time Markov Chains 8.1 Review 8.2 Statistical Equilibrium 8.3 Two-State Markov Chain 8.4 Existence of P ( ) 8.5 Classification of States
More informationBishop Kelley High School Summer Math Program Course: Algebra II B
016 017 Summer Math Program Course: NAME: DIRECTIONS: Show all work in the packet. You may not use a calculator. No matter when you have math, this packet is due on the first day of class This material
More informationOperating Instructions 5 Day Weather Station with Color Screen Model: DG-TH8805 INDOOR UNIT
Operating Instructions 5 Day Weather Station with Color Screen Model: DG-TH8805 INDOOR UNIT OUTDOOR SENSOR FEATURES Buttons: MODE,,,MEM, CH, HISTORY, 5 day weather forecast in the following combinations:
More informationProbability- describes the pattern of chance outcomes
Chapter 6 Probability the study of randomness Probability- describes the pattern of chance outcomes Chance behavior is unpredictable in the short run, but has a regular and predictable pattern in the long
More informationIntroduction to Artificial Intelligence. Prof. Inkyu Moon Dept. of Robotics Engineering, DGIST
Introduction to Artificial Intelligence Prof. Inkyu Moon Dept. of Robotics Engineering, DGIST Chapter 3 Uncertainty management in rule-based expert systems To help interpret Bayesian reasoning in expert
More informationAnnouncements Monday, September 17
Announcements Monday, September 17 WeBWorK 3.3, 3.4 are due on Wednesday at 11:59pm. The first midterm is on this Friday, September 21. Midterms happen during recitation. The exam covers through 3.4. About
More informationMarkov Chains (Part 3)
Markov Chains (Part 3) State Classification Markov Chains - State Classification Accessibility State j is accessible from state i if p ij (n) > for some n>=, meaning that starting at state i, there is
More information