HIDDEN MARKOV CHANGE POINT ESTIMATION

Communications on Stochastic Analysis, Vol. 9, No. 3, Serials Publications

ROBERT J. ELLIOTT* AND SEBASTIAN ELLIOTT

Abstract. A hidden Markov model is considered where the dynamics of the hidden process change at a random change point. In principle this gives rise to a non-linear filter, but closed form recursive estimates are obtained for the conditional distribution of the hidden process and of the change point.

1. Introduction

Hidden Markov models have found many applications, from speech processing to regime switching dynamics in financial models. An early description is given in the paper [4] by Rabiner and Juang, and a fuller treatment can be found in the book Hidden Markov Models: Estimation and Control by Elliott, Aggoun and Moore. Recent results can be found in [3]. In this paper we consider the situation where a discrete time Markov chain X is observed indirectly through a second Markov chain Y. However, the dynamics of X undergo a change at a random time. Given the observed process Y, filtered recursive estimates for the conditional distribution of X and the change point are derived. This paper uses the expectation maximization (EM) algorithm as discussed in [1] and [2].

2. Chain Dynamics

Suppose X = {X_k, k = 0, 1, ...} is a discrete time finite state Markov chain defined on a probability space (Ω, F, P). Without loss of generality, the state space can be taken to be the set of unit vectors {e_1, e_2, ..., e_N} in R^N, where N is the number of elements of the state space and e_i = (0, 0, ..., 0, 1, 0, ..., 0)' is a standard basis vector in R^N.

Suppose τ ∈ {1, 2, ...} is a random time with P(τ = k) = p_k ≥ 0. Write

F_k = P(τ > k) = Σ_{l=k+1}^∞ p_l.

The random time τ represents a random time at which there is a change in the dynamics of the chain X.

Received; Communicated by the editors.
Mathematics Subject Classification: 60J10; 60J22.
Key words and phrases: Hidden Markov chain, change points, estimation.
* Professor Robert J. Elliott wishes to thank the ARO for support under project W911NF

In fact, if k ≤ τ, suppose

a^1_ji = P(X_{k+1} = e_j | X_k = e_i) = P(X_1 = e_j | X_0 = e_i),

and write A_1 for the matrix (a^1_ji), 1 ≤ i, j ≤ N. If k > τ, suppose a^2_ji = P(X_{k+1} = e_j | X_k = e_i), and write A_2 for the matrix (a^2_ji), 1 ≤ i, j ≤ N.

Write Z_k = I_{k ≥ τ}. The state space of Z will be mapped onto the unit vectors ē_1 = (1, 0)', ē_2 = (0, 1)' by considering the process

Z̄_k = (1 − Z_k) ē_1 + Z_k ē_2 ∈ R^2.

Define the filtrations F^X_k = σ{X_i, i ≤ k}, F^Z_k = σ{Z_i, i ≤ k}, and write F^X_k ∨ F^Z_k for their join.

Lemma 2.1. Suppose A_k = A_1 ⟨Z̄_k, ē_1⟩ + A_2 ⟨Z̄_k, ē_2⟩. Then X_{k+1} = A_k X_k + M_{k+1} ∈ R^N, where M is a sequence of martingale increments, that is, E[M_{k+1} | F^X_k ∨ F^Z_k] = 0 ∈ R^N.

Proof.

E[M_{k+1} | F^X_k ∨ F^Z_k] = E[X_{k+1} − A_k X_k | F^X_k ∨ F^Z_k]
= E[X_{k+1} | X_k, Z_k] − (A_1 ⟨Z̄_k, ē_1⟩ + A_2 ⟨Z̄_k, ē_2⟩) X_k
= E[X_{k+1} (⟨Z̄_k, ē_1⟩ + ⟨Z̄_k, ē_2⟩) | X_k, Z_k] − (A_1 ⟨Z̄_k, ē_1⟩ + A_2 ⟨Z̄_k, ē_2⟩) X_k
= (A_1 ⟨Z̄_k, ē_1⟩ + A_2 ⟨Z̄_k, ē_2⟩) X_k − (A_1 ⟨Z̄_k, ē_1⟩ + A_2 ⟨Z̄_k, ē_2⟩) X_k = 0.

TRANSITIONS OF Z̄: Note P(Z̄_{k+1} = ē_1 | F^Z_k) = P(Z̄_{k+1} = ē_1 | Z̄_k). Now

P(Z̄_{k+1} = ē_1 | Z̄_k = ē_1) = P(τ > k + 1 | τ > k) = F_{k+1}/F_k,
P(Z̄_{k+1} = ē_2 | Z̄_k = ē_1) = P(τ = k + 1 | τ > k) = p_{k+1}/F_k,
P(Z̄_{k+1} = ē_1 | Z̄_k = ē_2) = P(τ > k + 1 | τ ≤ k) = 0,
P(Z̄_{k+1} = ē_2 | Z̄_k = ē_2) = P(τ ≤ k + 1 | τ ≤ k) = 1.
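For example, if τ has a geometric distribution, p_k = P(τ = k) = ρ(1 − ρ)^{k−1} for k ≥ 1 with 0 < ρ < 1, then F_k = P(τ > k) = (1 − ρ)^k, so F_{k+1}/F_k = 1 − ρ and p_{k+1}/F_k = ρ for every k; that is, given the change has not yet occurred, it occurs at the next step with constant probability ρ.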

Lemma 2.2. Write

B_{k+1} = [ F_{k+1}/F_k   0 ]
          [ p_{k+1}/F_k   1 ].

Then Z̄_{k+1} = B_{k+1} Z̄_k + N_{k+1} ∈ R^2, where E[N_{k+1} | F^Z_k] = 0 ∈ R^2.

3. Observations

The chain X is not observed directly. Rather, there is another finite state process Y which is a noisy observation of X. Suppose the finite state space of Y is identified with the unit vectors {f_1, f_2, ..., f_M} of R^M, where f_j = (0, 0, ..., 1, ..., 0)' ∈ R^M. We can have M > N, M < N or M = N. Suppose

c_ji = P(Y_k = f_j | X_k = e_i) ≥ 0.

Note Σ_{j=1}^M c_ji = 1. Write C = (c_ji), 1 ≤ i ≤ N, 1 ≤ j ≤ M, and F^Y_k = σ{Y_j, 0 ≤ j ≤ k}.

Lemma 3.1. Y_k = C X_k + L_k ∈ R^M, where E[L_k | F^X_k] = 0 ∈ R^M.

Proof.

E[L_k | F^X_k] = E[Y_k − C X_k | F^X_k] = E[Y_k − C X_k | X_k]
= E[Y_k | X_k] − C X_k
= Σ_{i,j} E[⟨Y_k, f_j⟩ ⟨X_k, e_i⟩ | X_k] f_j − C X_k
= Σ_{i,j} ⟨X_k, e_i⟩ c_ji f_j − C X_k
= C X_k − C X_k = 0 ∈ R^M.
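To make the model of Sections 2 and 3 concrete, the following Python sketch simulates a chain whose transition matrix switches from A_1 to A_2 at a random change point τ, together with the noisy observations Y. All numerical values (A1, A2, C, the geometric change-point rate rho and the horizon T) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions): N = 2 hidden states, M = 3 observation symbols.
A1 = np.array([[0.9, 0.2],
               [0.1, 0.8]])      # a^1_ji = P(X_{k+1} = e_j | X_k = e_i), columns sum to 1
A2 = np.array([[0.3, 0.6],
               [0.7, 0.4]])      # post-change transition probabilities
C = np.array([[0.7, 0.1],
              [0.2, 0.3],
              [0.1, 0.6]])       # c_ji = P(Y_k = f_j | X_k = e_i), columns sum to 1
rho, T = 0.05, 200               # geometric change-point prior p_k = rho * (1 - rho) ** (k - 1)

tau = rng.geometric(rho)         # the random change point
x = 0                            # X_0 = e_1 (state index 0)
states, obs = [], []
for k in range(1, T + 1):
    A = A1 if k <= tau else A2           # pre-change dynamics while k <= tau (Section 2 convention)
    x = int(rng.choice(2, p=A[:, x]))    # draw X_k given X_{k-1}
    y = int(rng.choice(3, p=C[:, x]))    # draw the observation Y_k = f_{y+1} given X_k
    states.append(x)
    obs.append(y)

print("change point tau =", tau)
```

The observation indices collected in obs are all that the filter derived in Section 5 below gets to see.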

JOINT DISTRIBUTIONS:

Lemma 3.2.

E[X_{k+1} Z̄'_{k+1} | X_k, Z̄_k] = A_1 X_k (F_{k+1}/F_k, p_{k+1}/F_k) ⟨Z̄_k, ē_1⟩ + A_2 X_k (0, 1) ⟨Z̄_k, ē_2⟩.  (3.1)

Proof.

E[X_{k+1} Z̄'_{k+1} | X_k, Z̄_k] = E[ E[X_{k+1} Z̄'_{k+1} | X_{k+1}, X_k, Z̄_k] | X_k, Z̄_k ]
= E[X_{k+1} E[Z̄'_{k+1} | Z̄_k] | X_k, Z̄_k]
= E[X_{k+1} E[Z̄_{k+1} (⟨Z̄_k, ē_1⟩ + ⟨Z̄_k, ē_2⟩) | Z̄_k]' | X_k, Z̄_k]
= E[X_{k+1} (F_{k+1}/F_k, p_{k+1}/F_k) ⟨Z̄_k, ē_1⟩ | X_k, Z̄_k] + E[X_{k+1} (0, 1) ⟨Z̄_k, ē_2⟩ | X_k, Z̄_k]
= A_1 X_k (F_{k+1}/F_k, p_{k+1}/F_k) ⟨Z̄_k, ē_1⟩ + A_2 X_k (0, 1) ⟨Z̄_k, ē_2⟩.

4. Reference Probability

The above dynamics of X, Z̄, Y are under the real world probability P. Suppose we have another probability P̄ under which X and Z̄ have the same dynamics as above, but under which Y is a process independent of X and Z̄ and is a sequence of independent, identically distributed random variables with

P̄(Y_k = f_j | F^X_k ∨ F^Z_k ∨ F^Y_{k−1}) = P̄(Y_k = f_j) = 1/M.

Lemma 4.1. Let

λ_k = M Σ_{j=1}^M ⟨C X_k, f_j⟩ ⟨Y_k, f_j⟩,  (4.1)

Λ_k = Π_{i=0}^k λ_i.  (4.2)

The probability P can be defined by setting dP/dP̄ | G_k = Λ_k, where G_k = F^X_k ∨ F^Y_k ∨ F^Z_k. Then under P the dynamics of X and Z̄ are unchanged and

P(Y_k = f_j | X_k = e_i) = c_ji.

That is, under P,

Y_k = C X_k + L_k,

where E[L_k | F^X_k] = 0 ∈ R^M.

Proof. Consider first

Ē[λ_k | G_{k−1} ∨ σ{X_k}] = M Σ_{j=1}^M ⟨C X_k, f_j⟩ Ē[⟨Y_k, f_j⟩]
= M Σ_{j=1}^M ⟨C X_k, f_j⟩ (1/M)
= Σ_{i,j} c_ji ⟨X_k, e_i⟩ = 1.

From Bayes' theorem (see [1]), with dP/dP̄ | G_k = Λ_k,

E[⟨Y_k, f_j⟩ | G_{k−1} ∨ σ{X_k}] = Ē[Λ_k ⟨Y_k, f_j⟩ | G_{k−1} ∨ σ{X_k}] / Ē[Λ_k | G_{k−1} ∨ σ{X_k}]
= Λ_{k−1} Ē[λ_k ⟨Y_k, f_j⟩ | G_{k−1} ∨ σ{X_k}] / (Λ_{k−1} Ē[λ_k | G_{k−1} ∨ σ{X_k}])
= M Σ_{r=1}^M Ē[⟨C X_k, f_r⟩ ⟨Y_k, f_r⟩ ⟨Y_k, f_j⟩ | G_{k−1} ∨ σ{X_k}]
= M Ē[⟨C X_k, f_j⟩ ⟨Y_k, f_j⟩ | G_{k−1} ∨ σ{X_k}]
= ⟨C X_k, f_j⟩.

Therefore, if X_k = e_i,

P(Y_k = f_j | X_k = e_i) = E[⟨Y_k, f_j⟩ | X_k = e_i] = c_ji.
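The first step of the proof, Ē[λ_k | G_{k−1} ∨ σ{X_k}] = 1, can be checked numerically for any column-stochastic observation matrix. A minimal sketch, using an illustrative 3 × 2 matrix C (an assumption, not taken from the paper):

```python
import numpy as np

C = np.array([[0.7, 0.1],
              [0.2, 0.3],
              [0.1, 0.6]])        # illustrative c_ji = P(Y_k = f_j | X_k = e_i), columns sum to 1
M, N = C.shape

# Under P-bar, Y_k is uniform over {f_1, ..., f_M} and lambda_k = M * sum_j <C X_k, f_j><Y_k, f_j>.
# Conditioning on X_k = e_i, its expectation is M * sum_j c_ji * (1/M) = sum_j c_ji = 1.
for i in range(N):
    e_bar = sum(M * C[j, i] * (1.0 / M) for j in range(M))
    print(f"E-bar[lambda_k | X_k = e_{i+1}] = {e_bar:.6f}")   # prints 1.000000 for every i
```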

5. Filters

We wish to estimate both X and Z̄ given the noisy observations Y. This is required under the real world probability P; that is, we wish to determine E[X_k Z̄'_k | F^Y_k] ∈ R^{N×2}. However, it is easier to work under the reference probability P̄, for which the dynamics of X and Z̄ remain unchanged but the Y sequence is i.i.d. with P̄(Y_k = f_j) = 1/M for all k and j. Using Bayes' rule,

E[X_k Z̄'_k | F^Y_k] = Ē[Λ_k X_k Z̄'_k | F^Y_k] / Ē[Λ_k | F^Y_k].

Write

q_k := Ē[Λ_k X_k Z̄'_k | F^Y_k] ∈ R^{N×2},
q^1_k := Ē[Λ_k X_k ⟨Z̄_k, ē_1⟩ | F^Y_k],
q^2_k := Ē[Λ_k X_k ⟨Z̄_k, ē_2⟩ | F^Y_k]

for unnormalized conditional expectations given the observations F^Y_k up to time k. Then q_k = (q^1_k, q^2_k).

Note that, with 1 denoting the vector of 1's in R^N or R^2,

1' X_k Z̄'_k 1 = 1.

Therefore,

1' q_k 1 = Ē[1' Λ_k X_k Z̄'_k 1 | F^Y_k] = Ē[Λ_k | F^Y_k].

Consequently, if we know q_k, the denominator Ē[Λ_k | F^Y_k] is just the sum of all components of q_k. The filter is now a recursive estimate for q_k given the observations Y.

Theorem 5.1.

q^1_{k+1} = M Σ_{i=1}^N Σ_{j=1}^M ⟨A_1 q^1_k, e_i⟩ (F_{k+1}/F_k) c_ji e_i ⟨Y_{k+1}, f_j⟩,  (5.1)

q^2_{k+1} = M Σ_{i=1}^N Σ_{j=1}^M [ ⟨A_1 q^1_k, e_i⟩ (p_{k+1}/F_k) + ⟨A_2 q^2_k, e_i⟩ ] c_ji e_i ⟨Y_{k+1}, f_j⟩.  (5.2)

Proof.

q^1_{k+1} = Ē[Λ_k λ_{k+1} X_{k+1} ⟨Z̄_{k+1}, ē_1⟩ | F^Y_{k+1}]
= M Σ_j Ē[Λ_k ⟨C X_{k+1}, f_j⟩ X_{k+1} ⟨Z̄_{k+1}, ē_1⟩ | F^Y_{k+1}] ⟨Y_{k+1}, f_j⟩
= M Σ_{i,j} Ē[Λ_k ⟨X_{k+1}, e_i⟩ ⟨Z̄_{k+1}, ē_1⟩ | F^Y_{k+1}] c_ji e_i ⟨Y_{k+1}, f_j⟩
= M Σ_{i,j} Σ_{r=1}^2 Ē[Λ_k ⟨X_{k+1}, e_i⟩ ⟨Z̄_{k+1}, ē_1⟩ ⟨Z̄_k, ē_r⟩ | F^Y_{k+1}] c_ji e_i ⟨Y_{k+1}, f_j⟩
= M Σ_{i,j} ⟨A_1 q^1_k, e_i⟩ (F_{k+1}/F_k) c_ji e_i ⟨Y_{k+1}, f_j⟩,

since, by (3.1), the term with r = 2 vanishes (⟨Z̄_{k+1}, ē_1⟩ ⟨Z̄_k, ē_2⟩ = 0).

For q^2:

q^2_{k+1} = Ē[Λ_k λ_{k+1} X_{k+1} ⟨Z̄_{k+1}, ē_2⟩ | F^Y_{k+1}]
= M Σ_j Ē[Λ_k ⟨C X_{k+1}, f_j⟩ X_{k+1} ⟨Z̄_{k+1}, ē_2⟩ | F^Y_{k+1}] ⟨Y_{k+1}, f_j⟩
= M Σ_{i,j} Ē[Λ_k ⟨X_{k+1}, e_i⟩ ⟨Z̄_{k+1}, ē_2⟩ | F^Y_{k+1}] c_ji e_i ⟨Y_{k+1}, f_j⟩
= M Σ_{i,j} Σ_{r=1}^2 Ē[Λ_k ⟨X_{k+1}, e_i⟩ ⟨Z̄_{k+1}, ē_2⟩ ⟨Z̄_k, ē_r⟩ | F^Y_{k+1}] c_ji e_i ⟨Y_{k+1}, f_j⟩
= M Σ_{i,j} [ ⟨A_1 q^1_k, e_i⟩ (p_{k+1}/F_k) + ⟨A_2 q^2_k, e_i⟩ ] c_ji e_i ⟨Y_{k+1}, f_j⟩.

Remark 5.2. The filter is initialized by assuming the change point τ has not occurred, so Z̄_0 = ē_1, and by taking an initial distribution p_0 = q_0 for X_0.

Corollary 5.3. The normalized conditional distributions of X_k and Z̄_k are then given by

p^1_k = E[X_k ⟨Z̄_k, ē_1⟩ | F^Y_k] = q^1_k / (1' q_k 1),  (5.3)

p^2_k = E[X_k ⟨Z̄_k, ē_2⟩ | F^Y_k] = q^2_k / (1' q_k 1).  (5.4)

Given the observations, the conditional probability of the change point τ is then

P(τ > k | F^Y_k) = E[⟨Z̄_k, ē_1⟩ | F^Y_k] = 1' q^1_k / (1' q_k 1).  (5.5)

Remark 5.4. Recall M is the number of elements in the state space of the observation process Y. The right hand sides of the recursions (5.1) and (5.2) of Theorem 5.1 involve a factor M. However, this factor cancels when we consider the normalized forms (5.3) and (5.4). Consequently, equations (5.1) and (5.2) can be modified to give recursions for unnormalized quantities q^1_k, q^2_k as:

q^1_{k+1} = Σ_{i,j} ⟨A_1 q^1_k, e_i⟩ (F_{k+1}/F_k) c_ji e_i ⟨Y_{k+1}, f_j⟩,  (5.6)

q^2_{k+1} = Σ_{i,j} [ ⟨A_1 q^1_k, e_i⟩ (p_{k+1}/F_k) + ⟨A_2 q^2_k, e_i⟩ ] c_ji e_i ⟨Y_{k+1}, f_j⟩.  (5.7)

Again the initialization is Z̄_0 = ē_1, q^1_0 = p_0 ∈ R^N and q^2_0 = 0.
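A minimal sketch of the unnormalized recursions (5.6) and (5.7), with the normalizations (5.3)-(5.5) applied at each step. It assumes a geometric change-point prior, so that F_{k+1}/F_k = 1 − ρ and p_{k+1}/F_k = ρ; the function name and parameters are illustrative, and the per-step rescaling is only for numerical stability (because the recursion is linear, it does not affect the normalized quantities).

```python
import numpy as np

def change_point_filter(obs, A1, A2, C, p0, rho):
    """Recursions (5.6)-(5.7) for a geometric change-point prior (illustrative sketch).

    obs : sequence of observation indices y, with Y_k = f_{y+1}
    A1, A2 : pre- and post-change transition matrices (columns sum to 1)
    C : observation matrix, C[j, i] = c_ji
    p0 : initial distribution of X_0
    rho : geometric rate, so F_{k+1}/F_k = 1 - rho and p_{k+1}/F_k = rho
    """
    q1 = np.asarray(p0, dtype=float).copy()   # q^1_0 = p_0 (change assumed not yet to have occurred)
    q2 = np.zeros_like(q1)                    # q^2_0 = 0
    history = []
    for y in obs:
        c_y = C[y]                            # the vector (c_{y,1}, ..., c_{y,N})
        m1, m2 = A1 @ q1, A2 @ q2             # A_1 q^1_k and A_2 q^2_k
        q1 = (1.0 - rho) * c_y * m1           # recursion (5.6)
        q2 = c_y * (rho * m1 + m2)            # recursion (5.7)
        s = q1.sum() + q2.sum()               # 1' q 1, the sum of all components of q
        q1, q2 = q1 / s, q2 / s               # rescale; q1, q2 are now p^1_k, p^2_k of (5.3)-(5.4)
        history.append({"P(tau > k | Y)": q1.sum(),   # equation (5.5)
                        "P(X_k | Y)": q1 + q2})       # conditional distribution of the hidden state
    return history
```

Run on the output of the simulation sketch above, e.g. change_point_filter(obs, A1, A2, C, np.array([1.0, 0.0]), rho), the reported probability P(τ > k | F^Y_k) typically decays towards 0 once the true change point has passed, while q1 + q2 tracks the conditional distribution of the hidden state.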

6. A Viterbi Recursion

As noted in our earlier papers, the Viterbi filter is given by replacing the expected value by a maximum likelihood. That is, the Viterbi estimation is given by a sequence of vectors

q^1_k = [q^1_k(1), q^1_k(2), ..., q^1_k(N)],  q^2_k = [q^2_k(1), q^2_k(2), ..., q^2_k(N)],

defined recursively by Z̄_0 = ē_1, q^1_0 = p_0, q^2_0 = 0 ∈ R^N, and

q^1_{k+1}(i) := ⟨A_1 q^1_k, e_i⟩ (F_{k+1}/F_k) max_j (c_ji ⟨Y_{k+1}, f_j⟩),  (6.1)

q^2_{k+1}(i) := [ ⟨A_1 q^1_k, e_i⟩ (p_{k+1}/F_k) + ⟨A_2 q^2_k, e_i⟩ ] max_j (c_ji ⟨Y_{k+1}, f_j⟩).  (6.2)

7. Conclusion

Hidden Markov chains, that is, Markov chains observed indirectly through the observations of a second Markov chain, have been extensively studied. For a bibliography see the book [1] by Aggoun, Elliott and Moore. In this paper we have considered a hidden Markov chain X whose dynamics undergo a change at a random time τ. Given an observed process Y, filtered estimates for the conditional distribution of X and of the change point time τ are derived.

References

1. Aggoun, L., Elliott, R. J. and Moore, J. B.: Hidden Markov Models: Estimation and Control, Springer-Verlag, Berlin - Heidelberg - New York.
2. Dembo, A. and Zeitouni, O.: Parameter estimation of partially observed continuous time stochastic processes via the EM algorithm, Stoch. Proc. Appl. 23 (1986), no. 1.
3. Elliott, R. J. and Malcolm, W. P.: Data-recursive smoother formulae for partially observed discrete-time Markov chains, Stoch. Anal. Appl. 24 (2006), no. 3.
4. Rabiner, L. R. and Juang, B. H.: An introduction to Hidden Markov Models, IEEE ASSP Magazine (January 1986).

Robert J. Elliott: Haskayne School of Business, University of Calgary, Calgary, AB T2N 1N4, Canada
E-mail address: relliott@ucalgary.edu

Sebastian Elliott: Elliott Stochastics LLC, 31 Varsity Estates Park NW, Calgary, AB, Canada
E-mail address: ellio02@gmail.com
