Bayesian analysis of a parametric semi-Markov process applied to seismic data


1 Bayesian analysis of a parametric semi-Markov process applied to seismic data
Ilenia Epifani, Politecnico di Milano
Joint work with Lucia Ladelli, Politecnico di Milano, and Antonio Pievatolo (IMATI-CNR)
July 8, 2013

2 Seismic Region | Classification of earthquakes | Exploratory Data Analysis
[Map of the seismogenic macroregion MR3, roughly 6°E to 18°E and 36°N to 46°N]
Earthquakes from 1838 to 2002 with magnitude Mw ≥ 4.5 that occurred in a tectonically homogeneous macroregion in the central Northern Apennines in Italy.

3-4 Three types of earthquakes according to their severity:
Low, Class 1: 4.5 ≤ magnitude < 4.9
Medium, Class 2: 4.9 ≤ magnitude < 5.3
High, Class 3: magnitude ≥ 5.3
Data are collected in the CPTI04 (2004) catalogue; the subdivision into seismogenic macroregions is provided by the DISS Working Group. December 2002 is the closing date of the CPTI04 (2004) catalogue (Rotondi, 2010 and Rotondi and Varini, 2012).
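As a minimal illustration (not part of the talk), the class boundaries can be written as a small helper:

```python
def magnitude_class(mw):
    """Map a moment magnitude Mw >= 4.5 to the severity class used in the talk."""
    if mw < 4.5:
        raise ValueError("events below Mw 4.5 are not in the catalogue")
    if mw < 4.9:
        return 1  # Low
    if mw < 5.3:
        return 2  # Medium
    return 3      # High
```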

5 Some statistics
Table: number of observed transitions between magnitude classes in the whole dataset (3 x 3 counts).
Table: mean inter-occurrence times and (sd) in days, rounded, for each pair of classes.
[Table values only partially recoverable from the extraction: (271) 308 (278) 347 (791) (406) 373 (434) 433 (667) (301) 210 (287) 385 (385)]

6 Weibull QQ-plots of the earthquake inter-occurrence times, one panel per transition (i, j), i, j = 1, 2, 3 (sample quantiles against theoretical quantiles).
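A sketch of how one such panel can be produced, using a maximum-likelihood Weibull fit; the helper name and the simulated data are hypothetical, not the talk's code:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import weibull_min

def weibull_qqplot(x, ax, title=""):
    """Weibull QQ-plot: sample quantiles of the inter-occurrence times x against
    the quantiles of a Weibull distribution fitted by maximum likelihood."""
    x = np.sort(np.asarray(x, dtype=float))
    c, loc, scale = weibull_min.fit(x, floc=0)          # shape and scale, location fixed at 0
    pp = (np.arange(1, x.size + 1) - 0.5) / x.size      # plotting positions
    ax.plot(weibull_min.ppf(pp, c, scale=scale), x, "o")
    lim = [0.0, x.max()]
    ax.plot(lim, lim, "--")                             # 45-degree reference line
    ax.set(title=title, xlabel="theor. quant.", ylabel="sample quant.")

fig, ax = plt.subplots()
weibull_qqplot(300 * np.random.default_rng(1).weibull(0.9, size=200), ax, title="1 to 1")
plt.show()
```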

7 Trajectories of the process | Semi-Markov process
From a modelling point of view, we observe over a period of time (0, T] a process in which earthquakes with magnitudes of class j_1, ..., j_n occur, with random inter-occurrence times x_1, ..., x_n. The process starts at the time of an earthquake with magnitude of known class j_0. When the ending time T does not coincide with an earthquake, the time elapsed from the last earthquake to T is right-censored.

8-9 We assume that
1. the states J_1, ..., J_n, ... form a discrete Markov chain MC(j_0, {1, 2, 3}, P);
2. conditionally on j_0, j_1, ..., j_n, ..., the sojourn times X_1, ..., X_n, ... are independent, with X_n | J_{n-1}, J_n ∼ F_{J_{n-1} J_n};
3. F_ij = Weibull(α_ij, θ_ij), with shape α_ij and scale θ_ij.
Then (J_n, X_n)_n is a semi-Markov process with Weibull inter-occurrence times.
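A minimal simulation sketch of such a trajectory, assuming the shape/scale parameterization above and purely illustrative parameter values (not the estimates from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative values only; classes are coded 0, 1, 2 for Low, Medium, High.
P = np.array([[0.5, 0.3, 0.2],
              [0.4, 0.4, 0.2],
              [0.5, 0.3, 0.2]])      # transition matrix of the magnitude classes
alpha = np.full((3, 3), 0.9)        # Weibull shape parameters alpha_ij
theta = np.full((3, 3), 300.0)      # Weibull scale parameters theta_ij (days)

def simulate_semi_markov(j0, T):
    """Simulate the states J_n and sojourn times X_n of the process on (0, T]."""
    states, times = [j0], []
    clock = 0.0
    while True:
        i = states[-1]
        j = rng.choice(3, p=P[i])                     # next class drawn from row i of P
        x = theta[i, j] * rng.weibull(alpha[i, j])    # X_n | J_{n-1}=i, J_n=j ~ Weibull(alpha_ij, theta_ij)
        if clock + x > T:                             # the last interval is right-censored at T
            return states, times, T - clock
        states.append(j)
        times.append(x)
        clock += x

states, times, censored = simulate_semi_markov(j0=1, T=10_000.0)
print(len(times), "events;", round(censored, 1), "days right-censored at the end")
```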

10-12 Prior Distributions | Elicitation of the hyperparameters | Likelihood of semi-Markov data | JAGS implementation
1. The transition matrix P and the Weibull parameters (α_ij, θ_ij), i, j = 1, 2, 3, are independent.
2. The rows of P are independent Dirichlet vectors.
3. The 9 pairs of Weibull parameters (α_11, θ_11), ..., (α_33, θ_33) are independent, with
   θ_ij | α_ij ∼ GIG(m_ij, b_ij(α_ij), …) and α_ij ∼ α_0-shifted Gamma density
   (Berger and Sun, 1993 and Bousquet, 2010).

13-14 The prior
θ_ij | α_ij ∼ GIG(m_ij, b_ij(α_ij), …), α_ij ∼ α_0-shifted Gamma density
can be seen as the output of the following mechanism.
First, implement a Bayesian inference on some historical data x_{-m}, ..., x_{-1} of size m, with a diffuse prior π(α, θ) ∝ θ^{-c} 1(θ > 0) 1(α > α_0), c ≥ 0, α_0 > 0.
Then, use the resulting posterior distributions as prior, conveniently modified.

15-22 Elicitation of the hyperparameters
x_{-m}, ..., x_{-1} are the m historical sojourn times for the string (i, j)
x̂_q = their qth sample percentile
b(α) = x̂_q^α [(1 - q)^{-1/m} - 1]^{-1}
l = m ln(x̂_q) - Σ_i ln x_{-i}
For m = 2, 3, ...:  θ | α ∼ GIG(m, b(α), …) and α ∼ α_0-shifted Gamma(m, l), with α_0 = 2/m
For m = 1:  θ | α ∼ GIG(1, x_{-1}^α, …) and α ∝ 1(α_0 < α < α_1), with α_0 = 2/3 and α_1 = 10
For m = 0:  θ | α, X ∼ GIG(1, X^α, …), X ∼ Unif(…), α ∝ 1(α_0 < α < α_1)
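A sketch of the data-driven pieces of this elicitation, using the reconstructed formulas for x̂_q, b(α) and l; the percentile level q and the historical times are illustrative, and the GIG / shifted-Gamma parameterization itself is not reproduced here:

```python
import numpy as np

def elicitation_stats(hist_times, q=0.9):
    """Quantities computed from the m historical sojourn times of one string (i, j)."""
    x = np.asarray(hist_times, dtype=float)
    m = x.size
    x_q = np.quantile(x, q)                        # qth sample percentile
    l = m * np.log(x_q) - np.log(x).sum()          # l = m ln(x_q) - sum_i ln(x_{-i})

    def b(alpha):
        # b(alpha) = x_q^alpha / ((1 - q)^(-1/m) - 1)
        return x_q ** alpha / ((1.0 - q) ** (-1.0 / m) - 1.0)

    return m, x_q, l, b

m, x_q, l, b = elicitation_stats([120.0, 310.0, 45.0, 800.0, 230.0])
print(m, round(x_q, 1), round(l, 3), round(b(1.0), 1))
```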

23-25 Likelihood of semi-Markov data
The likelihood of semi-Markov data depends only on:
1. the number of visits to each string (i, j): (N_11, N_12, N_13), (N_21, N_22, N_23), (N_31, N_32, N_33);
2. the times x_ij^(1), x_ij^(2), ... spent in state i at the 1st, 2nd, ... visit to (i, j) (all independent and identically distributed within each string);
3. if the time u elapsed since the last earthquake is right-censored, then, introducing the future unobservable state j*, the full likelihood factorizes as
L_full = ∏_{i,j} p_ij^{N_ij} × ∏_{i,j} ∏_{ν=1}^{N_ij} f(x_ij^(ν) | α_ij, θ_ij) × p_{j_n j*} S(u | α_{j_n j*}, θ_{j_n j*}),
where S is the Weibull survival function.
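A direct transcription of this factorization into a log-likelihood, assuming the shape/scale Weibull parameterization used above; the helper name and the data layout are hypothetical:

```python
import numpy as np
from scipy.stats import weibull_min

def full_log_likelihood(P, alpha, theta, times, u, last_state, next_state):
    """log L_full: transition part, Weibull sojourn part, and the right-censored last interval.

    times[i][j] holds the sojourn times observed for the string (i, j); u is the censored
    time since the last event (whose class is last_state); next_state is the future state j*.
    """
    ll = 0.0
    for i in range(3):
        for j in range(3):
            x = np.asarray(times[i][j], dtype=float)
            if x.size:
                ll += x.size * np.log(P[i, j])                                       # p_ij^{N_ij}
                ll += weibull_min.logpdf(x, c=alpha[i, j], scale=theta[i, j]).sum()  # f(x_ij^(nu) | alpha_ij, theta_ij)
    ll += np.log(P[last_state, next_state])                                          # p_{j_n, j*}
    ll += weibull_min.logsf(u, c=alpha[last_state, next_state],
                            scale=theta[last_state, next_state])                     # survival term S(u | ...)
    return float(ll)
```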

26-27 JAGS implementation
For this Bayesian statistical model, JAGS is able to run an exact Gibbs sampler on (P, α, θ, j*):
- j* has a discrete prior (p_{j_n,1}, p_{j_n,2}, p_{j_n,3});
- (N_11, N_12, N_13) is multinomial-distributed with probability vector (p_11, p_12, p_13) ∼ Dirichlet (and similarly for the other rows);
- the exactly observed times x_ij^(1), ..., x_ij^(N_ij) are Weibull;
- the last right-censored time u is handled by a dedicated instruction of the JAGS language;
- if m = 0 or 1, the non-standard prior of α is coded using the zeros trick;
- the other prior distributions are routine.
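JAGS builds the sampler automatically; purely as an illustration of the structure, here is a hand-written sketch of the full conditional of the latent future state j* implied by the factorization above (hypothetical helper, shape/scale Weibull parameterization assumed, rng is a numpy Generator):

```python
import numpy as np
from scipy.stats import weibull_min

def sample_next_state(rng, P, alpha, theta, last_state, u):
    """Draw j* from its full conditional, proportional to p_{j_n, j} * S_{j_n, j}(u)."""
    sf = weibull_min.sf(u, c=alpha[last_state], scale=theta[last_state])  # survival of each candidate string at u
    w = P[last_state] * sf
    return rng.choice(3, p=w / w.sum())
```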

28 Historical and current datasets | Model fitting | Parameter estimates | Forecasting | Time or slip predictable models
[Table: (…)] [Table: (…)] Table contents lost in extraction.

29 95% posterior predictive intervals of the inter-occurrence times (days): posterior predictive mean, median, 2.5% and 97.5% quantiles for each pair of states (1,1), (2,1), (3,1), (1,2), (2,2), (3,2), (1,3), (2,3), (3,3). [Values lost in extraction.]

30-31 Table: posterior means followed by (sd) of the shape parameters α_ij. [Means lost in extraction; the sd are 0.102, 0.252, 0.103, 0.173, 0.232, 0.473, 0.179, 0.150, 0.618.]
Table: posterior means of the transition matrix P. [Values lost in extraction.]

32-33 Forecasting
To predict what type of event is most likely to occur and when, use the cross state probability P^{ij}_{t_0}(x): the probability that the next earthquake occurs within a time interval x and is of a given magnitude class j, conditionally on the waiting time t_0 elapsed since the last earthquake and on its magnitude class i.
In the CPTI04 catalogue, the last recorded event was in class i = 2 and had occurred about t_0 = 32 months earlier.
[Table: estimated P^{2j}_{t_0}(x) for j = 1, 2, 3 and x = 1, ..., 6 months and 1, ..., 4 years; values lost in extraction.]
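One common way to write the one-step cross state probability of a semi-Markov process, taken here as an assumption about the exact definition used in the talk, is P^{ij}_{t_0}(x) = p_ij [F_ij(t_0 + x) - F_ij(t_0)] / Σ_k p_ik [1 - F_ik(t_0)]. A sketch that evaluates it for fixed parameters follows; in the Bayesian analysis this quantity would be averaged over the posterior draws of (P, α, θ):

```python
from scipy.stats import weibull_min

def cross_state_probability(P, alpha, theta, i, j, t0, x):
    """P^{ij}_{t0}(x): probability that the next event is of class j and occurs within x,
    given that the last event was of class i and happened t0 time units ago."""
    cdf = lambda a, b, t: weibull_min.cdf(t, c=alpha[a, b], scale=theta[a, b])
    num = P[i, j] * (cdf(i, j, t0 + x) - cdf(i, j, t0))
    den = sum(P[i, k] * (1.0 - cdf(i, k, t0)) for k in range(3))
    return num / den
```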

34-36 To assess the predictive capability of the model, the cross state probabilities are re-estimated using only the data up to 31 December 2001, 31 December 2000, and so on backwards down to 31 December 1992.
Good predictive performance for …, bad predictive performance for … [the specific cases are lost in extraction].

37 Which mechanism governs the earthquake generation: a time predictable or a slip predictable model? Look at the transition matrix and at the cross state probabilities to a given state.

38-41 In a time predictable model, when a maximal energy threshold is reached some fraction of the energy is released; therefore the waiting time distribution depends on the current event type, but not on the next event type. The strength of an event does not depend on the strength of the previous one:
- P has equal rows;
- given i, the P^{ij}_{t_0}(x) are proportional to each other for j = 1, 2, 3, for all x.
[Plot: ratios of pairs of CSPs from class 1, namely (1,1) to (1,2), (1,1) to (1,3), and (1,2) to (1,3), over horizons from 1 month to 9 years.]
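Under the one-step form of the cross state probability sketched earlier (itself an assumption), the proportionality claim follows in one line: if the sojourn distribution depends only on the current class, F_ij = F_i, then

```latex
P^{ij}_{t_0}(x)
  = \frac{p_{ij}\,[F_i(t_0+x)-F_i(t_0)]}{\sum_k p_{ik}\,[1-F_i(t_0)]}
  = p_{ij}\,\frac{F_i(t_0+x)-F_i(t_0)}{1-F_i(t_0)},
\qquad
\frac{P^{ij}_{t_0}(x)}{P^{ij'}_{t_0}(x)} = \frac{p_{ij}}{p_{ij'}}
\quad \text{for all } x .
```

So the plotted ratios should be flat in x if the time predictable hypothesis holds.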

42-45 In a slip predictable model, after an earthquake the energy falls to a minimal threshold and increases until the next event, after which it starts to increase again from the same threshold. Here, the magnitude of an event depends on the length of the waiting time, but not on the magnitude of the previous one:
- P has equal rows;
- given j, the P^{ij}_{t_0}(x) are equal to each other for i = 1, 2, 3, for all x.
[Plot: ratios of pairs of CSPs to class 2, namely (1,2) to (2,2), (1,2) to (3,2), and (2,2) to (3,2), over horizons from 1 month to 9 years.]

46 Neither an SPM nor a TPM seems to be supported by the posterior distributions of the parameters, because of the behavior of the waiting times.

47-49 An example cost function | When to start the jobs? | How long should they last?
Two kinds of maintenance works on buildings:
- restoration of the strength the building had before the damage occurred: a short/medium-term decision;
- strengthening of the building (i.e. increasing its strength): a long-term decision.
Focus on restoration. The cost to be minimized is a function of the time to the next earthquake, given the earthquake type and the restoration starting and ending times.

50-51 c_R = restoration cost
c_E(j) = collapse cost (due to the next earthquake, of type j)
a_1, a_2 = restoration starting and ending times (decision parameters), a_1 ≤ a_2
τ = time of the next earthquake
All time parameters have as origin the time of occurrence of the latest earthquake, whose type is J_n = i. Assume c_R < c_E(j).
C(τ, j; a_1, c_R, a_2, c_E(j)) =
  c_R                                                              if τ > a_2
  [(τ - a_1)/(a_2 - a_1)] c_R + [(a_2 - τ)/(a_2 - a_1)] c_E(j)     if a_1 ≤ τ ≤ a_2
  c_E(j)                                                           if τ < a_1
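The piecewise cost transcribes directly into code; this is a sketch, and representing c_E as a sequence indexed by class is a convenience, not the talk's notation:

```python
def cost(tau, j, a1, a2, c_R, c_E):
    """Cost of collapse or restoration for a next earthquake of type j at time tau,
    when restoration is carried out on the interval [a1, a2]."""
    if tau > a2:                   # restoration completed before the earthquake
        return c_R
    if tau < a1:                   # earthquake strikes before restoration starts
        return c_E[j]
    w = (tau - a1) / (a2 - a1)     # fraction of the restoration already carried out
    return w * c_R + (1.0 - w) * c_E[j]
```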

52 [Plot: cost as a function of the time of the next earthquake, with the restoration start and end times marked.]

53 [Surface plot: cost of collapse or restoration as a function of the restoration starting and ending times, given the next earthquake time and type.]

54-58 Average the surfaces pointwise using p_{J*, τ | J_n = i}(j, t) Δt ≃ ΔP^{ij}_t, where ΔP^{ij}_t = P(J* = j, τ ∈ (t, t + Δt] | J_n = i).
Unfortunately, the average surface preserves the same monotonicity patterns as the individual surfaces, and the best decision would be a_1 = a_2 = 0: not feasible.
But one has to choose (a_1, a_2) so as to meet external constraints at an acceptable nonzero cost (risk).
Or one can assume c_R > c_E(1), for earthquake type 1 only, so that not all surfaces have the same monotonicity pattern.
BUT: everything about this decision problem remains to be done.
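A sketch of the pointwise averaging, reusing the hypothetical cross_state_probability and cost helpers from the earlier blocks; the time grid, the horizon, and the use of cross-state-probability increments as the weights P(J* = j, τ ∈ (t, t + Δt] | J_n = i) are all assumptions:

```python
import numpy as np

def expected_cost(a1, a2, i, t0, P, alpha, theta, c_R, c_E, horizon=3650.0, dt=10.0):
    """Average the cost surface pointwise over event types j and time bins (t, t + dt]."""
    total = 0.0
    for t in np.arange(0.0, horizon, dt):
        for j in range(3):
            w = (cross_state_probability(P, alpha, theta, i, j, t0, t + dt)
                 - cross_state_probability(P, alpha, theta, i, j, t0, t))  # P(J*=j, tau in (t, t+dt] | ...)
            total += w * cost(t + dt / 2.0, j, a1, a2, c_R, c_E)
    remaining = 1.0 - sum(cross_state_probability(P, alpha, theta, i, j, t0, horizon)
                          for j in range(3))
    return total + remaining * c_R  # beyond the horizon only the restoration cost is paid (assumes a2 <= horizon)
```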

59-60 I. Epifani, L. Ladelli, A. Pievatolo (2013). Bayesian estimation for a parametric Markov Renewal model applied to seismic data. arXiv:1301.6494 [stat.ME]. Published as: Electronic Journal of Statistics 8 (2014), 2264-2295, DOI: 10.1214/14-EJS952.
Thank you for your attention!
