Monitoring and data filtering II. Dynamic Linear Models


1 Monitoring and data filtering II. Dynamic Linear Models
Advanced Herd Management
Cécile Cornou, IPH

2 Program for monitoring and data filtering
Friday 26 (morning)
- Lecture for part I: use of control charts
- Exercise 1 with R
Tuesday 28 (today)
- Lecture for part II: Dynamic Linear Models (DLMs)
- Exercise 2 with R
- Introduction to the mandatory report
Tuesday 28 (afternoon)
- Thomas N. Madsen presents an application of DLM
- Exercises (and MR?)
Friday 03 (morning)
- The Kalman Filter and its relation to other techniques (sum-up of the methods introduced)
- Mandatory report

3 Outline
- Introduction to the DLM (West and Harrison, chapter 2)
- Updating equations: the Kalman Filter
- The discount factor as an aid to choosing W
- Incorporating external information: intervention
- General form of the DLM
- Examples
- Concluding remarks

4 Introduction to the DLM: a simple DLM
Time series Y_t = (y_1, ..., y_n)
Observation equation: y_t = mu_t + v_t, v_t ~ N(0, V_t)
As before, v_t is the sum of the sampling and observation errors: v_t = e_s + e_o
The symbol mu_t is the underlying true value at time t.
System equation: mu_t = mu_{t-1} + w_t, w_t ~ N(0, W_t)
The true value is no longer assumed to be constant, which is a fair assumption in animal production.
Basically, we wish to detect large changes in mu_t.

5 Introduction to the DLM: a DLM with a trend
Time series Y_t = (y_1, ..., y_n)
Observation equation: y_t = mu_t + v_t, v_t ~ N(0, V_t)
System equations:
mu_t = mu_{t-1} + beta_{t-1} + w_1t, w_1t ~ N(0, W_1t)
beta_t = beta_{t-1} + w_2t, w_2t ~ N(0, W_2t)

6 Introduction to the DLM: updating equations (Kalman Filter)
(a) Posterior for mu_{t-1}: (mu_{t-1} | D_{t-1}) ~ N(m_{t-1}, C_{t-1})
(b) Prior for mu_t: (mu_t | D_{t-1}) ~ N(m_{t-1}, R_t), with R_t = C_{t-1} + W_t
(c) One-step forecast: (Y_t | D_{t-1}) ~ N(f_t, Q_t), with f_t = m_{t-1} and Q_t = R_t + V_t
(d) Posterior for mu_t: (mu_t | D_t) ~ N(m_t, C_t), with
m_t = m_{t-1} + A_t e_t, C_t = A_t V_t, A_t = R_t / Q_t, e_t = Y_t - f_t
Initial information: (mu_0 | D_0) ~ N(m_0, C_0)
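The updating cycle (a)-(d) can be sketched in a few lines of Python; the starting values (m0 = 0, C0 = 100, V = 1, W = 0.1) and the short series are illustrative, not from the lecture:

```python
def kf_update(m_prev, C_prev, y, V, W):
    """One Kalman filter step for the local level DLM:
    y_t = mu_t + v_t,  mu_t = mu_{t-1} + w_t."""
    R = C_prev + W        # (b) prior variance of mu_t
    f = m_prev            # (c) one-step forecast mean
    Q = R + V             # (c) one-step forecast variance
    e = y - f             # forecast error
    A = R / Q             # adaptive coefficient
    m = m_prev + A * e    # (d) posterior mean
    C = A * V             # (d) posterior variance
    return m, C, f, Q, e

# Filter a short illustrative series, starting from (m0, C0) = (0, 100):
m, C = 0.0, 100.0
for y in [10.2, 9.8, 10.5, 14.9, 15.1]:
    m, C, f, Q, e = kf_update(m, C, y, V=1.0, W=0.1)
```

Note how A_t weighs the forecast error: a large prior variance R_t relative to Q_t makes the filter trust the new observation more.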

7 Introduction to the DLM: the discount factor as an aid to choosing W_t
To run the model (assumed here to have constant parameters) we need: m_0, C_0, V, W.
A discount factor can be used if W is unknown, because W is taken to be a fixed proportion of C (West & Harrison, 2.4.2):
R_t = C_{t-1} + W becomes R_t = C_{t-1} / delta
Typically 0.8 < delta < 1.
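A minimal Python variant of the previous sketch where W is replaced by a discount factor delta (values illustrative):

```python
def kf_update_discount(m_prev, C_prev, y, V, delta=0.9):
    """Kalman step where W is set through a discount factor:
    R_t = C_{t-1} / delta, i.e. W_t = C_{t-1} * (1 - delta) / delta."""
    R = C_prev / delta            # inflate the prior variance
    Q = R + V                     # one-step forecast variance
    A = R / Q                     # adaptive coefficient
    m = m_prev + A * (y - m_prev) # posterior mean
    C = A * V                     # posterior variance
    return m, C
```

With delta = 1 no variance is added (a static mean); smaller delta discounts the past more heavily.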

8 Introduction to the DLM: incorporating external information (intervention)
Types of external information:
1. Known effect, experienced before (e.g. a change of breed for which we know the different performances). We want the model to adapt to the new, known conditions.
2. Unknown effect (e.g. a heat wave, or the introduction of new animals into a group). We want the model to adapt to the new, unknown conditions.
3. Unknown effect that we want to measure (e.g. a change of feed composition, new veterinary treatments). We want to measure the effect of a voluntary change.

9 Introduction to the DLM: intervention, 1. known effect
We want the model to adapt to the new, known conditions.
Example: the KURIT example from West and Harrison (2.3.2).
- Estimated mean after the change: 286 (vs. 143), i.e. an expected change of 143 (= 286 - 143)
- Uncertainty about the change: from 80 (pessimistic) to 200 (optimistic)
- sigma = 30 (= (200 - 80) / 4): 4 standard deviations span the 95% interval
- Associated variance (uncertainty) = 30^2 = 900, so (omega_10 | D_9, S_9) ~ N(143, 900)
Revised prior (mu_t | D_{t-1}) ~ N(m_{t-1}, R_t): (mu_10 | D_9, S_9) ~ N(286, 920), with prior mean 143 + 143 = 286 and R_10 = C_9 + W_10 + 900 = 920

10 Introduction to the DLM: intervention, 2. unknown effect (1/2)
We want the model to adapt to the new, unknown conditions.
Example 1: a heat wave. We cannot adjust in advance because we do not know the exact effect.
Example 2: introduction of new animals into a group. Consider a method aimed at detecting oestrus by monitoring animal behaviour. Incoming animals may modify the behaviour of the group. Here, the intervention aims to increase the model's adaptation to the new behaviour, so as to avoid alarms caused by a known event.

11 Introduction to the DLM: intervention, 2. unknown effect (2/2)
In practice we can temporarily reduce the value of the discount factor so that the evolution variance increases.
We thereby put more weight on the new observations and "forget" the past.
See also the eating-rank example.

12 Introduction to the DLM: intervention, 3. unknown effect we want to measure
We want to measure the effect of a voluntary change.
Example: a new feed is used and we want to estimate the associated change in daily gain. We know that the new feed is used from time tau (0 < tau < n):
y_t = mu_t + lambda_t I_t + v_t, v_t ~ N(0, V_t)
mu_t = mu_{t-1} + w_t, w_t ~ N(0, W_t)
with I_t the intervention effect that we want to measure, lambda_t = 0 when t < tau and lambda_t = 1 when t >= tau.

13 The general DLM
Generalisation from the first-order polynomial model, the simplest and most widely used DLM.
Matrix notation allows us to present the DLM in a general form and to treat more complex cases.
Four examples of application:
- Monitoring activity level
- Monitoring activity types (MPKF)
- Monitoring eating behaviour
- Monitoring daily gain

14 Modeling of the variable
Dynamic Linear Models (DLMs) combined with the Kalman Filter (KF).
Let Y_t = (y_1, ..., y_n)' be a vector of key figures observed at time t.
Let theta_t = (theta_1, ..., theta_m)' be a vector of parameters describing the system at time t.
General form of the DLM:
Observation equation: Y_t = F_t' theta_t + nu_t, nu_t ~ N(0, V_t)
System equation: theta_t = G_t theta_{t-1} + omega_t, omega_t ~ N(0, W_t)
The DLM combined with the Kalman Filter estimates the underlying state vector theta_t by its mean vector m_t and its variance-covariance matrix C_t.
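The same updating cycle as before can be written once in matrix form. This NumPy sketch assumes a univariate observation Y_t; the dimensions and names are illustrative:

```python
import numpy as np

def dlm_step(m, C, y, F, G, V, W):
    """One Kalman filter step for the general DLM with a univariate
    observation: Y_t = F' theta_t + v_t,  theta_t = G theta_{t-1} + w_t.
    m, C are the posterior mean/covariance for theta_{t-1}."""
    a = G @ m                       # prior mean for theta_t
    R = G @ C @ G.T + W             # prior covariance
    f = F @ a                       # one-step forecast mean
    Q = F @ R @ F + V               # one-step forecast variance (scalar)
    e = y - f                       # forecast error
    A = R @ F / Q                   # adaptive (gain) vector
    m_new = a + A * e               # posterior mean
    C_new = R - np.outer(A, A) * Q  # posterior covariance
    return m_new, C_new, f, Q, e
```

With F = (1), G = (1) this reduces exactly to the scalar local level recursions of slide 6.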

15 Matrix specification for the general DLM (1/3)
How does the general form relate to the local level model?
A short note on matrix multiplication: for matrices A and B, the product C = AB has entries c_ij = sum_k a_ik b_kj.

16 Matrix specification for the general DLM (2/3)
Observation equation:
theta_t is the latent process: theta_t = (mu_t, 0)'
F_t is the design matrix: F_t' = (1, 0)
Y_t = F_t' theta_t + v_t = 1 * mu_t + 0 * 0 + v_t = mu_t + v_t

17 Matrix specification for the general DLM (3/3)
System equation:
theta_t is the latent process: theta_t = (mu_t, 0)'
G_t is the system matrix: G_t = I (the identity)
theta_t = G_t theta_{t-1} + w_t, whose first component gives mu_t = mu_{t-1} + w_t

18 Monitoring deviations from the model
Elements from the KF used in monitoring deviations:
- f_t: one-step forecast mean
- e_t: one-step forecast error (e_t = Y_t - f_t)
- Q_t: one-step forecast variance
Monitoring methods:
- V-mask
- Tabular cusum
- Multi-Process Kalman Filter

19 Monitoring deviations from the model
V-mask (parameters d and Psi): applied to the cumulative sum (cusum) of the standardized errors
u_t = e_t / sqrt(Q_t), C_t = sum_{i=1}^{t} u_i = C_{t-1} + u_t
Tabular cusum (parameters K and H): accumulate the u_t in a cusum, using a reference value K; alarm when the cusum exceeds a decision interval H.
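A two-sided tabular cusum on the standardized forecast errors can be sketched as follows; the K and H values are illustrative defaults, not the lecture's calibration:

```python
import math

def tabular_cusum(errors, variances, K=0.5, H=5.0):
    """Two-sided tabular cusum on standardized forecast errors
    u_t = e_t / sqrt(Q_t); alarm when either side exceeds H."""
    hi = lo = 0.0
    alarms = []
    for t, (e, Q) in enumerate(zip(errors, variances)):
        u = e / math.sqrt(Q)
        hi = max(0.0, hi + u - K)   # accumulate upward deviations
        lo = max(0.0, lo - u - K)   # accumulate downward deviations
        if hi > H or lo > H:
            alarms.append(t)
            hi = lo = 0.0           # restart after an alarm
    return alarms
```

The reference value K absorbs small errors, so only persistent deviations from the forecasts accumulate towards an alarm.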

20 Example 1. Monitoring activity level
Context: development of group housing in the EU as a result of Council Directive 2001/88/EC; difficulties identifying and accessing individual sows.
Idea: store data in a chip and transmit the information to the farmer's PC; a sensor in the chip allows the activity of the sow to be monitored.
Assumption: the body activity of sows is expected to change around the onset of oestrus.
Objective: develop an automated oestrus detection method for group-housed sows using the sows' acceleration measurements.
Method: use Dynamic Linear Models to model the sows' activity, and control methods that detect model deviations at the onset of oestrus.

21 Oestrus detection
Oestrus detection I (from day 4): BPT (3 times/day) in the mating section; sows not inseminated are transferred to the gestation section.
Oestrus detection II (from day 21): BPT (3 times/day) in the gestation section.
Golden standard: detect whether the activity pattern changes at the onset of oestrus.
(Timeline figure: weaning at d0; BPT I around d4-d7; transfer around d10; BPT II from d21 to d30; acceleration measured throughout.)

22 Data collection
Place, animals, housing and feeding: one production herd (March); sows housed in groups.
Activity measurements: acceleration in 2 and 3 dimensions; four measurements per second; transfer to PC via Bluetooth.
Video recordings: four cameras used as webcams.
Oestrus detection (golden standard): detect whether the activity pattern changes at the onset of oestrus.

23 Definition of the DLM
Use hourly averages of the length of the acceleration vector:
Y_t = acc = sqrt(acc_x^2 + acc_y^2 + acc_z^2)
theta_t = (mu_t, 0)', F_t' = (1, 0)
G_t = I
V_t: unknown and constant
W_t = 0 (in normal conditions: no change in activity)
The model is initialized by means of reference analysis.
The observations (Y_t) are weighted by the number of observations per hour.
Missing observation: e_t = 0.
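Computing the observation Y_t (the hourly mean of the acceleration vector length, together with the weight) could look like this; the helper names are hypothetical:

```python
import math

def acc_length(ax, ay, az):
    """Length of the 3-D acceleration vector."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def hourly_average(samples):
    """Hourly observation Y_t: the mean vector length over that hour's
    samples, plus the number of observations so the model can weight
    Y_t by it.  An empty hour is a missing observation."""
    lengths = [acc_length(*s) for s in samples]
    if not lengths:
        return None, 0
    return sum(lengths) / len(lengths), len(lengths)
```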

24 Illustration
(Figures: the fitted model, the cusum, the V-mask and the tabular cusum.)

25 Example 2. Monitoring sows' activity types
Assumption: a sow's behaviour is affected by its physiological state / illness (oestrus: increase in activity; lameness: walking). An accelerometer can measure at any time, during the whole reproductive cycle. (Figures: daily activity during anoestrus vs. oestrus.)
Objective: develop a method that automatically classifies sows' activity types:
- model selected activity types using DLMs
- classify each activity type using a Multi-Process Kalman Filter

26 Time series and activity types
Extracts from time series of acceleration are associated with five activity types: feeding (FE), rooting (RO), walking (WA), lying sternally (LS) and lying laterally (LL).
Three dimensions (X, Y, Z): ACC = sqrt(acc_x^2 + acc_y^2 + acc_z^2)
Two data sets (the activities fill the whole data set, with no overlapping):
- Learning data set: 10 minutes of each activity type, used to estimate the model parameters
- Test data set: 10 x 2 minutes of each activity type, used to implement the classification method
(Figure: example traces for X, Y, Z and ACC.)

27 Acceleration data (figure)

28 Modeling each activity type
Use per-second averages of the acceleration data.
theta_t = (mu_t, s_t, c_t)'
F_t' = (1, sin(2*pi*t/T), cos(2*pi*t/T))
G_t = I
The model includes a periodic movement (cyclic components).
V and W: estimated using the EM algorithm on the learning data set.
20 DLMs: 5 activities x 4 axes (X, Y, Z, ACC).
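The cyclic design vector F_t' = (1, sin(2*pi*t/T), cos(2*pi*t/T)) and the resulting mean of Y_t can be sketched as follows (the state values are illustrative):

```python
import math

def harmonic_F(t, T):
    """Design vector F_t' = (1, sin(2*pi*t/T), cos(2*pi*t/T))."""
    w = 2.0 * math.pi * t / T
    return [1.0, math.sin(w), math.cos(w)]

def mean_at(t, T, theta):
    """Mean of Y_t under state theta = (mu, s, c)': F_t' theta."""
    return sum(f * x for f, x in zip(harmonic_F(t, T), theta))
```

One sine-cosine pair gives a single harmonic of period T; the level mu shifts the whole cycle up or down.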

29 Classification method
Multi-Process Kalman Filter of class I. At time t, each DLM is analysed using the updating equations of the Kalman Filter: one-step forecast mean f_t and one-step forecast variance Q_t. Posterior probabilities are then estimated for each DLM:
p_t(i) proportional to phi_t(i) * p_{t-1}(i)
where phi_t(i) is the forecast density of the observation under model i.
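The probability update p_t(i) proportional to phi_t(i) * p_{t-1}(i) can be sketched as follows, assuming a univariate normal forecast density N(f_t(i), Q_t(i)) for each model:

```python
import math

def normal_pdf(y, f, Q):
    """Density of N(f, Q) at y."""
    return math.exp(-0.5 * (y - f) ** 2 / Q) / math.sqrt(2.0 * math.pi * Q)

def update_model_probs(p_prev, forecasts, y):
    """Posterior model probabilities p_t(i), proportional to
    phi_t(i) * p_{t-1}(i), with phi_t(i) = N(y; f_t(i), Q_t(i)).
    forecasts is a list of (f, Q) pairs, one per model."""
    weights = [normal_pdf(y, f, Q) * p
               for (f, Q), p in zip(forecasts, p_prev)]
    total = sum(weights)
    return [w / total for w in weights]
```

At each time step the observation is assigned to the activity type (model) with the highest posterior probability.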

30 Illustration: walking activity
Results:
- FE: best recognized
- WA: axis Z better
- RO: slow recognition
- LL: axis Y better
- LS: well recognized
Perspectives: applications? Active vs. passive states, illness, parturition.
(Figure: classification along axis Z.)

31 Example 3. Modeling feeding behaviour
Assumption: a sow's feeding behaviour is affected by oestrus and illness. Currently, a list of sows that have not eaten is used to identify individuals.
Objective: develop a method that automatically detects oestrus, lameness and other health disorders for sows fed by an Electronic Sow Feeder (ESF):
- model feeding behaviour (feeding rank) using a DLM
- detect deviations by means of a control chart

32 Data collection
Place, housing and feeding: three production herds, January 2005 to January 2006; herds 1 and 2: dynamic groups; herd 3: static groups; Electronic Sow Feeders (ESF).
Registration: ESF visits, oestrus (BPT), lameness and health disorders.

33 Modeling of the variable
Use the daily feeding rank (Y_t):
theta_t = (mu_t, beta_t)', F_t' = (1, 0)
G_t = (1 1; 0 1) (linear growth)
V_t: assumed unknown and constant
W_t: estimated by a discount factor
Missing observation: e_t = 0
External information: when a subgroup of sows enters or leaves a group (or both), lower the discount factor.
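Assuming the standard linear growth form for G_t, the one-step forecast of the feeding rank propagates the level by the slope; the state values below are illustrative:

```python
import numpy as np

# Linear growth (trend) DLM, assumed standard form:
G = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # system matrix: the level picks up the slope
F = np.array([1.0, 0.0])     # design vector: only the level is observed

m = np.array([3.0, 0.5])     # illustrative posterior: level 3, slope 0.5
a = G @ m                    # prior mean for theta_t: level 3.5, slope 0.5
f = F @ a                    # one-step forecast of the feeding rank
```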

34 Detection method
Optimization of the V-mask parameters for three conditions: i) oestrus, ii) lameness, iii) other health disorders.
Criteria: a sensitivity of at least 50%, and a minimum number of false positives.

35 Illustration: intervention
(Figures: individual eating rank and model forecast when a subgroup enters, and when a subgroup enters and leaves; the adaptive coefficient is increased at the intervention.)

36 Results
i) Oestrus detection: sensitivity ranges from 59 to 75% (vs. the list of sows: 9 to 20%).
ii) Lameness and iii) other health disorders: sensitivity ranges from 41 to 70% (vs. the list of sows: 22 to 39%); too many false alarms.
Perspectives: include other variables (e.g. ear-base temperature, activity) in a multivariate model.

37 Example 4. Daily gain (from the first lecture)
Matrix specification; includes seasonal components.

38 Example 4. Daily gain (from the first lecture)
(Figures: "Daily gain, slaughter pigs", quarterly data. Panels: observed vs. predicted gain; trend/level; seasonal components (seasons 1-4); forecast errors with lower and upper limits.)

39 Concluding remarks
Different models were presented: the simple local level model, the DLM in its general form, and several examples.
The general form of the model allows cyclic patterns to be included (as for eating activity and daily gain). Thomas Nejsum Madsen will present an approach based on sine functions to incorporate a diurnal pattern.
The output need not be graphs: automatic alarms can be used (as with the V-mask).
There are many handles to adjust, which can be dangerous: always combine the models with your knowledge of animal production.


More information

Bayesian Dynamic Linear Modelling for. Complex Computer Models

Bayesian Dynamic Linear Modelling for. Complex Computer Models Bayesian Dynamic Linear Modelling for Complex Computer Models Fei Liu, Liang Zhang, Mike West Abstract Computer models may have functional outputs. With no loss of generality, we assume that a single computer

More information

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M.

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M. TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION

More information

Machine Learning. Gaussian Mixture Models. Zhiyao Duan & Bryan Pardo, Machine Learning: EECS 349 Fall

Machine Learning. Gaussian Mixture Models. Zhiyao Duan & Bryan Pardo, Machine Learning: EECS 349 Fall Machine Learning Gaussian Mixture Models Zhiyao Duan & Bryan Pardo, Machine Learning: EECS 349 Fall 2012 1 The Generative Model POV We think of the data as being generated from some process. We assume

More information

Lecture 16: State Space Model and Kalman Filter Bus 41910, Time Series Analysis, Mr. R. Tsay

Lecture 16: State Space Model and Kalman Filter Bus 41910, Time Series Analysis, Mr. R. Tsay Lecture 6: State Space Model and Kalman Filter Bus 490, Time Series Analysis, Mr R Tsay A state space model consists of two equations: S t+ F S t + Ge t+, () Z t HS t + ɛ t (2) where S t is a state vector

More information

STA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series ( 1.4, 1.5)

STA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series ( 1.4, 1.5) STA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series ( 1.4, 1.5) Outline 1 Announcements 2 Autocorrelation and Cross-Correlation 3 Stationary Time Series 4 Homework 1c Arthur Berg STA

More information

Data assimilation with and without a model

Data assimilation with and without a model Data assimilation with and without a model Tim Sauer George Mason University Parameter estimation and UQ U. Pittsburgh Mar. 5, 2017 Partially supported by NSF Most of this work is due to: Tyrus Berry,

More information

Gibbs Sampling in Linear Models #2

Gibbs Sampling in Linear Models #2 Gibbs Sampling in Linear Models #2 Econ 690 Purdue University Outline 1 Linear Regression Model with a Changepoint Example with Temperature Data 2 The Seemingly Unrelated Regressions Model 3 Gibbs sampling

More information

Advanced Herd Management Probabilities and distributions

Advanced Herd Management Probabilities and distributions Advanced Herd Management Probabilities and distributions Anders Ringgaard Kristensen Slide 1 Outline Probabilities Conditional probabilities Bayes theorem Distributions Discrete Continuous Distribution

More information

University of Cambridge Engineering Part IIB Module 3F3: Signal and Pattern Processing Handout 2:. The Multivariate Gaussian & Decision Boundaries

University of Cambridge Engineering Part IIB Module 3F3: Signal and Pattern Processing Handout 2:. The Multivariate Gaussian & Decision Boundaries University of Cambridge Engineering Part IIB Module 3F3: Signal and Pattern Processing Handout :. The Multivariate Gaussian & Decision Boundaries..15.1.5 1 8 6 6 8 1 Mark Gales mjfg@eng.cam.ac.uk Lent

More information

15-780: Grad AI Lecture 17: Probability. Geoff Gordon (this lecture) Tuomas Sandholm TAs Erik Zawadzki, Abe Othman

15-780: Grad AI Lecture 17: Probability. Geoff Gordon (this lecture) Tuomas Sandholm TAs Erik Zawadzki, Abe Othman 15-780: Grad AI Lecture 17: Probability Geoff Gordon (this lecture) Tuomas Sandholm TAs Erik Zawadzki, Abe Othman Review: probability RVs, events, sample space Ω Measures, distributions disjoint union

More information

Engineering Part IIB: Module 4F10 Statistical Pattern Processing Lecture 5: Single Layer Perceptrons & Estimating Linear Classifiers

Engineering Part IIB: Module 4F10 Statistical Pattern Processing Lecture 5: Single Layer Perceptrons & Estimating Linear Classifiers Engineering Part IIB: Module 4F0 Statistical Pattern Processing Lecture 5: Single Layer Perceptrons & Estimating Linear Classifiers Phil Woodland: pcw@eng.cam.ac.uk Michaelmas 202 Engineering Part IIB:

More information

Discriminant analysis and supervised classification

Discriminant analysis and supervised classification Discriminant analysis and supervised classification Angela Montanari 1 Linear discriminant analysis Linear discriminant analysis (LDA) also known as Fisher s linear discriminant analysis or as Canonical

More information

Machine Learning - MT & 14. PCA and MDS

Machine Learning - MT & 14. PCA and MDS Machine Learning - MT 2016 13 & 14. PCA and MDS Varun Kanade University of Oxford November 21 & 23, 2016 Announcements Sheet 4 due this Friday by noon Practical 3 this week (continue next week if necessary)

More information

Denver International Airport MDSS Demonstration Verification Report for the Season

Denver International Airport MDSS Demonstration Verification Report for the Season Denver International Airport MDSS Demonstration Verification Report for the 2015-2016 Season Prepared by the University Corporation for Atmospheric Research Research Applications Division (RAL) Seth Linden

More information

Application of the Ensemble Kalman Filter to History Matching

Application of the Ensemble Kalman Filter to History Matching Application of the Ensemble Kalman Filter to History Matching Presented at Texas A&M, November 16,2010 Outline Philosophy EnKF for Data Assimilation Field History Match Using EnKF with Covariance Localization

More information

Unsupervised Learning with Permuted Data

Unsupervised Learning with Permuted Data Unsupervised Learning with Permuted Data Sergey Kirshner skirshne@ics.uci.edu Sridevi Parise sparise@ics.uci.edu Padhraic Smyth smyth@ics.uci.edu School of Information and Computer Science, University

More information

Geometric Image Manipulation

Geometric Image Manipulation Bruce A. Draper J. Ross Beveridge, January 24, 204 Geometric Image Manipulation Lecture (part a) January 24, 204 Bruce A. Draper J. Ross Beveridge, January 24, 204 Status Update Programming assignment

More information

Lecture 1a: Basic Concepts and Recaps

Lecture 1a: Basic Concepts and Recaps Lecture 1a: Basic Concepts and Recaps Cédric Archambeau Centre for Computational Statistics and Machine Learning Department of Computer Science University College London c.archambeau@cs.ucl.ac.uk Advanced

More information

Scalable robust hypothesis tests using graphical models

Scalable robust hypothesis tests using graphical models Scalable robust hypothesis tests using graphical models Umamahesh Srinivas ipal Group Meeting October 22, 2010 Binary hypothesis testing problem Random vector x = (x 1,...,x n ) R n generated from either

More information

Parametric Models. Dr. Shuang LIANG. School of Software Engineering TongJi University Fall, 2012

Parametric Models. Dr. Shuang LIANG. School of Software Engineering TongJi University Fall, 2012 Parametric Models Dr. Shuang LIANG School of Software Engineering TongJi University Fall, 2012 Today s Topics Maximum Likelihood Estimation Bayesian Density Estimation Today s Topics Maximum Likelihood

More information

Machine Learning! in just a few minutes. Jan Peters Gerhard Neumann

Machine Learning! in just a few minutes. Jan Peters Gerhard Neumann Machine Learning! in just a few minutes Jan Peters Gerhard Neumann 1 Purpose of this Lecture Foundations of machine learning tools for robotics We focus on regression methods and general principles Often

More information

Logic Design II (17.342) Spring Lecture Outline

Logic Design II (17.342) Spring Lecture Outline Logic Design II (17.342) Spring 2012 Lecture Outline Class # 10 April 12, 2012 Dohn Bowden 1 Today s Lecture First half of the class Circuits for Arithmetic Operations Chapter 18 Should finish at least

More information

You are allowed two hours to answer this question paper. All questions are compulsory.

You are allowed two hours to answer this question paper. All questions are compulsory. Examination Question and Answer Book Write here your full examination number Centre Code: Hall Code: Desk Number: Foundation Level 3c Business Mathematics FBSM 0 May 00 Day 1 late afternoon INSTRUCTIONS

More information

Forecasting. BUS 735: Business Decision Making and Research. exercises. Assess what we have learned

Forecasting. BUS 735: Business Decision Making and Research. exercises. Assess what we have learned Forecasting BUS 735: Business Decision Making and Research 1 1.1 Goals and Agenda Goals and Agenda Learning Objective Learn how to identify regularities in time series data Learn popular univariate time

More information

Lecture Outline. Target Tracking: Lecture 3 Maneuvering Target Tracking Issues. Maneuver Illustration. Maneuver Illustration. Maneuver Detection

Lecture Outline. Target Tracking: Lecture 3 Maneuvering Target Tracking Issues. Maneuver Illustration. Maneuver Illustration. Maneuver Detection REGLERTEKNIK Lecture Outline AUTOMATIC CONTROL Target Tracking: Lecture 3 Maneuvering Target Tracking Issues Maneuver Detection Emre Özkan emre@isy.liu.se Division of Automatic Control Department of Electrical

More information

ECO 513 Fall 2008 C.Sims KALMAN FILTER. s t = As t 1 + ε t Measurement equation : y t = Hs t + ν t. u t = r t. u 0 0 t 1 + y t = [ H I ] u t.

ECO 513 Fall 2008 C.Sims KALMAN FILTER. s t = As t 1 + ε t Measurement equation : y t = Hs t + ν t. u t = r t. u 0 0 t 1 + y t = [ H I ] u t. ECO 513 Fall 2008 C.Sims KALMAN FILTER Model in the form 1. THE KALMAN FILTER Plant equation : s t = As t 1 + ε t Measurement equation : y t = Hs t + ν t. Var(ε t ) = Ω, Var(ν t ) = Ξ. ε t ν t and (ε t,

More information

Dynamic models 1 Kalman filters, linearization,

Dynamic models 1 Kalman filters, linearization, Koller & Friedman: Chapter 16 Jordan: Chapters 13, 15 Uri Lerner s Thesis: Chapters 3,9 Dynamic models 1 Kalman filters, linearization, Switching KFs, Assumed density filters Probabilistic Graphical Models

More information

Robotics 2 Target Tracking. Kai Arras, Cyrill Stachniss, Maren Bennewitz, Wolfram Burgard

Robotics 2 Target Tracking. Kai Arras, Cyrill Stachniss, Maren Bennewitz, Wolfram Burgard Robotics 2 Target Tracking Kai Arras, Cyrill Stachniss, Maren Bennewitz, Wolfram Burgard Slides by Kai Arras, Gian Diego Tipaldi, v.1.1, Jan 2012 Chapter Contents Target Tracking Overview Applications

More information

Announcements Monday, November 13

Announcements Monday, November 13 Announcements Monday, November 13 The third midterm is on this Friday, November 17. The exam covers 3.1, 3.2, 5.1, 5.2, 5.3, and 5.5. About half the problems will be conceptual, and the other half computational.

More information

Computer Vision Group Prof. Daniel Cremers. 9. Gaussian Processes - Regression

Computer Vision Group Prof. Daniel Cremers. 9. Gaussian Processes - Regression Group Prof. Daniel Cremers 9. Gaussian Processes - Regression Repetition: Regularized Regression Before, we solved for w using the pseudoinverse. But: we can kernelize this problem as well! First step:

More information

EKF and SLAM. McGill COMP 765 Sept 18 th, 2017

EKF and SLAM. McGill COMP 765 Sept 18 th, 2017 EKF and SLAM McGill COMP 765 Sept 18 th, 2017 Outline News and information Instructions for paper presentations Continue on Kalman filter: EKF and extension to mapping Example of a real mapping system:

More information

Bayesian Inference. Chapter 4: Regression and Hierarchical Models

Bayesian Inference. Chapter 4: Regression and Hierarchical Models Bayesian Inference Chapter 4: Regression and Hierarchical Models Conchi Ausín and Mike Wiper Department of Statistics Universidad Carlos III de Madrid Advanced Statistics and Data Mining Summer School

More information

FRAPPÉ/DISCOVER-AQ (July/August 2014) in perspective of multi-year ozone analysis

FRAPPÉ/DISCOVER-AQ (July/August 2014) in perspective of multi-year ozone analysis FRAPPÉ/DISCOVER-AQ (July/August 2014) in perspective of multi-year ozone analysis Project Report #2: Monitoring network assessment for the City of Fort Collins Prepared by: Lisa Kaser kaser@ucar.edu ph:

More information

Adaptive Dual Control

Adaptive Dual Control Adaptive Dual Control Björn Wittenmark Department of Automatic Control, Lund Institute of Technology Box 118, S-221 00 Lund, Sweden email: bjorn@control.lth.se Keywords: Dual control, stochastic control,

More information

2.830J / 6.780J / ESD.63J Control of Manufacturing Processes (SMA 6303) Spring 2008

2.830J / 6.780J / ESD.63J Control of Manufacturing Processes (SMA 6303) Spring 2008 MIT OpenCourseWare http://ocw.mit.edu 2.830J / 6.780J / ESD.63J Control of Processes (SMA 6303) Spring 2008 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/term

More information

A short introduction to INLA and R-INLA

A short introduction to INLA and R-INLA A short introduction to INLA and R-INLA Integrated Nested Laplace Approximation Thomas Opitz, BioSP, INRA Avignon Workshop: Theory and practice of INLA and SPDE November 7, 2018 2/21 Plan for this talk

More information

Problem Set 2. MAS 622J/1.126J: Pattern Recognition and Analysis. Due: 5:00 p.m. on September 30

Problem Set 2. MAS 622J/1.126J: Pattern Recognition and Analysis. Due: 5:00 p.m. on September 30 Problem Set 2 MAS 622J/1.126J: Pattern Recognition and Analysis Due: 5:00 p.m. on September 30 [Note: All instructions to plot data or write a program should be carried out using Matlab. In order to maintain

More information

Introduction to Machine Learning

Introduction to Machine Learning Introduction to Machine Learning Brown University CSCI 1950-F, Spring 2012 Prof. Erik Sudderth Lecture 25: Markov Chain Monte Carlo (MCMC) Course Review and Advanced Topics Many figures courtesy Kevin

More information

Quantitative characters - exercises

Quantitative characters - exercises Quantitative characters - exercises 1. a) Calculate the genetic covariance between half sibs, expressed in the ij notation (Cockerham's notation), when up to loci are considered. b) Calculate the genetic

More information

Machine Learning (CS 567) Lecture 2

Machine Learning (CS 567) Lecture 2 Machine Learning (CS 567) Lecture 2 Time: T-Th 5:00pm - 6:20pm Location: GFS118 Instructor: Sofus A. Macskassy (macskass@usc.edu) Office: SAL 216 Office hours: by appointment Teaching assistant: Cheol

More information

The Development of Guidance for Forecast of. Maximum Precipitation Amount

The Development of Guidance for Forecast of. Maximum Precipitation Amount The Development of Guidance for Forecast of Maximum Precipitation Amount Satoshi Ebihara Numerical Prediction Division, JMA 1. Introduction Since 198, the Japan Meteorological Agency (JMA) has developed

More information