AS Automation and Systems Technology Project Work. Nonlinear filter design for combustion engine model


AS Automation and Systems Technology Project Work
Nonlinear filter design for combustion engine model

Department of Electrical Engineering and Automation, Aalto University
Start date: January 21, 2015
End date: May 28, 2015
Supervisor: Sergey Samokhin
Eero Siivola, ECTS: 5 cr
Dimitar Boyadzhiev, ECTS: 5 cr

Contents

1 Introduction
2 Turbocharged EGR Air-Path Model
3 Estimators
  3.1 Particle filter
    3.1.1 The Degeneracy Problem
  3.2 Moving Horizon Estimator (MHE)
    3.2.1 MHE problem
    3.2.2 Solution to the MHE problem
    3.2.3 Including control to a fast Moving Horizon Estimator
4 Results
  4.1 Particle Filter
  4.2 Moving Horizon Estimator
  4.3 Unscented Kalman Filter
5 Robustness testing
  5.1 Particle Filter
  5.2 Moving Horizon Estimator
  5.3 Unscented Kalman Filter
6 Comparison of the estimators
7 Time management
8 Risk management
9 Team work management
10 Conclusions and future work

1 Introduction

Internal combustion engines are highly nonlinear in nature, and the system states needed for their feedback control are not always measurable. This gives rise to the need for nonlinear filters that allow such unmeasurable states to be estimated. In the past few decades, the Extended Kalman Filter (EKF) has gained huge popularity and has been applied extensively to nonlinear systems. It has, however, a number of inherent drawbacks, such as being computationally expensive and difficult to implement due to the calculation of Jacobians, the need for good initial state estimates, and the reliance on linearisation. The Particle Filter (PF) and the Moving Horizon Estimator (MHE) are two filters that are known to perform better on nonlinear systems than the EKF. The goal of this project was therefore to design and implement the two aforementioned nonlinear filters in Matlab and to compare their performance on a given turbocharged internal combustion engine model.

The rest of the paper is organised as follows. Section 2 briefly describes the nonlinear engine model on which the filters are tested. Section 3 is divided into two subsections: subsection 3.1 describes the particle filter algorithm and subsection 3.2 the moving horizon estimator algorithm. Section 4 is divided into three subsections, 4.1, 4.2 and 4.3, which show the results of the particle filter, the moving horizon estimator and the Unscented Kalman Filter (UKF) applied to the internal combustion engine model, respectively. Section 5 evaluates the robustness of the three filters. Section 6 compares the three estimators. Section 7 summarises the time spent on the different tasks of the project. Section 8 summarises the assumed and actualised risks. Section 9 summarises team work management and co-operation with the project supervisor. The paper is concluded in Section 10.

2 Turbocharged EGR Air-Path Model

Figure 1 shows the air-path model of the engine. The main components of the model are the four control volumes denoted by $p_i$, $p_x$, $p_{egr}$ and $p_4$, the intake compressor C, the variable geometry turbocharger (VGT) in the EGR path, and several control valves.

[Figure 1: HPEGR diesel engine air-path block diagram. [1]]

The set of ordinary differential equations (ODEs) describing the model is [1]:

$$\dot{p}_i(t) = \frac{R_i T_i}{V_i}\left(W_{ci}(t) + W_{egr2}(t) - W_{ie}(t)\right) \tag{1a}$$
$$\dot{p}_x(t) = \frac{R_x T_x}{V_x}\left(W_{ie}(t) + W_f(t) - W_{exh}(t) - W_{egr1}(t)\right) \tag{1b}$$
$$\dot{p}_{egr}(t) = \frac{R_x T_3}{V_{egr}}\left(W_{egr1}(t) - W_{comp}(t) - W_{vgt}(t)\right) \tag{1c}$$
$$\dot{p}_4(t) = \frac{R_x T_5}{V_4}\left(W_{comp}(t) - W_{egr2}(t)\right) \tag{1d}$$
$$\dot{\omega}_{tc}(t) = \frac{P_t(t) - P_c(t)}{J \omega_{tc}} \tag{1e}$$
$$\dot{P}_c(t) = \frac{1}{\tau_c}\left(P_t(t) - P_c(t)\right) \tag{1f}$$

where $R\,[\mathrm{J\,kg^{-1}\,K^{-1}}]$ is the specific gas constant of air, $T\,[\mathrm{K}]$ is the air temperature and $V\,[\mathrm{m^3}]$ the manifold volume. The mass flows are denoted by $W\,[\mathrm{kg/s}]$ and the powers by $P\,[\mathrm{W}]$. The observed states are marked with a red arrow in Figure 1. [1] For a further description of the model and the calculation of the mass flows, the reader is referred to [1].
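To illustrate the structure of the pressure dynamics (1a)-(1d), they can be integrated with a simple forward-Euler loop. The mass-flow functions and all numerical constants below are placeholder assumptions for illustration only, not the maps and values from [1]:

```python
import numpy as np

# Placeholder constants (NOT the values from [1]): R*T/V lumped into one gain per volume.
K_I, K_X, K_EGR, K_4 = 1.0e5, 1.2e5, 2.0e5, 1.5e5
DT = 1e-3  # integration step [s]

def mass_flows(p):
    """Toy mass-flow model: flows proportional to pressures or pressure
    differences. The real flows in [1] come from valve and compressor maps."""
    p_i, p_x, p_egr, p_4 = p
    w_ci   = 0.05                        # compressor inlet flow [kg/s]
    w_ie   = 0.02 * p_i / 1e5            # intake -> engine
    w_f    = 0.002                       # fuel flow
    w_exh  = 0.03 * p_x / 1e5            # exhaust flow
    w_egr1 = 0.01 * (p_x - p_egr) / 1e5
    w_comp = 0.01 * (p_egr - p_4) / 1e5
    w_vgt  = 0.005 * p_egr / 1e5
    w_egr2 = 0.01 * (p_4 - p_i) / 1e5
    return w_ci, w_ie, w_f, w_exh, w_egr1, w_comp, w_vgt, w_egr2

def step(p):
    """One Euler step of the pressure dynamics (1a)-(1d)."""
    w_ci, w_ie, w_f, w_exh, w_egr1, w_comp, w_vgt, w_egr2 = mass_flows(p)
    dp = np.array([
        K_I   * (w_ci + w_egr2 - w_ie),           # (1a)
        K_X   * (w_ie + w_f - w_exh - w_egr1),    # (1b)
        K_EGR * (w_egr1 - w_comp - w_vgt),        # (1c)
        K_4   * (w_comp - w_egr2),                # (1d)
    ])
    return p + DT * dp

p = np.array([1.0e5, 1.1e5, 1.05e5, 1.0e5])  # initial pressures [Pa]
for _ in range(1000):
    p = step(p)
```

The point of the sketch is only the bookkeeping: each control volume integrates the net mass flow into it, scaled by its lumped $RT/V$ gain.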

3 Estimators

There are a few reasons for estimating the states that are not marked with a red arrow in equations (1a)-(1f). The first, and probably the most important, concerns the locations where sensors would have to be placed in order to measure these states. The environment in these locations is not particularly friendly, with rather high temperatures and particulate matter in the exhaust gas that would quickly wear the sensors out. This would be both expensive and troublesome. Another reason for estimating these states is to lower the initial cost of the engine, provided, of course, that the estimated states prove satisfactory for feedback control and the effect on engine performance is negligible. The implemented estimators are briefly discussed in the following subsections.

3.1 Particle filter

Particle Filters (PFs) are a set of techniques for implementing a recursive Bayesian estimator through the use of Monte Carlo sampling. Unobserved state variables are estimated by representing their posterior densities with a set of random particles, each of which is assigned a weight. The weight of each particle directly reflects the probability of the particle being sampled with replacement from the probability density function (pdf). This can be seen from Figure 2.

[Figure 2: Visualisation of the Particle Filter [3]]

At the beginning of each time iteration all particles have equal weight. During each iteration, every particle is propagated through the system model and, based on the observations,

is assigned a weight. The particles with higher weight (represented by bigger black dots in Figure 2) have a higher probability of being sampled. Figure 2 shows the middle particle, with the highest weight, being copied three times, whilst other particles are copied twice, once or not at all. The purpose of sampling is to eliminate particles with low weights and to focus on particles with high weights. Following the sampling step, a probability density function is created from the particle values, and its mode (the most common value) is taken as the estimate of the unobserved state. The iteration is then shifted one step forwards and the same steps are repeated. For a Particle Filter to function, a model of the system must exist that relates the observable state variables to the unobservable (hidden) state variables that are to be estimated.

3.1.1 The Degeneracy Problem

Executing the algorithm in the aforementioned way leads to a problem known as the Degeneracy Problem: a situation in which, over the time iterations, all but one particle are assigned an importance weight that is negligible (close to zero). A. Doucet (1998) [2] has proved that the variance of the importance weights can only increase with time, and therefore the degeneracy problem cannot be avoided completely. One method to reduce the effect of degeneracy is to introduce a resampling step into the algorithm, which is called whenever severe degeneracy is observed. A suitable way to measure the degeneracy of the particle filter is the Effective Sample Size (ESS), which is calculated from the normalised distribution of the particle weights. The ESS is compared to a threshold value: if the ESS is larger than the threshold, the particles are simply carried over as they are; a small ESS indicates severe degeneracy, and if the ESS is below the threshold, resampling is called. A simplified pseudocode of the Particle Filter algorithm, including the resampling step, is shown in Algorithm 1. The number of particles is denoted by N_p and the number of states by N_s.
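The ESS computation and the CDF-based resampling step can be sketched directly in Python; the weight values below are illustrative and independent of the engine model:

```python
import numpy as np

def effective_sample_size(weights):
    """ESS = 1 / sum(w_i^2) for normalised weights."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w ** 2)

def resample(particles, weights, rng):
    """Multinomial resampling via the CDF: for each slot, draw u ~ U(0,1)
    and copy the first particle whose cumulative weight exceeds u."""
    w = np.asarray(weights, dtype=float)
    cdf = np.cumsum(w / w.sum())
    idx = np.searchsorted(cdf, rng.uniform(size=len(particles)))
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

rng = np.random.default_rng(0)
particles = np.linspace(-1.0, 1.0, 8)
weights = np.array([0.9, 0.02, 0.02, 0.01, 0.01, 0.02, 0.01, 0.01])

if effective_sample_size(weights) < len(particles) / 2:   # severe degeneracy
    particles, weights = resample(particles, weights, rng)
```

With one dominant weight the ESS collapses towards 1, the threshold test fires, and after resampling the cloud consists mostly of copies of the heavy particle with the weights reset to 1/N_p.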

Algorithm 1 Particle Filter pseudocode
1: Initialize particles: x_1^i ~ unif(a, b), for i = 1, ..., N_p
2: Assign initial weights: w_{1|0}^i = 1/N_p
3: for k = 1, ..., time_max do
4:   for i = 1, ..., N_p do
5:     Propagate: x_{k+1}^i ~ f(x_{k+1} | x_k^i), where f is the system model
6:     Assign weight: w_{k+1|k}^i = w_{k|k}^i p(y_{k+1} | x_{k+1}^i), where p is a likelihood function
7:   end for
8:   for j = 1, ..., N_s do
9:     Normalize weights: w_{k+1|k}^j = w_{k+1|k}^j / Σ w_{k+1|k}^j
10:    Calculate the Effective Sample Size: ESS = 1 / Σ (w_{k+1|k}^j)^2
11:    if ESS < N_p/2 then
12:      Resample according to Algorithm 2
13:    end if
14:  end for
15: end for

Algorithm 2 Particle Filter resampling pseudocode
1: Initialize the cumulative distribution function (CDF): c_0 = 0
2: for i = 1, ..., N_p do
3:   Construct the CDF: c_i = c_{i-1} + w_k^i
4:   Draw a random number: l ~ unif(0, 1)
5:   Find the first index for which l < c_i: ind = find(l < c_i, 1)
6:   Copy the particle corresponding to this index for the next iteration: x_{k+1}^i = x_k^{ind}
7: end for

3.2 Moving Horizon Estimator (MHE)

3.2.1 MHE problem

Moving Horizon Estimation (MHE) is a model-based state estimation method that optimises the current state estimate using the N past measurements. The MHE concept is presented in Figure 3. MHE has been shown to be superior to traditional estimation approaches such as the extended Kalman filter, and the method also has guaranteed stability properties. [5] For nonlinear models, like the one in this work, MHE requires solving a nonlinear optimisation problem. As the horizon length N increases, the number of variables to be optimised increases as well, and the curse of dimensionality makes the computational cost of solving the problem grow dramatically. For instance, with a moderate horizon length of 10 and six states, the problem must be solved in a 60-dimensional space (10 × 6). Because of this, and because the built-in nonlinear solver in Matlab is only moderately good, a different approach had to be used.

In this project, a Newton-based nonlinear programming (Newton-based NLP) method is utilised to avoid the curse of dimensionality. The method is presented in [6]. When using MHE, it is assumed that the past measurement sequence {y(k−N), ..., y(k)} is

known, and that a perfect nonlinear model of the system is available:

$$x_{l+1|k} = f(x_{l|k}, w_{l|k}), \quad l = 0, \ldots, N-1, \qquad y_{l|k} = h(x_{l|k}) \tag{2}$$

where $x_{l|k} \in X$ and $w_{l|k} \in W$.

[Figure 3: Illustration of Moving Horizon Estimation]

From here on, for simplicity, we use the notations $x_l \equiv x_{l|k}$, $w_l \equiv w_{l|k}$, $y_l \equiv y(k-N+l)$, $h_l \equiv h(x_{l|k})$ and $f_l \equiv f(x_{l|k}, w_{l|k})$. The objective in MHE is to find the disturbance sequence $\{w_0, \ldots, w_{N-1}\}$ and the initial state $x_0$ that solve the nonlinear programming problem

$$\min_{x_0, w_l} \Phi(\eta(k)) = \Gamma + L_N + \sum_{l=0}^{N-1} L_l \quad \text{s.t.} \quad x_{l+1} = f(x_l, w_l),\ l = 0, \ldots, N-1, \quad x_l \in X,\ w_l \in W \tag{3}$$

with

$$\Gamma = \Gamma(x_0) = (x_0 - \bar{x}_0(k))^T \Pi_0^{-1} (x_0 - \bar{x}_0(k))$$
$$L_l = L_l(x_l, w_l) = (y(k-N+l) - h(x_l))^T R_l^{-1} (y(k-N+l) - h(x_l)) + w_l^T Q_l^{-1} w_l$$
$$L_N = L_N(x_N) = (y(k) - h(x_N))^T R_N^{-1} (y(k) - h(x_N)) \tag{4}$$

and

$$\eta(k) = \{\bar{x}_0(k), \Pi_0^{-1}(k), y(k-N), \ldots, y(k)\} \tag{5}$$

Here $\bar{x}_0(k)$ is the a priori state estimate and $\Pi_0(k)$ its associated covariance. The matrices $R$, $Q$ and $\Pi_0$ are user-defined symmetric positive definite tuning matrices.
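To make problem (3) concrete, the sketch below solves it for a toy scalar system with a generic NLP solver (scipy's `minimize`). The system, horizon length and tuning values are hypothetical stand-ins for the engine model; the model constraint is eliminated by substitution, so the decision variables are exactly $x_0$ and the disturbance sequence:

```python
import numpy as np
from scipy.optimize import minimize

# Toy scalar system: x_{l+1} = 0.9 x_l + w_l, y_l = x_l + v_l.
N = 8
rng = np.random.default_rng(1)
x_true = np.empty(N + 1)
x_true[0] = 2.0
for l in range(N):
    x_true[l + 1] = 0.9 * x_true[l] + 0.05 * rng.standard_normal()
y = x_true + 0.1 * rng.standard_normal(N + 1)   # noisy measurements

x0_prior, Pi0, R, Q = 0.0, 1.0, 0.1 ** 2, 0.05 ** 2

def cost(z):
    """Phi = Gamma + sum L_l + L_N, with the states reconstructed from (x0, w)."""
    x0, w = z[0], z[1:]
    phi = (x0 - x0_prior) ** 2 / Pi0                 # arrival cost Gamma
    x = x0
    for l in range(N):
        phi += (y[l] - x) ** 2 / R + w[l] ** 2 / Q   # stage cost L_l
        x = 0.9 * x + w[l]                           # model constraint, substituted
    phi += (y[N] - x) ** 2 / R                       # terminal cost L_N
    return phi

sol = minimize(cost, np.zeros(N + 1), method="BFGS")
x_hat0 = sol.x[0]   # estimated initial state of the window
```

Because the toy dynamics are linear, this cost is quadratic and the solver recovers an initial-state estimate close to the true value of 2.0; for the six-state engine model the same formulation yields the 60-dimensional problem mentioned above.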

3.2.2 Solution to the MHE problem

The Lagrange function associated with the NLP problem (3) is (the $\lambda_l$ are Lagrange multipliers):

$$\mathcal{L} = \Gamma + L_N + \sum_{l=0}^{N-1} \left( L_l + \lambda_{l+1}^T (x_{l+1} - f_l) \right) \tag{6}$$

The optimal solution of this cost function $\mathcal{L}$ must satisfy the first-order Karush-Kuhn-Tucker (KKT) conditions (the gradients of the Lagrange function must be zero):

$$\nabla_{x_0} \mathcal{L} = \nabla_{x_0} \Gamma + \nabla_{x_0} L_0 - \nabla_{x_0} f_0^T \lambda_1 = 0 \tag{7}$$
$$\nabla_{x_l} \mathcal{L} = \nabla_{x_l} L_l + \lambda_l - \nabla_{x_l} f_l^T \lambda_{l+1} = 0, \quad l = 1, \ldots, N-1 \tag{8}$$
$$\nabla_{w_l} \mathcal{L} = \nabla_{w_l} L_l - \nabla_{w_l} f_l^T \lambda_{l+1} = 0, \quad l = 1, \ldots, N-1 \tag{9}$$
$$\nabla_{\lambda_{l+1}} \mathcal{L} = x_{l+1} - f_l = 0, \quad l = 0, \ldots, N-1 \tag{10}$$
$$\nabla_{x_N} \mathcal{L} = \nabla_{x_N} L_N + \lambda_N = 0 \tag{11}$$

Here $\nabla_{(\cdot)}$ denotes the gradient with respect to $(\cdot)$. Let us also define $A_l = \nabla_{x_l} f_l$, $G_l = \nabla_{w_l} f_l$ and $C_l = \nabla_{x_l} h_l$. For nonlinear problems these conditions are themselves nonlinear and hard to solve. In the Newton-based NLP method, the KKT conditions are linearised around an arbitrary point (as close to the optimal solution as possible), and the resulting linear problem admits a (rather) simple recursion. Let us denote

$$r_{x_l} = \nabla_{x_l} \mathcal{L}, \quad r_{w_l} = \nabla_{w_l} \mathcal{L}, \quad r_{\lambda_l} = \nabla_{\lambda_l} \mathcal{L}$$
$$P_l = \nabla_{x_l x_l} \mathcal{L}, \quad W_l = \nabla_{w_l w_l} \mathcal{L}, \quad F_l = \nabla_{x_l w_l} \mathcal{L}$$

evaluated at some arbitrary point $(\bar{x}_l, \bar{w}_l, \bar{\lambda}_l)$, where $\nabla_{AB} = \nabla_A \nabla_B^T$. Then the updated values $x_{l,new} = \bar{x}_l + \Delta x_l$ ($l = 0, \ldots, N$), $w_{l,new} = \bar{w}_l + \Delta w_l$ ($l = 0, \ldots, N-1$) and $\lambda_{l,new} = \bar{\lambda}_l + \Delta \lambda_l$ ($l = 1, \ldots, N$) can be solved with the following recursion:

Algorithm 3 MHE recursion
1: Π_0 ← (P_0 − F_0 W_0^{−1} F_0^T)^{−1}
2: for l = 0, ..., N−1 do
3:   M_{l+1} ← (G_l W_l^{−1} F_l^T − A_l)^T Π_l (F_l W_l^{−1} G_l^T − A_l) + G_l W_l^{−1} G_l^T
4:   Π_{l+1} ← (P_{l+1} + M_{l+1}^{−1} − F_{l+1} W_{l+1}^{−1} F_{l+1}^T)^{−1}
5: end for
6: Π_N ← (P_N + M_N^{−1})^{−1}
7: r_{M_1} ← r_{λ_1} + G_0 W_0^{−1} r_{w_0} − (G_0 W_0^{−1} F_0^T − A_0) Π_0 (r_{x_0} − F_0 W_0^{−1} r_{w_0})
8: for l = 1, ..., N−1 do
9:   r_{M_{l+1}} ← r_{λ_{l+1}} + G_l W_l^{−1} r_{w_l} − (G_l W_l^{−1} F_l^T − A_l) Π_l (r_{x_l} − F_l W_l^{−1} r_{w_l})
10: end for
11: Δx_N ← −Π_N (r_{x_N} + M_N^{−1} r_{M_N})
12: for l = N, ..., 1 do
13:   Δλ_l ← M_l^{−1} (Δx_l + r_{M_l})
14:   Δx_{l−1} ← Π_{l−1} (F_{l−1} W_{l−1}^{−1} G_{l−1}^T − A_{l−1}^T) Δλ_l + Π_{l−1} (F_{l−1} W_{l−1}^{−1} r_{w_{l−1}} − r_{x_{l−1}} − M_{l−1}^{−1} r_{M_{l−1}})
15:   Δw_{l−1} ← −W_{l−1}^{−1} F_{l−1}^T Δx_{l−1} + W_{l−1}^{−1} G_{l−1}^T Δλ_l − W_{l−1}^{−1} r_{w_{l−1}}
16: end for

The recursion above solves only one window of N time steps, whereas the algorithm should produce estimates for all time_max steps. This is obtained with the following algorithm:

Algorithm 4 A fast Moving Horizon Estimator algorithm
1: Initialize the first N state estimates x̂(0), ..., x̂(N−1), the N−1 noise estimates ŵ(0), ..., ŵ(N−2) and the N−1 Lagrange multiplier estimates λ̂(1), ..., λ̂(N−1)
2: for k = N, ..., time_max do
3:   Initialize the linearisation points [x̄_0, ..., x̄_{N−1}, x̄_N] ← [x̂(k−N), ..., x̂(k−1), f(x̂(k−1), 0)], [w̄_0, ..., w̄_{N−2}, w̄_{N−1}] ← [ŵ(k−N), ..., ŵ(k−2), 0] and [λ̄_1, ..., λ̄_{N−1}, λ̄_N] ← [λ̂(k−N+1), ..., λ̂(k−1), 0]
4:   Compute the gradients A_l, G_l, C_l, P_l, W_l, F_l, r_{w_l}, r_{λ_l} and r_{x_l} at the linearisation points
5:   Compute [x_{0,new}, ..., x_{N,new}], [w_{0,new}, ..., w_{N−1,new}] and [λ_{1,new}, ..., λ_{N,new}] using Algorithm 3
6:   Update Π_0 using the extended Kalman filter state covariance update formula and the already calculated gradients
7:   Set x̂(k) ← x_{N,new}, ŵ(k−1) ← w_{N−1,new} and λ̂(k) ← λ_{N,new}
8:   Check that the constraints are fulfilled; if not, adjust the states accordingly
9: end for

3.2.3 Including control to a fast Moving Horizon Estimator

The combustion engine model also contains a control input, so the real model is of the form

$$x_{l+1|k} = f(x_{l|k}, u(l|k), w_{l|k}), \quad l = 0, \ldots, N-1, \qquad y_{l|k} = h(x_{l|k}) \tag{12}$$

The control, however, is not included in the algorithm presented in [6]. The control can be used, and the result made more accurate, by extending the noise vector: $w_{aug} = \begin{bmatrix} w \\ u \end{bmatrix}$. This also affects the gradients and naturally increases the computational cost of the algorithm. Moreover, because the control values are accurate (contrary to the random noise), the corresponding entries of the tuning matrix Q must be made small, and the estimated values $w_{aug,new}$ must be constrained to be equal to the actual control sequence.
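The augmentation can be illustrated with NumPy: the control enters as extra "noise" components whose entries in Q are made tiny, which makes any deviation from the known input sequence very expensive for the optimiser. The dimensions and numerical values below are hypothetical:

```python
import numpy as np

n_w, n_u = 6, 2                        # hypothetical: 6 noise states, 2 controls
Q_w = 1e-2 * np.eye(n_w)               # tuning covariance for the process noise
eps = 1e-8                             # tiny variance for the control entries

# Augmented "noise" vector w_aug = [w; u] and block-diagonal tuning matrix.
Q_aug = np.block([
    [Q_w,                  np.zeros((n_w, n_u))],
    [np.zeros((n_u, n_w)), eps * np.eye(n_u)],
])

u = np.array([0.5, -0.3])              # known control values at this stage
w_aug = np.concatenate([0.1 * np.ones(n_w), u])
ref   = np.concatenate([np.zeros(n_w), u])   # control part must equal u

# Stage penalty (w_aug - ref)^T Q_aug^{-1} (w_aug - ref): deviating the control
# components away from u would cost ~1/eps, so they stay pinned to the actual
# control sequence while the genuine noise components remain free.
penalty = (w_aug - ref) @ np.linalg.inv(Q_aug) @ (w_aug - ref)
```

Here only the six noise components contribute to the penalty; the control block contributes zero because it matches the known input exactly.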

4 Results

The results for the estimated states are shown in the following subsections for the Particle Filter, the Moving Horizon Estimator and the Unscented Kalman Filter, respectively. The real values of the states are shown in blue and the estimated values in red. Additionally, in the particle filter plots, the paths of the individual particles are shown in yellow, and the mean value of these paths, which is one way to generate the state estimates, is shown in green.

4.1 Particle Filter

The estimates for each state, shown in red in Figures 5 to 12, are obtained by taking the mode of the probability density created from the values of all particles at each time step. The number of particles used for the plots is 40. As can be seen from these figures, this method for obtaining the estimate and the aforementioned method based on the mean of the particle paths give quite similar results for this particular problem. The mean method is computationally less demanding; however, as illustrated in Figure 4, it can lead to false estimates in the case of multimodal distributions.

[Figure 4: Illustration of the kernel density estimation method for multimodal distributions]

Figure 4 shows that if the particle paths form a multimodal distribution, the mean of these paths can be misleading and produce a bad estimate. Thus, even though the mode is computationally more expensive, it yields more accurate results in such situations. All of Figures 5 to 12 are based on 120 s of simulation time, except for Figure 7, which shows the simulation of the first 10 s for the state p_x. The purpose of this is to show the behaviour

of the filter more closely. It can be seen that the estimate of each state converges to the true state value within 5 s to 10 s from the start of the simulation. Overall, the results for the estimation of each state are satisfactory when 40 particles are used. Increasing the number of particles leads to better estimates that follow the true state more closely and converge faster, as can be seen by comparing Figure 7 to Figure 8; in the latter the number of particles is doubled.

[Figure 5: Estimation of state p_i]

[Figure 6: Estimation of state p_x]

[Figure 7: Estimation of state p_x with 40 particles, zoomed]

[Figure 8: Estimation of state p_x with 80 particles, zoomed]

[Figure 9: Estimation of state p_egr]

[Figure 10: Estimation of state p_4]

[Figure 11: Estimation of state ω_tc]

[Figure 12: Estimation of state P_c]

Figure 13 shows the true EGR rate (blue) and the estimated EGR rate (red), which is based on the estimated values of the system's states.

[Figure 13: The EGR rate based on the estimated states]
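The mean-versus-mode distinction discussed at the start of this section can be reproduced with a small sketch. The bimodal particle cloud below is synthetic, and scipy's Gaussian KDE stands in for the kernel density estimate of Figure 4:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Bimodal particle cloud: most particles near 0, a minority near 5.
particles = np.concatenate([rng.normal(0.0, 0.2, 70),
                            rng.normal(5.0, 0.2, 30)])

# The mean lands between the two modes, where there are no particles at all.
mean_estimate = particles.mean()

# The mode estimate: evaluate a kernel density estimate on a grid and take
# the location of its maximum, which sits on the dominant mode.
kde = gaussian_kde(particles)
grid = np.linspace(particles.min(), particles.max(), 1000)
mode_estimate = grid[np.argmax(kde(grid))]
```

Here the mean ends up around 1.5, in the empty gap between the modes, whereas the KDE mode stays near 0, on the dominant cluster, which is exactly the failure of the mean method that Figure 4 illustrates.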

4.2 Moving Horizon Estimator

The estimates for each state, shown in red in Figures 14 to 20, are obtained using the algorithm presented in Section 3.2 with a horizon length of N = 10 and a simulation time of 120 s. As for the PF, Figure 16 shows the simulation of the first 10 s for the state p_x, for the same purpose as previously. Even though a horizon length of 10 was used to produce the results, the figures would be very similar with other horizon lengths as well. The tuning cost matrices Q and R were the same as the real state and measurement covariance matrices, with the change that the state covariance matrix was augmented with a diagonal matrix with small entries to accommodate the controls: $Q_{aug} = \begin{bmatrix} \mathrm{diag}(w) & 0 \\ 0 & \epsilon I \end{bmatrix}$, where $\epsilon$ is small. The initialisation of the first N steps was done by linearly increasing the horizon length by one for each of the first N steps.

The effect of increasing the horizon length is quite similar to that of increasing the number of particles. There is, however, one important difference to note: unlike in the PF, all of the computed states of the MHE affect each other, and at some point the accumulation of computational errors and errors caused by the linearisation surpasses the benefit gained from more information. For this particular model, a good horizon length seems to be surprisingly small.

[Figure 14: Estimation of state p_i]

Figure 21 shows the true EGR rate (blue) and the estimated EGR rate (red), which is based on the estimated values of the system's states.

[Figure 15: Estimation of state p_x]

From the results it can be seen that the estimated states converge to the real values in about 5 s to 10 s.

[Figure 16: Estimation of state p_x, zoomed]

[Figure 17: Estimation of state p_egr]

[Figure 18: Estimation of state p_4]

[Figure 19: Estimation of state ω_tc]

[Figure 20: Estimation of state P_c]

[Figure 21: The EGR rate based on the estimated states]

4.3 Unscented Kalman Filter

The estimates for each state, shown in red in Figures 22 to 28, are obtained using the UKF implemented in [1] with a simulation time of 120 s. As before, Figure 24 shows the simulation of the first 10 s for the state p_x, for the same purpose as previously.

[Figure 22: Estimation of state p_i]

Figure 29 shows the true EGR rate (blue) and the estimated EGR rate (red), which is based on the estimated values of the system's states. It can be seen that the estimate of each state converges to the true state value within 3 s to 5 s from the start of the simulation.
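The UKF used here is the one implemented in [1]; as a reminder of the idea underlying it, the unscented transform (sigma points propagated through a nonlinearity, in the basic formulation of Julier and Uhlmann [7]) can be sketched generically as follows. The map and numbers below are illustrative:

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    """Propagate a mean and covariance through f using 2n+1 sigma points."""
    n = len(mean)
    S = np.linalg.cholesky((n + kappa) * cov)      # matrix square root
    sigma = np.vstack([mean, mean + S.T, mean - S.T])   # 2n+1 sigma points
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    y = np.array([f(s) for s in sigma])            # propagate each point
    y_mean = w @ y
    y_cov = (w[:, None] * (y - y_mean)).T @ (y - y_mean)
    return y_mean, y_cov

# Sanity check: for a linear map the transform is exact (y = A x).
A = np.array([[1.0, 0.5], [0.0, 2.0]])
m, P = np.array([1.0, 2.0]), np.diag([0.04, 0.09])
ym, yP = unscented_transform(m, P, lambda x: A @ x)
```

The UKF replaces the EKF's Jacobians with this deterministic sampling, which is why no derivatives of the engine model are needed.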

[Figure 23: Estimation of state p_x]

[Figure 24: Estimation of state p_x, zoomed]

[Figure 25: Estimation of state p_egr]

[Figure 26: Estimation of state p_4]

[Figure 27: Estimation of state ω_tc]

[Figure 28: Estimation of state P_c]

[Figure 29: The EGR rate based on the estimated states]

5 Robustness testing

Parameter uncertainties of the form $T_x \to T_x + \Delta T_x$ were introduced to check the robustness of the filters. These uncertainties were created for the temperatures T_i, T_x, T_3 and T_5, for the effective areas A_egr2 and A_vgt_max, and for the compressor and engine volumetric efficiencies η_c and η_v, respectively. These parameters were selected because they would carry the biggest uncertainty if the measurements were produced by a real engine.

5.1 Particle Filter

For the particle filter, each of the previously mentioned parameters was changed one at a time, so that ΔK, where K is any of the aforementioned parameters, was −10%, −5%, +5% or +10% of K. The root-mean-square errors (RMSE) of the estimated states relative to the true states were calculated for the original parameter values as well as for the perturbed values, and compared to each other. A test was also conducted in which all the parameters were given an uncertainty of 10% at the same time. Overall, the particle filter performed well in each of these scenarios. Some of the parameters, such as the temperatures T_i and T_x, had less impact on the RMSE values when uncertainties were present, whilst others, such as the effective area A_vgt_max and the compressor efficiency η_c, had a greater impact. Despite these variations, the particle filter estimates remained stable and followed the true state values closely.

5.2 Moving Horizon Estimator

Similar parameter changes as for the PF were made for the MHE, except that the simultaneous 10% change in all parameters was not tested. The reason is that even a 5% change in any single parameter increased the RMSE of the non-measurable states considerably. The estimator did not, however, diverge no matter how big the uncertainty was.
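The robustness test described above amounts to re-running an estimator with a perturbed model parameter and comparing the resulting RMSEs. Its core can be sketched generically; the scalar plant and the naive one-step-ahead predictor below are illustrative stand-ins for the engine model and the actual estimators:

```python
import numpy as np

def simulate(a, x0=1.0, n=200, seed=0):
    """Toy scalar plant x_{k+1} = a*x_k + process noise."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = x0
    for k in range(n - 1):
        x[k + 1] = a * x[k] + 0.01 * rng.standard_normal()
    return x

def rmse(x_hat, x_true):
    return float(np.sqrt(np.mean((x_hat - x_true) ** 2)))

a_true = 0.95
x_true = simulate(a_true)

# Run the "estimator" once with the correct model parameter and once with a
# +10% perturbed one, and record the RMSE of each run.
results = {}
for delta in (0.0, 0.10):
    a_model = a_true * (1 + delta)
    x_hat = np.concatenate(([x_true[0]], a_model * x_true[:-1]))
    results[delta] = rmse(x_hat, x_true)
```

With the correct parameter the RMSE is at the noise floor; with the perturbed parameter a systematic bias appears and the RMSE grows, which is the pattern reported for the MHE and UKF above.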
The reason for the large RMSE is that the uncertainty causes a bias in all non-measurable states, which makes the RMSE large. Especially for the states p_egr and p_4, even a small uncertainty made the estimates useless. Surprisingly, the errors were approximately the same for all parameters that were changed. As explained in [1], the function W_comp seen in equation (1d) is calculated from a lookup table, and the functions W_egr1, W_egr2, W_exh and W_vgt are not continuous. The Newton-based NLP MHE algorithm, however, requires that $\mathcal{L}$ is differentiable [6], which does not hold in this particular case. Because W_comp is calculated from a lookup table, its derivative can only be approximated with a finite-difference form of the definition of the derivative, $\frac{df(x)}{dx} \approx \frac{f(x+\Delta x) - f(x)}{\Delta x}$. Moreover, since some of the functions are not continuous, they have no derivative at certain points.

5.3 Unscented Kalman Filter

Similar parameter changes as for the PF were made for the UKF. The root-mean-square errors of the estimated states relative to the true states were calculated for the original parameter values as well as for the perturbed values, and compared to each other. Overall, the UKF performed poorly in every scenario. The results were similar to those of the MHE, but worse: the algorithm even diverged when any parameter was changed by 10%. With a 5% uncertainty, the RMSE was on average about 200% worse than with the perfect model.

6 Comparison of the estimators

All the estimators perform at least moderately well in estimating all the states. Naturally, the estimation of the measured states p_i and ω_tc is excellent for all methods. However, the PF smooths the random deviation of these states a little too much, and therefore the MHE and the UKF perform slightly better in estimating them. The other four states, p_x, p_egr, p_4 and P_c, are also estimated well by all the estimators. However, the PF and the UKF seem to smooth the random deviation of the states more than the MHE, so the gradient of their estimates is not as large, even though the RMSEs of the estimates are approximately the same. There also seem to be some problems for all filters in following the state P_c: the MHE loses track of it at the very beginning but soon finds the correct state, whereas the PF and the UKF perform well at the beginning but after that not as well as the MHE. The EGR rate is estimated best by the UKF, as it seems to filter out the noise of the states better than the PF and the MHE; the RMSEs of all three filters are, however, approximately the same. The UKF is also the fastest of the three estimators.
It can be seen that the estimates of the states converge to the true values within 3 s to 5 s from the start of the simulation for the UKF, and within 5 s to 10 s for the PF and the MHE. The UKF is also the fastest of the algorithms: the simulation took it 60 s on a 2.8 GHz quad-core i7 processor with 16 GB of DDR3 memory. The same simulation took the PF 600 s and the MHE 2900 s. The PF performed best in all the robustness tests, and even an uncertainty of 10% does not reduce its performance drastically. On the contrary, the MHE and the UKF are not robust at all.

The tests were performed with Gaussian noise, and it seems that the UKF is in this case the best option as an estimator, because it performs as well as the PF in all the tests. However, if the noise were not Gaussian, or if there were uncertainty in the model, it would be preferable to use the PF, as it makes no assumptions about the noise and is also very robust. The MHE should not be used in this particular problem at all.

7 Time management

We worked on the project together every week at the same time, doing different things but helping each other whenever needed. By doing so, we could keep track of the hours spent more efficiently. Since the beginning of the project, we have managed to work together twice a week, on Mondays and Wednesdays, so that the weekly workload of 6.7 hours has been fulfilled. An updated graphical representation of the actualised working hours can be found in Figure 30. The hours spent on the project correspond to 5 ECTS for both team members. A Gantt chart of the project can be found in Figure 31.

[Figure 30: Planned and actual hours]

[Figure 31: Gantt chart for the project]

8 Risk management

Since the project work is a simulation done on a computer, problems could only have arisen from there. The risks estimated earlier in the project are listed in Table 1.

Table 1: Most probable risks that could occur during the project, and their management.

Risk | Probability of occurrence | Risk management
Implementing benchmark models takes too much time | 100% | The risk occurred and we did not manage it properly; implementing the benchmark problems took too much time
Other school assignments and work | 50% | Strictly following and properly planning the schedule
Programming-related problems | 40% | Properly understanding the topics; not being too shy to ask the course staff for help; programming together
Problems with MHE | 15% | Making an easier implementation that has lower performance
Sickness/injuries | 5% | Keeping track of each other's work and keeping the files accessible to the other group member
Losing project-related data in case of hardware breakdown | 5% | Making backups in the cloud
Insufficient material | 0% | Risk did not occur

The only risk that occurred was the first one: the benchmark models first meant to be used for testing the implemented filters were too hard and had to be swapped for easier ones. The percentages in the table have not changed since the midterm report.

9 Team work management

Team work and co-operation with the supervisor worked really well. The common working hours and regular meetings with the supervisor helped to maintain good group work throughout the project. Nothing negative can be said about the team work.

10 Conclusions and future work

Both the particle filter and the moving horizon estimator worked well for the combustion engine model in the case where the model from which the measurements were created is exactly the same as the model used by the estimators. If this is not the case, the moving horizon estimator works only as a smoother for the measured states. The particle filter, on the contrary, is very robust and works well even when the model parameters are not accurate. The computation of the PF also takes much less time than that of the MHE. Therefore, for this particular problem, the particle filter is the better choice overall of the two implemented estimators. However, the UKF outperforms the PF and the MHE in computation speed and performs equally well in estimation accuracy, but it is not as robust as the PF.

As future work it would be interesting to investigate the performance of the augmented Newton-based NLP method on a more suitable model (an accurate model derived from first principles); scientific research of this kind could not be found during the project work. It would also be interesting to compare the PF with different probability distributions for the particles.

References

[1] Samokhin, S., Zenger, K., "High-pressure recirculated exhaust gas fraction identification and control in marine diesel engines."
[2] Doucet, A., "On sequential Monte Carlo methods for Bayesian filtering," Dept. of Engineering, University of Cambridge, UK, Tech. Rep., 1998.
[3] Pfeiffer, M., "A brief introduction to particle filters," 2004.
[4] Rawlings, J. B., Mayne, D. Q., Model Predictive Control: Theory and Design, p. 38.
[5] Rao, C. V., Rawlings, J. B., Mayne, D. Q., "Constrained state estimation for nonlinear discrete-time systems: stability and moving horizon approximations," IEEE Transactions on Automatic Control, 2003, 48(2).
[6] Zavala, V. M., Laird, C. D., Biegler, L. T., "A fast moving horizon estimation algorithm based on nonlinear programming sensitivity," Journal of Process Control, 2008, 18.
[7] Julier, S. J., Uhlmann, J. K., "Unscented filtering and nonlinear estimation," Proceedings of the IEEE, 2004, 92(3).


More information

RAO-BLACKWELLISED PARTICLE FILTERS: EXAMPLES OF APPLICATIONS

RAO-BLACKWELLISED PARTICLE FILTERS: EXAMPLES OF APPLICATIONS RAO-BLACKWELLISED PARTICLE FILTERS: EXAMPLES OF APPLICATIONS Frédéric Mustière e-mail: mustiere@site.uottawa.ca Miodrag Bolić e-mail: mbolic@site.uottawa.ca Martin Bouchard e-mail: bouchard@site.uottawa.ca

More information

A Study of Covariances within Basic and Extended Kalman Filters

A Study of Covariances within Basic and Extended Kalman Filters A Study of Covariances within Basic and Extended Kalman Filters David Wheeler Kyle Ingersoll December 2, 2013 Abstract This paper explores the role of covariance in the context of Kalman filters. The underlying

More information

AUTOMOTIVE ENVIRONMENT SENSORS

AUTOMOTIVE ENVIRONMENT SENSORS AUTOMOTIVE ENVIRONMENT SENSORS Lecture 5. Localization BME KÖZLEKEDÉSMÉRNÖKI ÉS JÁRMŰMÉRNÖKI KAR 32708-2/2017/INTFIN SZÁMÚ EMMI ÁLTAL TÁMOGATOTT TANANYAG Related concepts Concepts related to vehicles moving

More information

State Estimation of Linear and Nonlinear Dynamic Systems

State Estimation of Linear and Nonlinear Dynamic Systems State Estimation of Linear and Nonlinear Dynamic Systems Part III: Nonlinear Systems: Extended Kalman Filter (EKF) and Unscented Kalman Filter (UKF) James B. Rawlings and Fernando V. Lima Department of

More information

MMSE-Based Filtering for Linear and Nonlinear Systems in the Presence of Non-Gaussian System and Measurement Noise

MMSE-Based Filtering for Linear and Nonlinear Systems in the Presence of Non-Gaussian System and Measurement Noise MMSE-Based Filtering for Linear and Nonlinear Systems in the Presence of Non-Gaussian System and Measurement Noise I. Bilik 1 and J. Tabrikian 2 1 Dept. of Electrical and Computer Engineering, University

More information

An introduction to particle filters

An introduction to particle filters An introduction to particle filters Andreas Svensson Department of Information Technology Uppsala University June 10, 2014 June 10, 2014, 1 / 16 Andreas Svensson - An introduction to particle filters Outline

More information

Dirac Mixture Density Approximation Based on Minimization of the Weighted Cramér von Mises Distance

Dirac Mixture Density Approximation Based on Minimization of the Weighted Cramér von Mises Distance Dirac Mixture Density Approximation Based on Minimization of the Weighted Cramér von Mises Distance Oliver C Schrempf, Dietrich Brunn, and Uwe D Hanebeck Abstract This paper proposes a systematic procedure

More information

EKF and SLAM. McGill COMP 765 Sept 18 th, 2017

EKF and SLAM. McGill COMP 765 Sept 18 th, 2017 EKF and SLAM McGill COMP 765 Sept 18 th, 2017 Outline News and information Instructions for paper presentations Continue on Kalman filter: EKF and extension to mapping Example of a real mapping system:

More information

2D Image Processing (Extended) Kalman and particle filter

2D Image Processing (Extended) Kalman and particle filter 2D Image Processing (Extended) Kalman and particle filter Prof. Didier Stricker Dr. Gabriele Bleser Kaiserlautern University http://ags.cs.uni-kl.de/ DFKI Deutsches Forschungszentrum für Künstliche Intelligenz

More information

Autonomous Mobile Robot Design

Autonomous Mobile Robot Design Autonomous Mobile Robot Design Topic: Extended Kalman Filter Dr. Kostas Alexis (CSE) These slides relied on the lectures from C. Stachniss, J. Sturm and the book Probabilistic Robotics from Thurn et al.

More information

in a Rao-Blackwellised Unscented Kalman Filter

in a Rao-Blackwellised Unscented Kalman Filter A Rao-Blacwellised Unscented Kalman Filter Mar Briers QinetiQ Ltd. Malvern Technology Centre Malvern, UK. m.briers@signal.qinetiq.com Simon R. Masell QinetiQ Ltd. Malvern Technology Centre Malvern, UK.

More information

Dynamic System Identification using HDMR-Bayesian Technique

Dynamic System Identification using HDMR-Bayesian Technique Dynamic System Identification using HDMR-Bayesian Technique *Shereena O A 1) and Dr. B N Rao 2) 1), 2) Department of Civil Engineering, IIT Madras, Chennai 600036, Tamil Nadu, India 1) ce14d020@smail.iitm.ac.in

More information

State Estimation for Nonlinear Systems using Restricted Genetic Optimization

State Estimation for Nonlinear Systems using Restricted Genetic Optimization State Estimation for Nonlinear Systems using Restricted Genetic Optimization Santiago Garrido, Luis Moreno, and Carlos Balaguer Universidad Carlos III de Madrid, Leganés 28911, Madrid (Spain) Abstract.

More information

State-Space Methods for Inferring Spike Trains from Calcium Imaging

State-Space Methods for Inferring Spike Trains from Calcium Imaging State-Space Methods for Inferring Spike Trains from Calcium Imaging Joshua Vogelstein Johns Hopkins April 23, 2009 Joshua Vogelstein (Johns Hopkins) State-Space Calcium Imaging April 23, 2009 1 / 78 Outline

More information

Gaussian Mixtures Proposal Density in Particle Filter for Track-Before-Detect

Gaussian Mixtures Proposal Density in Particle Filter for Track-Before-Detect 12th International Conference on Information Fusion Seattle, WA, USA, July 6-9, 29 Gaussian Mixtures Proposal Density in Particle Filter for Trac-Before-Detect Ondřej Straa, Miroslav Šimandl and Jindřich

More information

Introduction to Mobile Robotics Bayes Filter Particle Filter and Monte Carlo Localization

Introduction to Mobile Robotics Bayes Filter Particle Filter and Monte Carlo Localization Introduction to Mobile Robotics Bayes Filter Particle Filter and Monte Carlo Localization Wolfram Burgard, Cyrill Stachniss, Maren Bennewitz, Kai Arras 1 Motivation Recall: Discrete filter Discretize the

More information

Week 3: Linear Regression

Week 3: Linear Regression Week 3: Linear Regression Instructor: Sergey Levine Recap In the previous lecture we saw how linear regression can solve the following problem: given a dataset D = {(x, y ),..., (x N, y N )}, learn to

More information

EKF, UKF. Pieter Abbeel UC Berkeley EECS. Many slides adapted from Thrun, Burgard and Fox, Probabilistic Robotics

EKF, UKF. Pieter Abbeel UC Berkeley EECS. Many slides adapted from Thrun, Burgard and Fox, Probabilistic Robotics EKF, UKF Pieter Abbeel UC Berkeley EECS Many slides adapted from Thrun, Burgard and Fox, Probabilistic Robotics Kalman Filter Kalman Filter = special case of a Bayes filter with dynamics model and sensory

More information

EKF, UKF. Pieter Abbeel UC Berkeley EECS. Many slides adapted from Thrun, Burgard and Fox, Probabilistic Robotics

EKF, UKF. Pieter Abbeel UC Berkeley EECS. Many slides adapted from Thrun, Burgard and Fox, Probabilistic Robotics EKF, UKF Pieter Abbeel UC Berkeley EECS Many slides adapted from Thrun, Burgard and Fox, Probabilistic Robotics Kalman Filter Kalman Filter = special case of a Bayes filter with dynamics model and sensory

More information

Inferring biological dynamics Iterated filtering (IF)

Inferring biological dynamics Iterated filtering (IF) Inferring biological dynamics 101 3. Iterated filtering (IF) IF originated in 2006 [6]. For plug-and-play likelihood-based inference on POMP models, there are not many alternatives. Directly estimating

More information

Lecture 2: From Linear Regression to Kalman Filter and Beyond

Lecture 2: From Linear Regression to Kalman Filter and Beyond Lecture 2: From Linear Regression to Kalman Filter and Beyond January 18, 2017 Contents 1 Batch and Recursive Estimation 2 Towards Bayesian Filtering 3 Kalman Filter and Bayesian Filtering and Smoothing

More information

ECE276A: Sensing & Estimation in Robotics Lecture 10: Gaussian Mixture and Particle Filtering

ECE276A: Sensing & Estimation in Robotics Lecture 10: Gaussian Mixture and Particle Filtering ECE276A: Sensing & Estimation in Robotics Lecture 10: Gaussian Mixture and Particle Filtering Lecturer: Nikolay Atanasov: natanasov@ucsd.edu Teaching Assistants: Siwei Guo: s9guo@eng.ucsd.edu Anwesan Pal:

More information

Course on Model Predictive Control Part II Linear MPC design

Course on Model Predictive Control Part II Linear MPC design Course on Model Predictive Control Part II Linear MPC design Gabriele Pannocchia Department of Chemical Engineering, University of Pisa, Italy Email: g.pannocchia@diccism.unipi.it Facoltà di Ingegneria,

More information

Nonlinear Identification of Backlash in Robot Transmissions

Nonlinear Identification of Backlash in Robot Transmissions Nonlinear Identification of Backlash in Robot Transmissions G. Hovland, S. Hanssen, S. Moberg, T. Brogårdh, S. Gunnarsson, M. Isaksson ABB Corporate Research, Control Systems Group, Switzerland ABB Automation

More information

RECURSIVE OUTLIER-ROBUST FILTERING AND SMOOTHING FOR NONLINEAR SYSTEMS USING THE MULTIVARIATE STUDENT-T DISTRIBUTION

RECURSIVE OUTLIER-ROBUST FILTERING AND SMOOTHING FOR NONLINEAR SYSTEMS USING THE MULTIVARIATE STUDENT-T DISTRIBUTION 1 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING, SEPT. 3 6, 1, SANTANDER, SPAIN RECURSIVE OUTLIER-ROBUST FILTERING AND SMOOTHING FOR NONLINEAR SYSTEMS USING THE MULTIVARIATE STUDENT-T

More information

A nonlinear filtering tool for analysis of hot-loop test campaings

A nonlinear filtering tool for analysis of hot-loop test campaings A nonlinear filtering tool for analysis of hot-loop test campaings Enso Ikonen* Jenő Kovács*, ** * Systems Engineering Laboratory, Department of Process and Environmental Engineering, University of Oulu,

More information

Sequential Monte Carlo Methods for Bayesian Computation

Sequential Monte Carlo Methods for Bayesian Computation Sequential Monte Carlo Methods for Bayesian Computation A. Doucet Kyoto Sept. 2012 A. Doucet (MLSS Sept. 2012) Sept. 2012 1 / 136 Motivating Example 1: Generic Bayesian Model Let X be a vector parameter

More information

Moving Horizon Filter for Monotonic Trends

Moving Horizon Filter for Monotonic Trends 43rd IEEE Conference on Decision and Control December 4-7, 2004 Atlantis, Paradise Island, Bahamas ThA.3 Moving Horizon Filter for Monotonic Trends Sikandar Samar Stanford University Dimitry Gorinevsky

More information

Exercises Tutorial at ICASSP 2016 Learning Nonlinear Dynamical Models Using Particle Filters

Exercises Tutorial at ICASSP 2016 Learning Nonlinear Dynamical Models Using Particle Filters Exercises Tutorial at ICASSP 216 Learning Nonlinear Dynamical Models Using Particle Filters Andreas Svensson, Johan Dahlin and Thomas B. Schön March 18, 216 Good luck! 1 [Bootstrap particle filter for

More information

Combined Particle and Smooth Variable Structure Filtering for Nonlinear Estimation Problems

Combined Particle and Smooth Variable Structure Filtering for Nonlinear Estimation Problems 14th International Conference on Information Fusion Chicago, Illinois, USA, July 5-8, 2011 Combined Particle and Smooth Variable Structure Filtering for Nonlinear Estimation Problems S. Andrew Gadsden

More information

PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 13: SEQUENTIAL DATA

PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 13: SEQUENTIAL DATA PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 13: SEQUENTIAL DATA Contents in latter part Linear Dynamical Systems What is different from HMM? Kalman filter Its strength and limitation Particle Filter

More information

Nonlinear and/or Non-normal Filtering. Jesús Fernández-Villaverde University of Pennsylvania

Nonlinear and/or Non-normal Filtering. Jesús Fernández-Villaverde University of Pennsylvania Nonlinear and/or Non-normal Filtering Jesús Fernández-Villaverde University of Pennsylvania 1 Motivation Nonlinear and/or non-gaussian filtering, smoothing, and forecasting (NLGF) problems are pervasive

More information

Tracking an Accelerated Target with a Nonlinear Constant Heading Model

Tracking an Accelerated Target with a Nonlinear Constant Heading Model Tracking an Accelerated Target with a Nonlinear Constant Heading Model Rong Yang, Gee Wah Ng DSO National Laboratories 20 Science Park Drive Singapore 118230 yrong@dsoorgsg ngeewah@dsoorgsg Abstract This

More information

SEQUENTIAL MONTE CARLO METHODS WITH APPLICATIONS TO COMMUNICATION CHANNELS. A Thesis SIRISH BODDIKURAPATI

SEQUENTIAL MONTE CARLO METHODS WITH APPLICATIONS TO COMMUNICATION CHANNELS. A Thesis SIRISH BODDIKURAPATI SEQUENTIAL MONTE CARLO METHODS WITH APPLICATIONS TO COMMUNICATION CHANNELS A Thesis by SIRISH BODDIKURAPATI Submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of

More information

Particle Filters. Outline

Particle Filters. Outline Particle Filters M. Sami Fadali Professor of EE University of Nevada Outline Monte Carlo integration. Particle filter. Importance sampling. Degeneracy Resampling Example. 1 2 Monte Carlo Integration Numerical

More information

Nonlinear Model Predictive Control Tools (NMPC Tools)

Nonlinear Model Predictive Control Tools (NMPC Tools) Nonlinear Model Predictive Control Tools (NMPC Tools) Rishi Amrit, James B. Rawlings April 5, 2008 1 Formulation We consider a control system composed of three parts([2]). Estimator Target calculator Regulator

More information

Mark Gales October y (x) x 1. x 2 y (x) Inputs. Outputs. x d. y (x) Second Output layer layer. layer.

Mark Gales October y (x) x 1. x 2 y (x) Inputs. Outputs. x d. y (x) Second Output layer layer. layer. University of Cambridge Engineering Part IIB & EIST Part II Paper I0: Advanced Pattern Processing Handouts 4 & 5: Multi-Layer Perceptron: Introduction and Training x y (x) Inputs x 2 y (x) 2 Outputs x

More information

Nonlinear Estimation Techniques for Impact Point Prediction of Ballistic Targets

Nonlinear Estimation Techniques for Impact Point Prediction of Ballistic Targets Nonlinear Estimation Techniques for Impact Point Prediction of Ballistic Targets J. Clayton Kerce a, George C. Brown a, and David F. Hardiman b a Georgia Tech Research Institute, Georgia Institute of Technology,

More information

Dual Estimation and the Unscented Transformation

Dual Estimation and the Unscented Transformation Dual Estimation and the Unscented Transformation Eric A. Wan ericwan@ece.ogi.edu Rudolph van der Merwe rudmerwe@ece.ogi.edu Alex T. Nelson atnelson@ece.ogi.edu Oregon Graduate Institute of Science & Technology

More information

1 Introduction ISSN

1 Introduction ISSN Techset Composition Ltd, Salisbury Doc: {IEE}CTA/Articles/Pagination/CTA58454.3d www.ietdl.org Published in IET Control Theory and Applications Received on 15th January 2009 Revised on 18th May 2009 ISSN

More information

DESIGN AND IMPLEMENTATION OF SENSORLESS SPEED CONTROL FOR INDUCTION MOTOR DRIVE USING AN OPTIMIZED EXTENDED KALMAN FILTER

DESIGN AND IMPLEMENTATION OF SENSORLESS SPEED CONTROL FOR INDUCTION MOTOR DRIVE USING AN OPTIMIZED EXTENDED KALMAN FILTER INTERNATIONAL JOURNAL OF ELECTRONICS AND COMMUNICATION ENGINEERING & TECHNOLOGY (IJECET) International Journal of Electronics and Communication Engineering & Technology (IJECET), ISSN 0976 ISSN 0976 6464(Print)

More information

Advanced Computational Methods in Statistics: Lecture 5 Sequential Monte Carlo/Particle Filtering

Advanced Computational Methods in Statistics: Lecture 5 Sequential Monte Carlo/Particle Filtering Advanced Computational Methods in Statistics: Lecture 5 Sequential Monte Carlo/Particle Filtering Axel Gandy Department of Mathematics Imperial College London http://www2.imperial.ac.uk/~agandy London

More information

Nonlinear State Estimation! Particle, Sigma-Points Filters!

Nonlinear State Estimation! Particle, Sigma-Points Filters! Nonlinear State Estimation! Particle, Sigma-Points Filters! Robert Stengel! Optimal Control and Estimation, MAE 546! Princeton University, 2017!! Particle filter!! Sigma-Points Unscented Kalman ) filter!!

More information

Engineering Part IIB: Module 4F10 Statistical Pattern Processing Lecture 5: Single Layer Perceptrons & Estimating Linear Classifiers

Engineering Part IIB: Module 4F10 Statistical Pattern Processing Lecture 5: Single Layer Perceptrons & Estimating Linear Classifiers Engineering Part IIB: Module 4F0 Statistical Pattern Processing Lecture 5: Single Layer Perceptrons & Estimating Linear Classifiers Phil Woodland: pcw@eng.cam.ac.uk Michaelmas 202 Engineering Part IIB:

More information

On a Data Assimilation Method coupling Kalman Filtering, MCRE Concept and PGD Model Reduction for Real-Time Updating of Structural Mechanics Model

On a Data Assimilation Method coupling Kalman Filtering, MCRE Concept and PGD Model Reduction for Real-Time Updating of Structural Mechanics Model On a Data Assimilation Method coupling, MCRE Concept and PGD Model Reduction for Real-Time Updating of Structural Mechanics Model 2016 SIAM Conference on Uncertainty Quantification Basile Marchand 1, Ludovic

More information

Efficient Monitoring for Planetary Rovers

Efficient Monitoring for Planetary Rovers International Symposium on Artificial Intelligence and Robotics in Space (isairas), May, 2003 Efficient Monitoring for Planetary Rovers Vandi Verma vandi@ri.cmu.edu Geoff Gordon ggordon@cs.cmu.edu Carnegie

More information

Gaussian Process for Internal Model Control

Gaussian Process for Internal Model Control Gaussian Process for Internal Model Control Gregor Gregorčič and Gordon Lightbody Department of Electrical Engineering University College Cork IRELAND E mail: gregorg@rennesuccie Abstract To improve transparency

More information

A Comparison of the EKF, SPKF, and the Bayes Filter for Landmark-Based Localization

A Comparison of the EKF, SPKF, and the Bayes Filter for Landmark-Based Localization A Comparison of the EKF, SPKF, and the Bayes Filter for Landmark-Based Localization and Timothy D. Barfoot CRV 2 Outline Background Objective Experimental Setup Results Discussion Conclusion 2 Outline

More information

System analysis of a diesel engine with VGT and EGR

System analysis of a diesel engine with VGT and EGR System analysis of a diesel engine with VGT and EGR Master s thesis performed in Vehicular Systems by Thomas Johansson Reg nr: LiTH-ISY-EX- -5/3714- -SE 9th October 25 System analysis of a diesel engine

More information

Expectation Propagation in Dynamical Systems

Expectation Propagation in Dynamical Systems Expectation Propagation in Dynamical Systems Marc Peter Deisenroth Joint Work with Shakir Mohamed (UBC) August 10, 2012 Marc Deisenroth (TU Darmstadt) EP in Dynamical Systems 1 Motivation Figure : Complex

More information

State Estimation of Linear and Nonlinear Dynamic Systems

State Estimation of Linear and Nonlinear Dynamic Systems State Estimation of Linear and Nonlinear Dynamic Systems Part IV: Nonlinear Systems: Moving Horizon Estimation (MHE) and Particle Filtering (PF) James B. Rawlings and Fernando V. Lima Department of Chemical

More information

Sensor Fusion: Particle Filter

Sensor Fusion: Particle Filter Sensor Fusion: Particle Filter By: Gordana Stojceska stojcesk@in.tum.de Outline Motivation Applications Fundamentals Tracking People Advantages and disadvantages Summary June 05 JASS '05, St.Petersburg,

More information

Week 5: Logistic Regression & Neural Networks

Week 5: Logistic Regression & Neural Networks Week 5: Logistic Regression & Neural Networks Instructor: Sergey Levine 1 Summary: Logistic Regression In the previous lecture, we covered logistic regression. To recap, logistic regression models and

More information

A STUDY ON THE STATE ESTIMATION OF NONLINEAR ELECTRIC CIRCUITS BY UNSCENTED KALMAN FILTER

A STUDY ON THE STATE ESTIMATION OF NONLINEAR ELECTRIC CIRCUITS BY UNSCENTED KALMAN FILTER A STUDY ON THE STATE ESTIMATION OF NONLINEAR ELECTRIC CIRCUITS BY UNSCENTED KALMAN FILTER Esra SAATCI Aydın AKAN 2 e-mail: esra.saatci@iku.edu.tr e-mail: akan@istanbul.edu.tr Department of Electronic Eng.,

More information

MODELLING ANALYSIS & DESIGN OF DSP BASED NOVEL SPEED SENSORLESS VECTOR CONTROLLER FOR INDUCTION MOTOR DRIVE

MODELLING ANALYSIS & DESIGN OF DSP BASED NOVEL SPEED SENSORLESS VECTOR CONTROLLER FOR INDUCTION MOTOR DRIVE International Journal of Advanced Research in Engineering and Technology (IJARET) Volume 6, Issue 3, March, 2015, pp. 70-81, Article ID: IJARET_06_03_008 Available online at http://www.iaeme.com/ijaret/issues.asp?jtypeijaret&vtype=6&itype=3

More information

17 : Markov Chain Monte Carlo

17 : Markov Chain Monte Carlo 10-708: Probabilistic Graphical Models, Spring 2015 17 : Markov Chain Monte Carlo Lecturer: Eric P. Xing Scribes: Heran Lin, Bin Deng, Yun Huang 1 Review of Monte Carlo Methods 1.1 Overview Monte Carlo

More information

Kalman filtering and friends: Inference in time series models. Herke van Hoof slides mostly by Michael Rubinstein

Kalman filtering and friends: Inference in time series models. Herke van Hoof slides mostly by Michael Rubinstein Kalman filtering and friends: Inference in time series models Herke van Hoof slides mostly by Michael Rubinstein Problem overview Goal Estimate most probable state at time k using measurement up to time

More information

Computer Intensive Methods in Mathematical Statistics

Computer Intensive Methods in Mathematical Statistics Computer Intensive Methods in Mathematical Statistics Department of mathematics johawes@kth.se Lecture 7 Sequential Monte Carlo methods III 7 April 2017 Computer Intensive Methods (1) Plan of today s lecture

More information

Lecture 7: Optimal Smoothing

Lecture 7: Optimal Smoothing Department of Biomedical Engineering and Computational Science Aalto University March 17, 2011 Contents 1 What is Optimal Smoothing? 2 Bayesian Optimal Smoothing Equations 3 Rauch-Tung-Striebel Smoother

More information

Sequential Monte Carlo and Particle Filtering. Frank Wood Gatsby, November 2007

Sequential Monte Carlo and Particle Filtering. Frank Wood Gatsby, November 2007 Sequential Monte Carlo and Particle Filtering Frank Wood Gatsby, November 2007 Importance Sampling Recall: Let s say that we want to compute some expectation (integral) E p [f] = p(x)f(x)dx and we remember

More information

Robotics. Lecture 4: Probabilistic Robotics. See course website for up to date information.

Robotics. Lecture 4: Probabilistic Robotics. See course website   for up to date information. Robotics Lecture 4: Probabilistic Robotics See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information. Andrew Davison Department of Computing Imperial College London Review: Sensors

More information

Lecture 2: From Linear Regression to Kalman Filter and Beyond

Lecture 2: From Linear Regression to Kalman Filter and Beyond Lecture 2: From Linear Regression to Kalman Filter and Beyond Department of Biomedical Engineering and Computational Science Aalto University January 26, 2012 Contents 1 Batch and Recursive Estimation

More information

Density Approximation Based on Dirac Mixtures with Regard to Nonlinear Estimation and Filtering

Density Approximation Based on Dirac Mixtures with Regard to Nonlinear Estimation and Filtering Density Approximation Based on Dirac Mixtures with Regard to Nonlinear Estimation and Filtering Oliver C. Schrempf, Dietrich Brunn, Uwe D. Hanebeck Intelligent Sensor-Actuator-Systems Laboratory Institute

More information

SLAM Techniques and Algorithms. Jack Collier. Canada. Recherche et développement pour la défense Canada. Defence Research and Development Canada

SLAM Techniques and Algorithms. Jack Collier. Canada. Recherche et développement pour la défense Canada. Defence Research and Development Canada SLAM Techniques and Algorithms Jack Collier Defence Research and Development Canada Recherche et développement pour la défense Canada Canada Goals What will we learn Gain an appreciation for what SLAM

More information

Failure Prognostics with Missing Data Using Extended Kalman Filter

Failure Prognostics with Missing Data Using Extended Kalman Filter Failure Prognostics with Missing Data Using Extended Kalman Filter Wlamir Olivares Loesch Vianna 1, and Takashi Yoneyama 2 1 EMBRAER S.A., São José dos Campos, São Paulo, 12227 901, Brazil wlamir.vianna@embraer.com.br

More information

DUAL REGULARIZED TOTAL LEAST SQUARES SOLUTION FROM TWO-PARAMETER TRUST-REGION ALGORITHM. Geunseop Lee

DUAL REGULARIZED TOTAL LEAST SQUARES SOLUTION FROM TWO-PARAMETER TRUST-REGION ALGORITHM. Geunseop Lee J. Korean Math. Soc. 0 (0), No. 0, pp. 1 0 https://doi.org/10.4134/jkms.j160152 pissn: 0304-9914 / eissn: 2234-3008 DUAL REGULARIZED TOTAL LEAST SQUARES SOLUTION FROM TWO-PARAMETER TRUST-REGION ALGORITHM

More information

3.4 Linear Least-Squares Filter

3.4 Linear Least-Squares Filter X(n) = [x(1), x(2),..., x(n)] T 1 3.4 Linear Least-Squares Filter Two characteristics of linear least-squares filter: 1. The filter is built around a single linear neuron. 2. The cost function is the sum

More information

Machine Learning for OR & FE

Machine Learning for OR & FE Machine Learning for OR & FE Hidden Markov Models Martin Haugh Department of Industrial Engineering and Operations Research Columbia University Email: martin.b.haugh@gmail.com Additional References: David

More information

The Kalman Filter ImPr Talk

The Kalman Filter ImPr Talk The Kalman Filter ImPr Talk Ged Ridgway Centre for Medical Image Computing November, 2006 Outline What is the Kalman Filter? State Space Models Kalman Filter Overview Bayesian Updating of Estimates Kalman

More information

(Extended) Kalman Filter

(Extended) Kalman Filter (Extended) Kalman Filter Brian Hunt 7 June 2013 Goals of Data Assimilation (DA) Estimate the state of a system based on both current and all past observations of the system, using a model for the system

More information

Data assimilation with and without a model

Data assimilation with and without a model Data assimilation with and without a model Tim Sauer George Mason University Parameter estimation and UQ U. Pittsburgh Mar. 5, 2017 Partially supported by NSF Most of this work is due to: Tyrus Berry,

More information

Lagrangian Data Assimilation and Manifold Detection for a Point-Vortex Model. David Darmon, AMSC Kayo Ide, AOSC, IPST, CSCAMM, ESSIC

Lagrangian Data Assimilation and Manifold Detection for a Point-Vortex Model. David Darmon, AMSC Kayo Ide, AOSC, IPST, CSCAMM, ESSIC Lagrangian Data Assimilation and Manifold Detection for a Point-Vortex Model David Darmon, AMSC Kayo Ide, AOSC, IPST, CSCAMM, ESSIC Background Data Assimilation Iterative process Forecast Analysis Background

More information

Numerical Fitting-based Likelihood Calculation to Speed up the Particle Filter

Numerical Fitting-based Likelihood Calculation to Speed up the Particle Filter Numerical Fitting-based Likelihood Calculation to Speed up the Particle Filter Tiancheng Li, Shudong Sun, Juan M. Corchado, Tariq P. Sattar and Shubin Si T. Li is with the with the research group of Bioinformatics,

More information

A Gaussian Mixture Motion Model and Contact Fusion Applied to the Metron Data Set

A Gaussian Mixture Motion Model and Contact Fusion Applied to the Metron Data Set 1th International Conference on Information Fusion Chicago, Illinois, USA, July 5-8, 211 A Gaussian Mixture Motion Model and Contact Fusion Applied to the Metron Data Set Kathrin Wilkens 1,2 1 Institute

More information

Pattern Recognition and Machine Learning

Pattern Recognition and Machine Learning Christopher M. Bishop Pattern Recognition and Machine Learning ÖSpri inger Contents Preface Mathematical notation Contents vii xi xiii 1 Introduction 1 1.1 Example: Polynomial Curve Fitting 4 1.2 Probability

More information

Moving Horizon Estimation with Dynamic Programming

Moving Horizon Estimation with Dynamic Programming Cleveland State University EngagedScholarship@CSU ETD Archive 2013 Moving Horizon Estimation with Dynamic Programming Mohan Kumar Ramalingam Cleveland State University How does access to this work benefit

More information
