Optimal input design for nonlinear dynamical systems: a graph-theory approach


1 Optimal input design for nonlinear dynamical systems: a graph-theory approach. Patricio E. Valenzuela, Department of Automatic Control and ACCESS Linnaeus Centre, KTH Royal Institute of Technology, Stockholm, Sweden. Seminar, Uppsala University, January 16, 2015.

2 System identification

[Block diagram: input u_t and disturbance v_t enter the System, which produces the output y_t.]

System identification: modeling based on input-output data. Basic entities: data set, model structure, identification method.

3 System identification

The maximum likelihood method:

θ̂_{n_seq} = arg max_{θ∈Θ} ℓ_θ(y_{1:n_seq}), where ℓ_θ(y_{1:n_seq}) := log p_θ(y_{1:n_seq}).

As n_seq → ∞, we have θ̂_{n_seq} → θ_0 and

√n_seq (θ̂_{n_seq} − θ_0) → Normal with zero mean and covariance {I_F^e}^{−1}.

4 System identification

√n_seq (θ̂_{n_seq} − θ_0) → Normal with zero mean and covariance {I_F^e}^{−1}, where

I_F^e := E{ ∇_θ ℓ_θ(y_{1:n_seq})|_{θ=θ_0} ∇_θ ℓ_θ(y_{1:n_seq})^⊤|_{θ=θ_0} | u_{1:n_seq} }.

Different u_{1:n_seq}'s give different I_F^e's: the covariance matrix of √n_seq (θ̂_{n_seq} − θ_0) is affected by u_{1:n_seq}!

5 Input design for dynamic systems

Input design: maximize the information obtained from an experiment. Existing methods focus on linear systems. Recent developments for nonlinear systems (Hjalmarsson 2007, Larsson 2010, Gopaluni 2011, De Cock-Gevers-Schoukens 2013, Forgione-Bombois-Van den Hof-Hjalmarsson 2014).

6 Input design for dynamic systems

Challenges for nonlinear systems: problem complexity (usually non-convex), model restrictions, input restrictions. How can we overcome these limitations?

7 Summary

1. We present a method for input design for dynamic systems.
2. The method is also suitable for nonlinear systems.

8 Outline

Problem formulation for output-error models
Input design based on graph theory
Extension to nonlinear SSM
Closed-loop application oriented input design
Conclusions and future work

9 Problem formulation for output-error models · Input design based on graph theory · Extension to nonlinear SSM · Closed-loop application oriented input design · Conclusions and future work

10 Problem formulation for output-error models

[Block diagram: input u_t and noise e_t enter G_0, which produces the output y_t.]

G_0 is a system, where:
- e_t: white noise (variance λ_e)
- u_t: input
- y_t: system output

Goal: design u_{1:n_seq} := (u_1, ..., u_{n_seq}) as a realization of a stationary process maximizing I_F.

11 Problem formulation for output-error models

Here,

I_F := (1/λ_e) E{ Σ_{t=1}^{n_seq} ψ_t^{θ_0}(u_t) ψ_t^{θ_0}(u_t)^⊤ }
     = (1/λ_e) ∫ Σ_{t=1}^{n_seq} ψ_t^{θ_0}(u_t) ψ_t^{θ_0}(u_t)^⊤ dP(u_{1:n_seq}),

where ψ_t^{θ_0}(u_t) := ∇_θ ŷ_t(u_t)|_{θ=θ_0} and ŷ_t(u_t) := G(u_t; θ).

Design u_{1:n_seq} ∈ R^{n_seq} ⇒ design P(u_{1:n_seq}) ∈ P.

12 Problem formulation for output-error models

Here,

I_F := (1/λ_e) E{ Σ_{t=1}^{n_seq} ψ_t^{θ_0}(u_t) ψ_t^{θ_0}(u_t)^⊤ }
     = (1/λ_e) ∫ Σ_{t=1}^{n_seq} ψ_t^{θ_0}(u_t) ψ_t^{θ_0}(u_t)^⊤ dP(u_{1:n_seq}).

Assumption: u_t ∈ C (C a finite set). Then

I_F = (1/λ_e) Σ_{u_{1:n_seq} ∈ C^{n_seq}} Σ_{t=1}^{n_seq} ψ_t^{θ_0}(u_t) ψ_t^{θ_0}(u_t)^⊤ p(u_{1:n_seq}).

13 Problem formulation for output-error models

Characterizing p(u_{1:n_seq}) ∈ P_C: p nonnegative, Σ_x p(x) = 1, p is shift invariant.

14 Problem formulation for output-error models

Problem: design u^opt_{1:n_seq} ∈ C^{n_seq} as a realization from p^opt(u_{1:n_seq}), where

p^opt(u_{1:n_seq}) := arg max_{p ∈ P_C} h(I_F(p)),

h: R^{n_θ×n_θ} → R is a matrix concave function, and

I_F(p) = (1/λ_e) Σ_{u_{1:n_seq} ∈ C^{n_seq}} Σ_{t=1}^{n_seq} ψ_t^{θ_0}(u_t) ψ_t^{θ_0}(u_t)^⊤ p(u_{1:n_seq}).

15 Problem formulation for output-error models · Input design based on graph theory · Extension to nonlinear SSM · Closed-loop application oriented input design · Conclusions and future work

16 Input design problem

Problem: design u^opt_{1:n_seq} ∈ C^{n_seq} as a realization from p^opt(u_{1:n_seq}), where

p^opt(u_{1:n_seq}) := arg max_{p ∈ P_C} h(I_F(p)),

h: R^{n_θ×n_θ} → R is a matrix concave function, and

I_F(p) = (1/λ_e) Σ_{u_{1:n_seq} ∈ C^{n_seq}} Σ_{t=1}^{n_seq} ψ_t^{θ_0}(u_t) ψ_t^{θ_0}(u_t)^⊤ p(u_{1:n_seq}).

Issues:
1. I_F(p) requires a sum of n_seq-dimensional terms (n_seq large).
2. How can we represent an element in P_C?

17 Input design problem

Solving the issues:
1. I_F(p) requires a sum of n_seq-dimensional terms (n_seq large).

Assumption: u_{1:n_seq} is a realization of a stationary process with memory n_m (n_m < n_seq). Then I_F(p) only requires a sum of n_m-dimensional terms. The minimum n_m is related to the memory of the system.

18 Input design problem

Solving the issues:
2. How can we represent an element in P_C?

P_C is a polyhedron. Let V_{P_C} be the set of extreme points of P_C. Every element of P_C can be described as a convex combination of elements of V_{P_C}. The elements in V_{P_C} can be found by using graph theory!

19 Graph theory in input design

Example: de Bruijn graph, C := {0, 1}, n_m := 2.

[Graph: four nodes (u_{t−1}, u_t) ∈ {(0, 0), (0, 1), (1, 0), (1, 1)} with the de Bruijn transitions.]

Elements in V_{P_C} ↔ prime cycles in G_{C^{n_m}} ↔ elementary cycles in G_{C^{n_m−1}}.

24 Graph theory in input design

Example: de Bruijn graph, C := {0, 1}, n_m := 2.

[Graph: four nodes (u_{t−1}, u_t) ∈ {(0, 0), (0, 1), (1, 0), (1, 1)} with the de Bruijn transitions.]

There are algorithms to find elementary cycles (Johnson 1975, Tarjan 1972).
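As a sketch of this step (not the deck's implementation): for small alphabets the elementary cycles of the de Bruijn graph G_{C^{n_m−1}} can be enumerated by brute-force depth-first search, with Johnson's algorithm as the scalable alternative. Function names are illustrative, and the construction below includes the self-loop edges of the de Bruijn graph.

```python
from itertools import product

def de_bruijn_edges(symbols, n_m):
    # Nodes are (n_m - 1)-tuples; an edge (v, w) exists when w is v shifted
    # left by one position with a new symbol appended.
    nodes = list(product(symbols, repeat=n_m - 1))
    edges = []
    for v in nodes:
        for s in symbols:
            edges.append((v, v[1:] + (s,)))
    return nodes, edges

def elementary_cycles(nodes, edges):
    # Brute-force enumeration (fine for tiny alphabets); Johnson (1975)
    # scales much better on larger graphs.
    succ = {v: [w for (a, w) in edges if a == v] for v in nodes}
    cycles = set()

    def dfs(path):
        for w in succ[path[-1]]:
            if w == path[0]:
                # Store a canonical rotation so each cycle is counted once.
                rots = [tuple(path[i:] + path[:i]) for i in range(len(path))]
                cycles.add(min(rots))
            elif w not in path:
                dfs(path + [w])

    for v in nodes:
        dfs([v])
    return sorted(cycles)
```

For C = {0, 1} and n_m = 2 this enumerates the elementary cycles of G_{C^{n_m−1}} (including the two self-loops), whose corresponding prime cycles in G_{C^{n_m}} generate the extreme points of P_C.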

25 Graph theory in input design

Once v_i ∈ V_{P_C} is known, the distribution for each v_i is known, and an input signal {u_t^i}_{t=0}^{N} can be drawn from v_i. Therefore,

I_F^(i) := (1/λ_e) Σ_{u_{1:n_m} ∈ C^{n_m}} Σ_{t=1}^{n_m} ψ_t^{θ_0}(u_t) ψ_t^{θ_0}(u_t)^⊤ v_i(u_{1:n_m})
        ≈ (1/(λ_e N)) Σ_{t=1}^{N} ψ_t^{θ_0}(u_t) ψ_t^{θ_0}(u_t)^⊤

for all v_i ∈ V_{P_C}.

26 Graph theory in input design

Therefore,

I_F^(i) := (1/λ_e) Σ_{u_{1:n_m} ∈ C^{n_m}} Σ_{t=1}^{n_m} ψ_t^{θ_0}(u_t) ψ_t^{θ_0}(u_t)^⊤ v_i(u_{1:n_m})
        ≈ (1/(λ_e N)) Σ_{t=1}^{N} ψ_t^{θ_0}(u_t) ψ_t^{θ_0}(u_t)^⊤

for all v_i ∈ V_{P_C}. The sum is approximated by Monte Carlo!
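The Monte Carlo step amounts to averaging outer products of the predictor gradients ψ_t along one long input realization drawn from v_i; a minimal numpy sketch (illustrative name, gradients assumed precomputed by simulating the model along that input):

```python
import numpy as np

def fisher_estimate(psi, lam_e=1.0):
    # psi: (N, n_theta) array whose row t is psi_t = d yhat_t / d theta at theta_0.
    # Monte-Carlo approximation I_F^(i) ≈ (1 / (lam_e * N)) * sum_t psi_t psi_t^T.
    N = psi.shape[0]
    return psi.T @ psi / (lam_e * N)
```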

27 Input design based on graph theory

To design an experiment in C^{n_m}:
1. Compute all the prime cycles of G_{C^{n_m}}.
2. Generate the input signals {u_t^i}_{t=0}^{N} from the prime cycles of G_{C^{n_m}}, for each i ∈ {1, ..., n_V}.
3. For each i ∈ {1, ..., n_V}, approximate I_F^(i) by using I_F^(i) ≈ (1/(λ_e N)) Σ_{t=1}^{N} ψ_t^{θ_0}(u_t) ψ_t^{θ_0}(u_t)^⊤.

28 Input design based on graph theory

To design an experiment in C^{n_m}:
4. Define γ := {α_1, ..., α_{n_V}} ∈ R^{n_V}. Solve

γ^opt := arg max_{γ ∈ R^{n_V}} h(I_F^app(γ))
subject to Σ_{i=1}^{n_V} α_i = 1, α_i ≥ 0 for all i ∈ {1, ..., n_V},

where I_F^app(γ) := Σ_{i=1}^{n_V} α_i I_F^(i).
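Step 4 is a concave maximization over the simplex, so any convex solver applies. As a dependency-free sketch (an assumption, not the deck's solver), mirror ascent (exponentiated gradient) works for h(·) = log det(·), using the gradient ∂/∂α_i log det(Σ_j α_j I_j) = tr(I_app^{−1} I_i):

```python
import numpy as np

def optimal_weights(infos, n_iter=2000, step=0.1):
    # Mirror ascent (exponentiated gradient) on the simplex for
    #   max_alpha log det( sum_i alpha_i * I_i ),  alpha_i >= 0, sum_i alpha_i = 1.
    # The objective is concave over the simplex; any off-the-shelf convex
    # solver would do as well, this just keeps the sketch numpy-only.
    infos = [np.asarray(I, dtype=float) for I in infos]
    alpha = np.full(len(infos), 1.0 / len(infos))
    for _ in range(n_iter):
        I_app = sum(a * I for a, I in zip(alpha, infos))
        inv = np.linalg.inv(I_app)
        grad = np.array([np.trace(inv @ I) for I in infos])
        alpha *= np.exp(step * (grad - grad.max()))  # subtract max for stability
        alpha /= alpha.sum()
    return alpha
```

With per-cycle matrices diag(2, 1), diag(1, 2) and the identity, the weights converge to roughly (1/2, 1/2, 0): the identity matrix is dominated and drops out of the optimal design.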

29 Input design based on graph theory

To design an experiment in C^{n_m}:
5. The optimal pmf p^opt is given by p^opt = Σ_{i=1}^{n_V} α_i^opt v_i.
6. Sample u_{1:n_seq} from p^opt using Markov chains.

I_F^app(γ) is linear in the decision variables ⇒ the problem is convex!
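Step 6 can be sketched as a Markov chain whose state is the last n_m − 1 inputs, with transition probabilities obtained by conditioning p^opt on that state (the same idea as the matrix A of Appendix III). A sketch with illustrative names; the restart rule for states with no continuation is an assumption:

```python
import numpy as np

def sample_input(p_opt, symbols, n_m, n_seq, rng=None):
    # p_opt maps n_m-tuples of symbols to probabilities (the optimal stationary pmf).
    # Draw u_{1:n_seq} from the Markov chain whose state is the last n_m - 1 inputs.
    rng = np.random.default_rng() if rng is None else rng
    keys = list(p_opt)
    probs = np.array([p_opt[k] for k in keys])
    start = keys[rng.choice(len(keys), p=probs / probs.sum())]
    u = list(start)
    while len(u) < n_seq:
        tail = tuple(u[-(n_m - 1):]) if n_m > 1 else ()
        w = np.array([p_opt.get(tail + (s,), 0.0) for s in symbols])
        if w.sum() == 0:
            # State has no continuation under p_opt; restart uniformly (assumption).
            u.append(symbols[rng.choice(len(symbols))])
            continue
        u.append(symbols[rng.choice(len(symbols), p=w / w.sum())])
    return u[:n_seq]
```

For the pmf placing weight 1/2 on each of (0, 1) and (1, 0), the chain reproduces the alternating sequence of Appendix II.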

30 Example I

[Block diagram: e_t and u_t enter G_0, which produces y_t.]

G(u_t; θ):  x_{t+1} = θ_1 x_t / (1 + x_t²) + u_t,  y_t = θ_2 x_t² + e_t,  x_1 = 0,

with θ = [θ_1 θ_2]^⊤, θ_0 = [0.8 2]^⊤. e_t: white noise, Gaussian, zero mean, variance λ_e = 1.

31 Example I

[Block diagram: e_t and u_t enter G_0, which produces y_t.]

G(u_t; θ):  x_{t+1} = θ_1 x_t / (1 + x_t²) + u_t,  y_t = θ_2 x_t² + e_t,  x_1 = 0,

with θ = [θ_1 θ_2]^⊤, θ_0 = [0.8 2]^⊤. We consider h(·) = log det(·), and n_seq = 10^4.

32 Example I

Results for G(u_t; θ):  x_{t+1} = θ_1 x_t / (1 + x_t²) + u_t,  y_t = θ_2 x_t² + e_t,  x_1 = 0.

[Table: h(I_F) = log{det(I_F)} for Cases 1–3 and a binary input.]

Case 1: n_m = 2, C = {−1, 0, 1}
Case 2: n_m = 1, C = {−1, −1/3, 1/3, 1}
Case 3: n_m = 1, C = {−1, −0.5, 0, 0.5, 1}

33 Example I

Results: [Figure: 95% confidence ellipsoids in the (θ̂_1, θ̂_2) plane. Red: Case 2; blue: binary input.]

34 Problem formulation for output-error models · Input design based on graph theory · Extension to nonlinear SSM · Closed-loop application oriented input design · Conclusions and future work

35 Extension to nonlinear SSM

Nonlinear state space model:

x_0 ∼ µ(x_0)
x_t | x_{t−1} ∼ f_θ(x_t | x_{t−1}, u_{t−1})
y_t | x_t ∼ g_θ(y_t | x_t, u_t)

where θ ∈ Θ.
- f_θ, g_θ, µ: pdfs
- x_t: states
- u_t: input
- y_t: system output

Goal: design u_{1:n_seq} := (u_1, ..., u_{n_seq}) as a realization of a stationary process maximizing I_F.

36 Extension to nonlinear SSM

Here,

I_F = E{ S(θ_0) S^⊤(θ_0) },  S(θ_0) = ∇_θ log p_θ(y_{1:n_seq} | u_{1:n_seq})|_{θ=θ_0}.

Design u_{1:n_seq} ∈ R^{n_seq} ⇒ design P(u_{1:n_seq}) ∈ P.

37 Extension to nonlinear SSM

Fisher's identity:

∇_θ log p_θ(y_{1:T} | u_{1:T}) = E{ ∇_θ log p_θ(x_{1:T}, y_{1:T} | u_{1:T}) | y_{1:T}, u_{1:T} }
∇_θ log p_θ(y_{1:T} | u_{1:T}) = Σ_{t=1}^{T} ∫_{X²} ξ_θ(x_{t−1:t}, u_t) p_θ(x_{t−1:t} | y_{1:T}) dx_{t−1:t}

with ξ_θ(x_{t−1:t}, u_t) = ∇_θ [ log f_θ(x_t | x_{t−1}, u_{t−1}) + log g_θ(y_t | x_t, u_t) ].

Assumption: u_t ∈ C (C a finite set).

38 Extension to nonlinear SSM

Recall P_C: p nonnegative, Σ_x p(x) = 1, p is shift invariant.

39 Extension to nonlinear SSM

Problem: design u^opt_{1:n_seq} ∈ C^{n_seq} as a realization from p^opt(u_{1:n_seq}), where

p^opt(u_{1:n_seq}) := arg max_{p ∈ P_C} h(I_F(p)),

h: R^{n_θ×n_θ} → R is a matrix concave function, and I_F(p) = E{ S(θ_0) S^⊤(θ_0) }.

40 Input design problem for nonlinear SSM

Problem: design u^opt_{1:n_seq} ∈ C^{n_seq} as a realization from p^opt(u_{1:n_seq}), where

p^opt(u_{1:n_seq}) := arg max_{p ∈ P_C} h(I_F(p)),

h: R^{n_θ×n_θ} → R is a matrix concave function, and I_F(p) = E{ S(θ_0) S^⊤(θ_0) }.

Issues:
1. How can we represent an element in P_C? Use the graph theory approach!
2. How can we compute I_F(p)?

41 Input design based on graph theory (revisited)

To design an experiment in C^{n_m}:
1. Compute all the prime cycles of G_{C^{n_m}}.
2. Generate the input signals {u_t^i}_{t=0}^{N} from the prime cycles of G_{C^{n_m}}, for each i ∈ {1, ..., n_V}.
3. For each i ∈ {1, ..., n_V}, approximate I_F^(i) := E_{v_i(u_{1:n_m})}{ S(θ_0) S^⊤(θ_0) } (new expression required!).

42 Estimating I_F

Approximate I_F as

Î_F := (1/M) Σ_{m=1}^{M} S_m(θ_0) S_m^⊤(θ_0).

Difficulty: S_m(θ_0) is not available. Solution: estimate S_m(θ_0) using particle methods!

43 Particle methods to estimate S_m(θ_0)

Goal: approximate {p_θ(x_{1:t} | y_{1:t})}_{t≥1}. {x_{1:t}^(i), w_t^(i)}_{i=1}^{N}: particle system. Approach: auxiliary particle filter + fixed-lag smoother.

44 Particle methods to estimate S_m(θ_0)

Estimate S_m(θ_0) as

Ŝ_m(θ_0) := Σ_{t=1}^{T} Σ_{i=1}^{N} w_{κ_t}^{(i)} ξ_{θ_0}(x_{t−1}^{a(i, κ_t, t−1)}, x_t^{a(i, κ_t, t)}, u_t),

where the superscripts denote the ancestor indices delivered by the fixed-lag smoother and

ξ_θ(x_{t−1:t}, u_t) = ∇_θ [ log f_θ(x_t | x_{t−1}, u_{t−1}) + log g_θ(y_t | x_t, u_t) ].
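To make the score estimation concrete, here is a deliberately naive sketch: a bootstrap particle filter that accumulates the ξ_θ terms along surviving particle paths (Fisher's identity), for an assumed toy linear-Gaussian model rather than the deck's example. The auxiliary particle filter with fixed-lag smoothing used in the deck exists precisely because this path-based variant degenerates on long data records.

```python
import numpy as np

def score_estimate(y, u, theta, n_part=500, seed=0):
    # Bootstrap particle filter with path-based score accumulation.
    # Toy model (an assumption, not the deck's example):
    #   x_t = theta * x_{t-1} + u_{t-1} + v_t,  v_t ~ N(0, 1)
    #   y_t = x_t + e_t,                        e_t ~ N(0, 1)
    # Here u[t] plays the role of u_{t-1} in the transition to step t.
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_part)              # x_0 ~ N(0, 1)
    alpha = np.zeros(n_part)                 # accumulated xi terms along each path
    for t in range(len(y)):
        x_prev = x
        x = theta * x_prev + u[t] + rng.normal(size=n_part)   # propagate
        alpha = alpha + (x - theta * x_prev - u[t]) * x_prev  # d/dtheta log f_theta
        logw = -0.5 * (y[t] - x) ** 2                         # log g_theta (up to const.)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(n_part, size=n_part, p=w)            # multinomial resampling
        x, alpha = x[idx], alpha[idx]
    return alpha.mean()                      # estimate of S(theta)
```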

45 Input design based on graph theory (revisited)

To design an experiment in C^{n_m}:
1. Compute all the prime cycles of G_{C^{n_m}}.
2. Generate the input signals {u_t^i}_{t=0}^{N} from the prime cycles of G_{C^{n_m}}, for each i ∈ {1, ..., n_V}.
3. For each i ∈ {1, ..., n_V}, approximate I_F^(i) by using

I_F^(i) := E_{v_i(u_{1:n_m})}{ S(θ_0) S^⊤(θ_0) } ≈ (1/M) Σ_{m=1}^{M} Ŝ_m(θ_0) Ŝ_m^⊤(θ_0).

46 Input design based on graph theory (revisited)

To design an experiment in C^{n_m}:
4. Define γ := {α_1, ..., α_{n_V}} ∈ R^{n_V}. For k ∈ {1, ..., K}, solve

γ^{opt,k} := arg max_{γ ∈ R^{n_V}} h(I_F^{app,k}(γ_k))
subject to Σ_{i=1}^{n_V} α_{i,k} = 1, α_{i,k} ≥ 0 for all i ∈ {1, ..., n_V},

where I_F^{app,k}(γ_k) := Σ_{i=1}^{n_V} α_{i,k} I_F^{(i),k}. Compute γ as the sample mean of {γ^{opt,k}}_{k=1}^{K}.

47 Input design based on graph theory (revisited)

To design an experiment in C^{n_m}:
5. The optimal pmf p^opt is given by p^opt = Σ_{i=1}^{n_V} α_i^opt v_i.
6. Sample u_{1:n_seq} from p^opt using Markov chains.

I_F^app(γ) is linear in the decision variables ⇒ the problem is convex!

48 Example II

Nonlinear state space model:

x_{t+1} = θ_1 x_t + θ_2 x_t / (1 + x_t²) + u_t + v_t,  v_t ∼ N(0, σ_v²),
y_t = (1/2) x_t + (1/2) x_t² + e_t,  e_t ∼ N(0, σ_e²),

where θ = [θ_1 θ_2]^⊤ = θ_0.

Input design: n_m = 2, C = {−1, 0, 1}, and h(·) = log det(·).

49 Example II

[Figure: designed input sequence u_t versus t.]

50 Example II

Results: [Table: h(Î_F) = log det(Î_F) for the optimal, binary, and uniform inputs; the uniform entry is 24.38.]

51 Problem formulation for output-error models · Input design based on graph theory · Extension to nonlinear SSM · Closed-loop application oriented input design · Conclusions and future work

52 Closed-loop application oriented input design

[Block diagram: external excitation r_t plus feedback K_{y,t}(y_t − y_d) form u_t; the System, driven by u_t and noise e_t through H, produces y_t.]

System (θ_0 ∈ Θ):
x_{t+1} = A_{θ_0} x_t + B_{θ_0} u_t
y_t = C_{θ_0} x_t + ν_t
ν_t = H(q; θ_0) e_t

{e_t}: white noise, known distribution. Feedback: u_t = r_t + K_{y,t}(y_t − y_d).

53 Closed-loop application oriented input design

Model (θ ∈ Θ):
x_{t+1} = A(θ) x_t + B(θ) u_t
y_t = C(θ) x_t + ν_t
ν_t = H(q; θ) e_t

Goal: perform an experiment to obtain θ̂_{n_seq} ⇒ design r_{1:n_seq}!

Requirements:
1. y_t, u_t should not be perturbed excessively.
2. θ̂_{n_seq} must satisfy quality constraints.

54 Closed-loop application oriented input design

Minimize the control objective:

J = E{ Σ_{t=1}^{n_seq} ||y_t − y_d||²_Q + ||u_t − u_{t−1}||²_R }.

Requirements:
1. y_t, u_t should not be perturbed excessively. Probabilistic bounds:

P{ |y_t − y_d| ≤ y_max } > 1 − ε_y
P{ |u_t| ≤ u_max } > 1 − ε_x

for t = 1, ..., n_seq.
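These probabilistic bounds are later evaluated by Monte Carlo over closed-loop rollouts; the per-time-step acceptance frequency is the natural estimator. A minimal sketch (illustrative name):

```python
import numpy as np

def violation_free_prob(signals, bound):
    # signals: (n_sim, T) Monte-Carlo rollouts of y_t - y_d (or of u_t).
    # Returns, per time step t, the fraction of rollouts with |signal| <= bound,
    # i.e. an estimate of P{ |y_t - y_d| <= y_max } (or P{ |u_t| <= u_max }).
    return (np.abs(signals) <= bound).mean(axis=0)
```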

55 Closed-loop application oriented input design: Θ_SI(α)

Requirements:
2. θ̂_{n_seq} must satisfy quality constraints.

Maximum likelihood: as n_seq → ∞, we have √n_seq (θ̂_{n_seq} − θ_0) ∼ AsN(0, {I_F^e}^{−1}).

Identification set:
Θ_SI(α) = { θ : (θ − θ_0)^⊤ I_F^e (θ − θ_0) ≤ χ²_α(n_θ) }.

56 Closed-loop application oriented input design: Θ_app(γ)

Requirements:
2. θ̂_{n_seq} must satisfy quality constraints.

Quality constraint: application set
Θ(γ) = { θ : V_app(θ) ≤ 1/γ }.

Relaxation: application ellipsoid
Θ_app(γ) := { θ : (θ − θ_0)^⊤ ∇²_θ V_app(θ)|_{θ=θ_0} (θ − θ_0) ≤ 2/γ }.

57 Closed-loop application oriented input design: Θ_SI(α) ⊆ Θ_app(γ)

Requirements:
2. θ̂_{n_seq} must satisfy quality constraints.

Quality constraint: Θ_SI(α) ⊆ Θ_app(γ), achieved by

(1/χ²_α(n_θ)) I_F^e ⪰ (γ/2) ∇²_θ V_app(θ)|_{θ=θ_0}.

58 Closed-loop application oriented input design

Optimization problem:

min_{r_t} J = E{ Σ_{t=1}^{n_seq} ||y_t − y_d||²_Q + ||u_t||²_R }
s.t. system constraints,
P{ |y_t − y_d| ≤ y_max } > 1 − ε_y,
P{ |u_t| ≤ u_max } > 1 − ε_x,
I_F ⪰ (γ χ²_α(n_θ)/2) ∇²_θ V_app(θ).

Difficulty: P (and I_F^e) hard to optimize. Solution: use the graph-theory approach!

59 Closed-loop application oriented input design

Graph theory approach: r_{1:n_seq} is a realization from p(r_{1:n_m}) with alphabet C.

To design an experiment in C^{n_m}:
1. Compute all the prime cycles of G_{C^{n_m}}.
2. Generate the signals {r_t^i}_{t=0}^{N} from the prime cycles of G_{C^{n_m}}, for each i ∈ {1, ..., n_V}.

60 Closed-loop application oriented input design

Graph theory approach: r_{1:n_seq} is a realization from p(r_{1:n_m}) with alphabet C.

Given e_{1:N_sim} and r_{1:N_sim}^{(j)}, approximate

J^(j) ≈ (1/N_sim) Σ_{t=1}^{N_sim} ||y_t^(j) − y_d||²_Q + ||u_t^(j) − u_{t−1}^(j)||²_R,
P_{e_t, r_t^(j)}{ |u_t^(j)| ≤ u_max } — by Monte Carlo,
P_{e_t, r_t^(j)}{ |y_t^(j) − y_d| ≤ y_max } — by Monte Carlo,
I_F^(j) — computed as in previous parts.

61 Closed-loop application oriented input design

Optimization problem (graph-theory):

min_{β_1, ..., β_{n_V}} Σ_{j=1}^{n_V} β_j J^(j)
s.t. system constraints, constraints on {β_j}_{j=1}^{n_V},
Σ_{j=1}^{n_V} β_j P_{e_t, r_t^(j)}{ |u_t^(j)| ≤ u_max } > 1 − ε_x,
Σ_{j=1}^{n_V} β_j P_{e_t, r_t^(j)}{ |y_t^(j) − y_d| ≤ y_max } > 1 − ε_y,
Σ_{j=1}^{n_V} β_j I_F^(j) ⪰ (γ χ²_α(n_θ) / (2 n_seq)) ∇²_θ V_app(θ).
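With J^(j), the acceptance probabilities, and I_F^(j) tabulated per prime cycle, the problem is linear in β apart from the matrix inequality. A sketch that keeps only the scalar chance constraints, so it stays a linear program (with the information constraint included it becomes a semidefinite program); names are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def optimal_beta(costs, probs_u, probs_y, eps_u, eps_y):
    # Solve  min_beta  sum_j beta_j * J^(j)
    #   s.t. sum_j beta_j * P_u^(j) >= 1 - eps_u,
    #        sum_j beta_j * P_y^(j) >= 1 - eps_y,
    #        beta on the simplex.
    n = len(costs)
    A_ub = -np.vstack([probs_u, probs_y])     # flip signs: linprog uses <=
    b_ub = -np.array([1 - eps_u, 1 - eps_y])
    res = linprog(costs, A_ub=A_ub, b_ub=b_ub,
                  A_eq=np.ones((1, n)), b_eq=[1.0], bounds=[(0, 1)] * n)
    return res.x
```

The solver pushes weight toward cheap cycles until the chance constraints bind; the cheapest feasible mixture is returned.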

62 Closed-loop application oriented input design

Optimal pmf: p^opt := Σ_{j=1}^{n_V} β_j^opt p_j.

63 Example III

Consider the open-loop, SISO state space system

x_{t+1} = θ_2^0 x_t + u_t
y_t = θ_1^0 x_t + e_t.

Input: u_t = r_t − k_y y_t, with k_y = 0.5 known. Goal: estimate [θ_1 θ_2]^⊤ using indirect identification.

64 Example III

Design {r_t}_{t=1}^{500}, n_m = 2, r_t ∈ C = {−0.5, −0.25, 0, 0.25, 0.5}.

Performance degradation: V_app(θ) = Σ_t ||y_t(θ_0) − y_t(θ)||².

y_d = 0, ε_y = ε_x = 0.07, y_max = 2, u_max = 1.

65 Example III

Reference: [Figure: three realizations of r_t versus t.]

66 Example III

Input: [Figure: three realizations of u_t versus t.]

67 Example III

Output: [Figure: three realizations of y_t versus t.]

68 Example III

Application and identification ellipsoids (98% confidence level): [Figure: Θ_app(γ), Θ_SI^binary(α) and Θ_SI^opt(α) in the (θ_1, θ_2) plane.]

69 Problem formulation for output-error models · Input design based on graph theory · Extension to nonlinear SSM · Closed-loop application oriented input design · Conclusions and future work

70 Conclusions

A new method for input design was introduced. The method can be used for nonlinear systems, and the resulting problem is convex even for nonlinear systems.

71 Future work

Reducible Markov chains. Computational complexity. Robust input design. Application oriented input design for MPC.

72 Thanks for your attention.

75 Appendix I: Graph theory in input design

Example: elementary cycles for a de Bruijn graph, C := {0, 1}, n_m := 1. [Graph: two nodes, u_t = 0 and u_t = 1.] One elementary cycle: z = (0, 1, 0).

Prime cycles for a de Bruijn graph, C := {0, 1}, n_m := 2: v_1 = ((0, 1), (1, 0), (0, 1)).

76 Appendix II: Graph theory in input design

Example: generation of an input signal from a prime cycle. Consider a de Bruijn graph, C := {0, 1}, n_m := 2, and

v_1 = ((0, 1), (1, 0), (0, 1)).

Finally, {u_t^1}_{t=0}^{N}: take the last element of each node, giving

{u_t^1}_{t=0}^{N} = {1, 0, 1, 0, ..., ((−1)^N + 1)/2}.
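The last-element rule above can be sketched in a few lines (illustrative name):

```python
def input_from_cycle(cycle, n_samples):
    # cycle: prime cycle as a sequence of n_m-tuples, e.g. ((0, 1), (1, 0), (0, 1)),
    # where the last node repeats the first. Taking the last element of each
    # visited node and cycling yields the input signal.
    period = [node[-1] for node in cycle[:-1]]
    return [period[t % len(period)] for t in range(n_samples)]
```

For v_1 above, input_from_cycle(((0, 1), (1, 0), (0, 1)), 5) gives [1, 0, 1, 0, 1], the alternating sequence shown.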

77 Appendix III: Building A

For i ∈ C^{n_m}, define A_i := { j ∈ C^{n_m} : (j, i) ∈ E } (the set of ancestors of i). For each i ∈ C^{n_m}, let

A_ij = P{j} / Σ_{k ∈ A_i} P{k},  if j ∈ A_i and Σ_{k ∈ A_i} P{k} ≠ 0,
A_ij = 1 / #A_i,  if j ∈ A_i and Σ_{k ∈ A_i} P{k} = 0,
A_ij = 0,  otherwise.

78 Appendix IV: Example, nonlinear case

[Block diagram: e_t and u_t enter G_0, which produces y_t.]

G_0(u_t) = G_1(q, θ) u_t + G_2(q, θ) u_t², where

G_1(q, θ) = θ_1 + θ_2 q^{−1}
G_2(q, θ) = θ_3 + θ_4 q^{−1}.

e_t: Gaussian white noise, zero mean, variance λ_e = 1.

79 Appendix IV: Example, nonlinear case

[Block diagram: e_t and u_t enter G_0, which produces y_t.]

G_0(u_t) = G_1(q, θ) u_t + G_2(q, θ) u_t², where

G_1(q, θ) = θ_1 + θ_2 q^{−1}
G_2(q, θ) = θ_3 + θ_4 q^{−1}.

We consider h(·) = det(·), n_m = 2, and C := {−1, 0, 1}.

80 Appendix IV: Example, nonlinear case

Stationary probabilities: [Figure: stationary probabilities over (u_t, u_{t−1}), with the achieved det(I_F^app).] Results consistent with previous contributions (Larsson et al. 2010).

81 Appendix V: Particle methods to estimate S_m(θ_0)

Auxiliary particle filter: {x_{1:t}^(i), w_t^(i)}_{i=1}^{N} is the particle system, and

p̂_θ(x_{1:t} | y_{1:t}) := Σ_{i=1}^{N} ( w_t^(i) / Σ_{k=1}^{N} w_t^(k) ) δ(x_{1:t} − x_{1:t}^(i)).

Two-step procedure to compute {x_{1:t}^(i), w_t^(i)}_{i=1}^{N}:
1. Sampling/propagation.
2. Weighting.

82 Appendix V: Particle methods to estimate S_m(θ_0)

Two-step procedure to compute {x_{1:t}^(i), w_t^(i)}_{i=1}^{N}:
1. Sampling/propagation:

{a_t^(i), x_t^(i)} ∼ ( w_{t−1}^{a_t} / Σ_{k=1}^{N} w_{t−1}^(k) ) R_{θ,t}(x_t | x_{t−1}^{a_t}, u_{t−1}).

83 Appendix V: Particle methods to estimate S_m(θ_0)

Two-step procedure to compute {x_{1:t}^(i), w_t^(i)}_{i=1}^{N}:
2. Weighting:

w_t^(i) := g_θ(y_t | x_t^(i), u_t) f_θ(x_t^(i) | x_{t−1}^(i), u_{t−1}) / R_{θ,t}(x_t^(i) | x_{t−1}^(i), u_{t−1}).

84 Appendix V: Particle methods to estimate S_m(θ_0)

Difficulty: the auxiliary particle filter suffers from particle degeneracy. Solution: use a fixed-lag smoother. Main idea of the FL-smoother:

p_θ(x_t | y_{1:T}, u_{1:T}) ≈ p_θ(x_t | y_{1:κ_t}, u_{1:κ_t}),

with κ_t = min(t + Δ, T) for some fixed lag Δ > 0.

85 Appendix VI: Equivalence of time and frequency domain

Frequency domain — Time domain:
- Design variable: Φ_u(ω) — p(u_{1:n_m})
- Restrictions: Φ_u(ω) ≥ 0, (1/2π) ∫_{−π}^{π} Φ_u(ω) dω ≤ 1 — p ≥ 0, Σ_{u_{1:n_m}} p(u_{1:n_m}) = 1, p stationary
- Information matrix: I_F(Φ_u) — I_F(p)


More information

Factor Graphs and Message Passing Algorithms Part 1: Introduction

Factor Graphs and Message Passing Algorithms Part 1: Introduction Factor Graphs and Message Passing Algorithms Part 1: Introduction Hans-Andrea Loeliger December 2007 1 The Two Basic Problems 1. Marginalization: Compute f k (x k ) f(x 1,..., x n ) x 1,..., x n except

More information

Parameter Estimation in a Moving Horizon Perspective

Parameter Estimation in a Moving Horizon Perspective Parameter Estimation in a Moving Horizon Perspective State and Parameter Estimation in Dynamical Systems Reglerteknik, ISY, Linköpings Universitet State and Parameter Estimation in Dynamical Systems OUTLINE

More information

Lecture 7: Optimal Smoothing

Lecture 7: Optimal Smoothing Department of Biomedical Engineering and Computational Science Aalto University March 17, 2011 Contents 1 What is Optimal Smoothing? 2 Bayesian Optimal Smoothing Equations 3 Rauch-Tung-Striebel Smoother

More information

OPTIMAL EXPERIMENT DESIGN IN CLOSED LOOP. KTH, Signals, Sensors and Systems, S Stockholm, Sweden.

OPTIMAL EXPERIMENT DESIGN IN CLOSED LOOP. KTH, Signals, Sensors and Systems, S Stockholm, Sweden. OPTIMAL EXPERIMENT DESIGN IN CLOSED LOOP Henrik Jansson Håkan Hjalmarsson KTH, Signals, Sensors and Systems, S-00 44 Stockholm, Sweden. henrik.jansson@s3.kth.se Abstract: In this contribution we extend

More information

1 / 32 Summer school on Foundations and advances in stochastic filtering (FASF) Barcelona, Spain, June 22, 2015.

1 / 32 Summer school on Foundations and advances in stochastic filtering (FASF) Barcelona, Spain, June 22, 2015. Outline Part 4 Nonlinear system identification using sequential Monte Carlo methods Part 4 Identification strategy 2 (Data augmentation) Aim: Show how SMC can be used to implement identification strategy

More information

7. Statistical estimation

7. Statistical estimation 7. Statistical estimation Convex Optimization Boyd & Vandenberghe maximum likelihood estimation optimal detector design experiment design 7 1 Parametric distribution estimation distribution estimation

More information

Probabilistic Graphical Models

Probabilistic Graphical Models 2016 Robert Nowak Probabilistic Graphical Models 1 Introduction We have focused mainly on linear models for signals, in particular the subspace model x = Uθ, where U is a n k matrix and θ R k is a vector

More information

Bayesian Inference by Density Ratio Estimation

Bayesian Inference by Density Ratio Estimation Bayesian Inference by Density Ratio Estimation Michael Gutmann https://sites.google.com/site/michaelgutmann Institute for Adaptive and Neural Computation School of Informatics, University of Edinburgh

More information

ML Estimation of Process Noise Variance in Dynamic Systems

ML Estimation of Process Noise Variance in Dynamic Systems ML Estimation of Process Noise Variance in Dynamic Systems Patrik Axelsson, Umut Orguner, Fredrik Gustafsson and Mikael Norrlöf {axelsson,umut,fredrik,mino} @isy.liu.se Division of Automatic Control Department

More information

EECE Adaptive Control

EECE Adaptive Control EECE 574 - Adaptive Control Basics of System Identification Guy Dumont Department of Electrical and Computer Engineering University of British Columbia January 2010 Guy Dumont (UBC) EECE574 - Basics of

More information

ECE276A: Sensing & Estimation in Robotics Lecture 10: Gaussian Mixture and Particle Filtering

ECE276A: Sensing & Estimation in Robotics Lecture 10: Gaussian Mixture and Particle Filtering ECE276A: Sensing & Estimation in Robotics Lecture 10: Gaussian Mixture and Particle Filtering Lecturer: Nikolay Atanasov: natanasov@ucsd.edu Teaching Assistants: Siwei Guo: s9guo@eng.ucsd.edu Anwesan Pal:

More information

A Comparison of the EKF, SPKF, and the Bayes Filter for Landmark-Based Localization

A Comparison of the EKF, SPKF, and the Bayes Filter for Landmark-Based Localization A Comparison of the EKF, SPKF, and the Bayes Filter for Landmark-Based Localization and Timothy D. Barfoot CRV 2 Outline Background Objective Experimental Setup Results Discussion Conclusion 2 Outline

More information

Estimation theory. Parametric estimation. Properties of estimators. Minimum variance estimator. Cramer-Rao bound. Maximum likelihood estimators

Estimation theory. Parametric estimation. Properties of estimators. Minimum variance estimator. Cramer-Rao bound. Maximum likelihood estimators Estimation theory Parametric estimation Properties of estimators Minimum variance estimator Cramer-Rao bound Maximum likelihood estimators Confidence intervals Bayesian estimation 1 Random Variables Let

More information

I. D. Landau, A. Karimi: A Course on Adaptive Control Adaptive Control. Part 9: Adaptive Control with Multiple Models and Switching

I. D. Landau, A. Karimi: A Course on Adaptive Control Adaptive Control. Part 9: Adaptive Control with Multiple Models and Switching I. D. Landau, A. Karimi: A Course on Adaptive Control - 5 1 Adaptive Control Part 9: Adaptive Control with Multiple Models and Switching I. D. Landau, A. Karimi: A Course on Adaptive Control - 5 2 Outline

More information

Particle Metropolis-Hastings using gradient and Hessian information

Particle Metropolis-Hastings using gradient and Hessian information Particle Metropolis-Hastings using gradient and Hessian information Johan Dahlin, Fredrik Lindsten and Thomas B. Schön September 19, 2014 Abstract Particle Metropolis-Hastings (PMH) allows for Bayesian

More information

Cheng Soon Ong & Christian Walder. Canberra February June 2018

Cheng Soon Ong & Christian Walder. Canberra February June 2018 Cheng Soon Ong & Christian Walder Research Group and College of Engineering and Computer Science Canberra February June 2018 Outlines Overview Introduction Linear Algebra Probability Linear Regression

More information

Stochastic Spectral Approaches to Bayesian Inference

Stochastic Spectral Approaches to Bayesian Inference Stochastic Spectral Approaches to Bayesian Inference Prof. Nathan L. Gibson Department of Mathematics Applied Mathematics and Computation Seminar March 4, 2011 Prof. Gibson (OSU) Spectral Approaches to

More information

Particle Filters. Outline

Particle Filters. Outline Particle Filters M. Sami Fadali Professor of EE University of Nevada Outline Monte Carlo integration. Particle filter. Importance sampling. Degeneracy Resampling Example. 1 2 Monte Carlo Integration Numerical

More information

Bayesian networks: approximate inference

Bayesian networks: approximate inference Bayesian networks: approximate inference Machine Intelligence Thomas D. Nielsen September 2008 Approximative inference September 2008 1 / 25 Motivation Because of the (worst-case) intractability of exact

More information

Comparison of Prediction-Error-Modelling Criteria

Comparison of Prediction-Error-Modelling Criteria Downloaded from orbitdtudk on: Aug 25, 208 Comparison of Prediction-Error-Modelling Criteria Jørgensen, John Bagterp; Jørgensen, Sten Bay Published in: American Control Conference, 2007 ACC '07 Link to

More information

1.1 Basis of Statistical Decision Theory

1.1 Basis of Statistical Decision Theory ECE598: Information-theoretic methods in high-dimensional statistics Spring 2016 Lecture 1: Introduction Lecturer: Yihong Wu Scribe: AmirEmad Ghassami, Jan 21, 2016 [Ed. Jan 31] Outline: Introduction of

More information

OPTIMAL DESIGN INPUTS FOR EXPERIMENTAL CHAPTER 17. Organization of chapter in ISSO. Background. Linear models

OPTIMAL DESIGN INPUTS FOR EXPERIMENTAL CHAPTER 17. Organization of chapter in ISSO. Background. Linear models CHAPTER 17 Slides for Introduction to Stochastic Search and Optimization (ISSO)by J. C. Spall OPTIMAL DESIGN FOR EXPERIMENTAL INPUTS Organization of chapter in ISSO Background Motivation Finite sample

More information

ECE 275B Homework # 1 Solutions Version Winter 2015

ECE 275B Homework # 1 Solutions Version Winter 2015 ECE 275B Homework # 1 Solutions Version Winter 2015 1. (a) Because x i are assumed to be independent realizations of a continuous random variable, it is almost surely (a.s.) 1 the case that x 1 < x 2

More information

Identification, Model Validation and Control. Lennart Ljung, Linköping

Identification, Model Validation and Control. Lennart Ljung, Linköping Identification, Model Validation and Control Lennart Ljung, Linköping Acknowledgment: Useful discussions with U Forssell and H Hjalmarsson 1 Outline 1. Introduction 2. System Identification (in closed

More information

Organization. I MCMC discussion. I project talks. I Lecture.

Organization. I MCMC discussion. I project talks. I Lecture. Organization I MCMC discussion I project talks. I Lecture. Content I Uncertainty Propagation Overview I Forward-Backward with an Ensemble I Model Reduction (Intro) Uncertainty Propagation in Causal Systems

More information

Plant friendly input design for system identification in closed loop

Plant friendly input design for system identification in closed loop Plant friendly input design for system identification in closed loop Sridharakumar Narasimhan, Xavier Bombois Dept. of Chemical Engineering, IIT Madras, Chennai, India. (e-mail: sridharkrn@iitm.ac.in.

More information

Switching Regime Estimation

Switching Regime Estimation Switching Regime Estimation Series de Tiempo BIrkbeck March 2013 Martin Sola (FE) Markov Switching models 01/13 1 / 52 The economy (the time series) often behaves very different in periods such as booms

More information

Lecture 5 Channel Coding over Continuous Channels

Lecture 5 Channel Coding over Continuous Channels Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From

More information

Deep Poisson Factorization Machines: a factor analysis model for mapping behaviors in journalist ecosystem

Deep Poisson Factorization Machines: a factor analysis model for mapping behaviors in journalist ecosystem 000 001 002 003 004 005 006 007 008 009 010 011 012 013 014 015 016 017 018 019 020 021 022 023 024 025 026 027 028 029 030 031 032 033 034 035 036 037 038 039 040 041 042 043 044 045 046 047 048 049 050

More information

Efficient Monte Carlo computation of Fisher information matrix using prior information

Efficient Monte Carlo computation of Fisher information matrix using prior information Efficient Monte Carlo computation of Fisher information matrix using prior information Sonjoy Das, UB James C. Spall, APL/JHU Roger Ghanem, USC SIAM Conference on Data Mining Anaheim, California, USA April

More information

Part III. Hypothesis Testing. III.1. Log-rank Test for Right-censored Failure Time Data

Part III. Hypothesis Testing. III.1. Log-rank Test for Right-censored Failure Time Data 1 Part III. Hypothesis Testing III.1. Log-rank Test for Right-censored Failure Time Data Consider a survival study consisting of n independent subjects from p different populations with survival functions

More information

Sequential Monte Carlo methods for system identification

Sequential Monte Carlo methods for system identification Technical report arxiv:1503.06058v3 [stat.co] 10 Mar 2016 Sequential Monte Carlo methods for system identification Thomas B. Schön, Fredrik Lindsten, Johan Dahlin, Johan Wågberg, Christian A. Naesseth,

More information

Gatsby Theoretical Neuroscience Lectures: Non-Gaussian statistics and natural images Parts III-IV

Gatsby Theoretical Neuroscience Lectures: Non-Gaussian statistics and natural images Parts III-IV Gatsby Theoretical Neuroscience Lectures: Non-Gaussian statistics and natural images Parts III-IV Aapo Hyvärinen Gatsby Unit University College London Part III: Estimation of unnormalized models Often,

More information

Online appendix to On the stability of the excess sensitivity of aggregate consumption growth in the US

Online appendix to On the stability of the excess sensitivity of aggregate consumption growth in the US Online appendix to On the stability of the excess sensitivity of aggregate consumption growth in the US Gerdie Everaert 1, Lorenzo Pozzi 2, and Ruben Schoonackers 3 1 Ghent University & SHERPPA 2 Erasmus

More information

BASICS OF DETECTION AND ESTIMATION THEORY

BASICS OF DETECTION AND ESTIMATION THEORY BASICS OF DETECTION AND ESTIMATION THEORY 83050E/158 In this chapter we discuss how the transmitted symbols are detected optimally from a noisy received signal (observation). Based on these results, optimal

More information

An efficient stochastic approximation EM algorithm using conditional particle filters

An efficient stochastic approximation EM algorithm using conditional particle filters An efficient stochastic approximation EM algorithm using conditional particle filters Fredrik Lindsten Linköping University Post Print N.B.: When citing this work, cite the original article. Original Publication:

More information

Semidefinite and Second Order Cone Programming Seminar Fall 2012 Project: Robust Optimization and its Application of Robust Portfolio Optimization

Semidefinite and Second Order Cone Programming Seminar Fall 2012 Project: Robust Optimization and its Application of Robust Portfolio Optimization Semidefinite and Second Order Cone Programming Seminar Fall 2012 Project: Robust Optimization and its Application of Robust Portfolio Optimization Instructor: Farid Alizadeh Author: Ai Kagawa 12/12/2012

More information

Chapter 2 Distributed Parameter Systems: Controllability, Observability, and Identification

Chapter 2 Distributed Parameter Systems: Controllability, Observability, and Identification Chapter 2 Distributed Parameter Systems: Controllability, Observability, and Identification 2.1 Mathematical Description We introduce the class of systems to be considered in the framework of this monograph

More information

Inferring biological dynamics Iterated filtering (IF)

Inferring biological dynamics Iterated filtering (IF) Inferring biological dynamics 101 3. Iterated filtering (IF) IF originated in 2006 [6]. For plug-and-play likelihood-based inference on POMP models, there are not many alternatives. Directly estimating

More information

Normalising constants and maximum likelihood inference

Normalising constants and maximum likelihood inference Normalising constants and maximum likelihood inference Jakob G. Rasmussen Department of Mathematics Aalborg University Denmark March 9, 2011 1/14 Today Normalising constants Approximation of normalising

More information

Density Estimation. Seungjin Choi

Density Estimation. Seungjin Choi Density Estimation Seungjin Choi Department of Computer Science and Engineering Pohang University of Science and Technology 77 Cheongam-ro, Nam-gu, Pohang 37673, Korea seungjin@postech.ac.kr http://mlg.postech.ac.kr/

More information

Learning of dynamical systems

Learning of dynamical systems Learning of dynamical systems Particle filters and Markov chain methods Thomas B. Schön and Fredrik Lindsten c Draft date August 23, 2017 2 Contents 1 Introduction 3 1.1 A few words for readers of the

More information

IDENTIFICATION FOR CONTROL

IDENTIFICATION FOR CONTROL IDENTIFICATION FOR CONTROL Raymond A. de Callafon, University of California San Diego, USA Paul M.J. Van den Hof, Delft University of Technology, the Netherlands Keywords: Controller, Closed loop model,

More information

ECE 275B Homework # 1 Solutions Winter 2018

ECE 275B Homework # 1 Solutions Winter 2018 ECE 275B Homework # 1 Solutions Winter 2018 1. (a) Because x i are assumed to be independent realizations of a continuous random variable, it is almost surely (a.s.) 1 the case that x 1 < x 2 < < x n Thus,

More information

DSGE Methods. Estimation of DSGE models: GMM and Indirect Inference. Willi Mutschler, M.Sc.

DSGE Methods. Estimation of DSGE models: GMM and Indirect Inference. Willi Mutschler, M.Sc. DSGE Methods Estimation of DSGE models: GMM and Indirect Inference Willi Mutschler, M.Sc. Institute of Econometrics and Economic Statistics University of Münster willi.mutschler@wiwi.uni-muenster.de Summer

More information

Pattern Recognition and Machine Learning

Pattern Recognition and Machine Learning Christopher M. Bishop Pattern Recognition and Machine Learning ÖSpri inger Contents Preface Mathematical notation Contents vii xi xiii 1 Introduction 1 1.1 Example: Polynomial Curve Fitting 4 1.2 Probability

More information

Stochastic Models, Estimation and Control Peter S. Maybeck Volumes 1, 2 & 3 Tables of Contents

Stochastic Models, Estimation and Control Peter S. Maybeck Volumes 1, 2 & 3 Tables of Contents Navtech Part #s Volume 1 #1277 Volume 2 #1278 Volume 3 #1279 3 Volume Set #1280 Stochastic Models, Estimation and Control Peter S. Maybeck Volumes 1, 2 & 3 Tables of Contents Volume 1 Preface Contents

More information

EL1820 Modeling of Dynamical Systems

EL1820 Modeling of Dynamical Systems EL1820 Modeling of Dynamical Systems Lecture 10 - System identification as a model building tool Experiment design Examination and prefiltering of data Model structure selection Model validation Lecture

More information

Towards a Theory of Information Flow in the Finitary Process Soup

Towards a Theory of Information Flow in the Finitary Process Soup Towards a Theory of in the Finitary Process Department of Computer Science and Complexity Sciences Center University of California at Davis June 1, 2010 Goals Analyze model of evolutionary self-organization

More information

EECE Adaptive Control

EECE Adaptive Control EECE 574 - Adaptive Control Recursive Identification in Closed-Loop and Adaptive Control Guy Dumont Department of Electrical and Computer Engineering University of British Columbia January 2010 Guy Dumont

More information

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno Stochastic Processes M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno 1 Outline Stochastic (random) processes. Autocorrelation. Crosscorrelation. Spectral density function.

More information

System Identification, Lecture 4

System Identification, Lecture 4 System Identification, Lecture 4 Kristiaan Pelckmans (IT/UU, 2338) Course code: 1RT880, Report code: 61800 - Spring 2016 F, FRI Uppsala University, Information Technology 13 April 2016 SI-2016 K. Pelckmans

More information

A Probability Review

A Probability Review A Probability Review Outline: A probability review Shorthand notation: RV stands for random variable EE 527, Detection and Estimation Theory, # 0b 1 A Probability Review Reading: Go over handouts 2 5 in

More information

Factor Analysis and Kalman Filtering (11/2/04)

Factor Analysis and Kalman Filtering (11/2/04) CS281A/Stat241A: Statistical Learning Theory Factor Analysis and Kalman Filtering (11/2/04) Lecturer: Michael I. Jordan Scribes: Byung-Gon Chun and Sunghoon Kim 1 Factor Analysis Factor analysis is used

More information

Further Results on Model Structure Validation for Closed Loop System Identification

Further Results on Model Structure Validation for Closed Loop System Identification Advances in Wireless Communications and etworks 7; 3(5: 57-66 http://www.sciencepublishinggroup.com/j/awcn doi:.648/j.awcn.735. Further esults on Model Structure Validation for Closed Loop System Identification

More information

Cautious Data Driven Fault Detection and Isolation applied to the Wind Turbine Benchmark

Cautious Data Driven Fault Detection and Isolation applied to the Wind Turbine Benchmark Driven Fault Detection and Isolation applied to the Wind Turbine Benchmark Prof. Michel Verhaegen Delft Center for Systems and Control Delft University of Technology the Netherlands November 28, 2011 Prof.

More information

Calibration of a magnetometer in combination with inertial sensors

Calibration of a magnetometer in combination with inertial sensors Calibration of a magnetometer in combination with inertial sensors Manon Kok, Linköping University, Sweden Joint work with: Thomas Schön, Uppsala University, Sweden Jeroen Hol, Xsens Technologies, the

More information

Lessons in Estimation Theory for Signal Processing, Communications, and Control

Lessons in Estimation Theory for Signal Processing, Communications, and Control Lessons in Estimation Theory for Signal Processing, Communications, and Control Jerry M. Mendel Department of Electrical Engineering University of Southern California Los Angeles, California PRENTICE HALL

More information