Data mining for system identification
Data mining for system identification - applications to process
André Carvalho Bittencourt, Automatic Control, Linköping, Sweden
Outline
1. Problem formulation
2. Theoretical guiding principles: modeling, data, estimation
3. Tests and outline of the algorithm
4. Mining data from an entire plant
5. Concluding remarks
Mining data
Historical database: continuous variables r, u, y and a discrete mode variable m. 195 control loops, 5 types of process variables, 4 samples per minute: roughly 3.1K data points per minute, 4.5M per day. Can we extract useful intervals of data for system identification?
Requisites and assumptions
Assumptions: SISO loops; linear dynamics; real-valued poles*.
Requisites: minimal prior knowledge; flexible; fast.
Approach: take guidance from the theory; use flexible models and recursive solutions; provide a measure of quality.
Modeling - definitions
The closed loop has controller K(q), reference r(k), mode m(k), input u(k), output y(k) and white noise e(k). The true system S is
y(k) = G0(q) u(k) + H0(q) e(k).
Model set: M = {M(θ) : θ in Dθ}. Model structure:
y(k) = G(q, θ) u(k) + H(q, θ) e(k).
S in M iff G0(z) = G(z, θ0) and H0(z) = H(z, θ0) for some θ0 in Dθ; then S = M(θ0).
Optimal one-step-ahead predictor:
ŷ(k|θ) = H(q, θ)^-1 G(q, θ) u(k) + [1 - H(q, θ)^-1] y(k).
True set: DT(S, M) = {θ in Dθ : S = M(θ)}.
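As a concrete illustration of the one-step-ahead predictor: for an ARX structure, G = B/A and H = 1/A, so the predictor reduces to ŷ(k|θ) = (1 - A(q)) y(k) + B(q) u(k). A minimal numerical sketch (the first-order coefficients are hypothetical); at the true parameters the prediction error is exactly the white noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# True ARX system: A(q) y = B(q) u + e, with
#   A(q) = 1 - 0.7 q^-1,  B(q) = 0.5 q^-1   (hypothetical coefficients)
a1, b1 = -0.7, 0.5
N = 2000
u = rng.standard_normal(N)
e = 0.1 * rng.standard_normal(N)
y = np.zeros(N)
for k in range(1, N):
    y[k] = -a1 * y[k - 1] + b1 * u[k - 1] + e[k]

# Optimal one-step-ahead predictor for ARX:
#   yhat(k|theta) = (1 - A(q)) y(k) + B(q) u(k) = -a1*y(k-1) + b1*u(k-1)
yhat = np.zeros(N)
yhat[1:] = -a1 * y[:-1] + b1 * u[:-1]

# at the true parameters, the prediction error equals the noise e(k)
eps = y - yhat
print(np.allclose(eps[1:], e[1:]))  # → True
```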
Properties of models
One-to-one relation: collecting the predictor filters,
ŷ(k|θ) = [H(q, θ)^-1 G(q, θ), 1 - H(q, θ)^-1] [u(k); y(k)] = W(q, θ) z(k),
and W(q, θ) is in one-to-one relation with T(q, θ) = [G(q, θ), H(q, θ)].
Identifiability at θ0: M is (globally) identifiable at θ0 if no other θ gives the same frequency response, i.e. W(z, θ) = W(z, θ0) for all z implies θ = θ0.
Remarks: if S in M and M is globally identifiable, then DT(S, M) = {θ0}. Choose M so that S in M; but what about the data?
Flexible model structures - S in M
ARX (delay expansion):
Wu(z, θ) = Σ_{i=1}^{nb} bi z^-i = B_n(z), Wy(z, θ) = Σ_{i=1}^{na} ai z^-i = A_n(z).
Exact as n → ∞; convergence depends on Ts.
L-ARX (Laguerre expansion):
Wu(z, θ) = Σ_{i=1}^{nb} bi Li(z, α) = B_n(z, α), Wy(z, θ) = Σ_{i=1}^{na} ai Li(z, α) = A_n(z, α).
Exact as n → ∞; convergence depends on α; more efficient with unknown delays and real poles.
Laguerre filters: Li(q, α) = sqrt((1 - α^2) Ts) / (q - α) * [(1 - αq) / (q - α)]^(i-1).
Remarks: both structures can describe any linear dynamics and yield linear regressions, with Wu(z, θb) = B_nb(z, α) and Wy(z, θa) = A_na(z); Laguerre is better for delays; both are globally identifiable; the models have finite gain, so an integrating plant is a caveat; the choice of n and α takes a guess of the largest delay and time constant.
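The Laguerre regressors can be generated by filtering: a first-order low-pass section followed by repeated all-pass sections. A minimal sketch, assuming Ts = 1 and a hypothetical pole α = 0.8; the orthonormality of the discrete Laguerre basis gives a built-in check:

```python
import numpy as np
from scipy.signal import lfilter

def laguerre_regressors(x, n, alpha, Ts=1.0):
    """Filter x through the first n discrete Laguerre filters
    L_i(q, a) = sqrt((1-a^2) Ts)/(q-a) * ((1-a q)/(q-a))^(i-1).
    Returns a (len(x), n) regressor matrix."""
    g = np.sqrt((1 - alpha**2) * Ts)
    # first-order low-pass g/(q-a)  ->  g q^-1 / (1 - a q^-1)
    z = lfilter([0.0, g], [1.0, -alpha], x)
    cols = [z]
    for _ in range(n - 1):
        # all-pass (1 - a q)/(q - a)  ->  (-a + q^-1)/(1 - a q^-1)
        z = lfilter([-alpha, 1.0], [1.0, -alpha], z)
        cols.append(z)
    return np.column_stack(cols)

# orthonormality check on the impulse responses (Ts = 1):
# sum_k L_i(k) L_j(k) should be close to delta_ij
imp = np.zeros(500)
imp[0] = 1.0
Phi = laguerre_regressors(imp, 3, alpha=0.8)
print(np.round(Phi.T @ Phi, 3))  # ≈ 3x3 identity
```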
Properties of the data
Informative enough data: the data {z(k)} are informative enough if they distinguish between non-equivalent models, i.e. if
E[((W(q, θ1) - W(q, θ2)) z(k))^2] = 0 implies W(e^{iω}, θ1) = W(e^{iω}, θ2) for almost all ω.
Persistent excitation (PE): φ(k) = [u(k-1), ..., u(k-n)] is PE iff the information matrix has full rank, E[φ(k)φ(k)^T] > 0; a sufficient condition is that u(k) is sufficiently rich of order n (SRn).
ARX data are informative enough iff
- open loop: u(k) is SR_nb;
- closed loop, K(q) = Y(q)/X(q), with r(k) = 0 (disturbance rejection): the controller is complex enough, (nx - na, ny - nb) >= 0;
- closed loop with r(k) ≠ 0 (servo): r(k) is SR_nr with nr >= min(na - nx, nb - ny).
Step-signal example: let u(k) = σ(k), a step, so φ(k) = [σ(k-1), ..., σ(k-n)]. Then E[φ(k)φ(k)^T] is the all-ones matrix, which has rank 1: a step is only SR1.
Remarks: how to verify these conditions in practice? With a finite sample? Under disturbance rejection? Look at the estimate!
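The step-signal example can be checked numerically: the sample information matrix of a step has rank 1, while a white-noise input gives full rank. A minimal sketch (the order n = 4 and the signals are hypothetical choices for illustration):

```python
import numpy as np

def info_matrix_rank(u, n, tol=1e-8):
    """Rank of the sample information matrix (1/N) sum phi(k) phi(k)^T
    for phi(k) = [u(k-1), ..., u(k-n)]."""
    N = len(u)
    Phi = np.column_stack([u[n - i:N - i] for i in range(1, n + 1)])
    R = Phi.T @ Phi / len(Phi)
    return np.linalg.matrix_rank(R, tol=tol)

n = 4
step = np.ones(1000)                                     # a step: SR1 only
noise = np.random.default_rng(1).standard_normal(1000)   # white noise: SRn for any n

print(info_matrix_rank(step, n))   # → 1
print(info_matrix_rank(noise, n))  # → 4
```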
Recursive least squares
The RLS estimate:
θ̂_k = arg min_{θ in Dθ} V_k(θ) = arg min_{θ in Dθ} Σ_{i=1}^{k} λ^{k-i} ε^2(i, θ).
Consistency: θ̂_k converges to DT(S, M) (= {θ0} if S in M and M(θ) is globally identifiable) as λ → 1 and N → ∞, provided the data are informative enough:
- open loop: excitation in u;
- servo: excitation in r, through feedback;
- disturbance rejection: excitation in e, through feedback.
Frequency representation (λ = 1, N → ∞):
V̄(θ) = (1/4π) ∫_{-π}^{π} Φε(ω, θ) dω,
Φε(ω, θ) = |G0 + Bθ - Gθ|^2 Φu / |Hθ|^2 + terms independent of Gθ,
Bθ = (H0 - Hθ) γ0 Φeu / Φu.
- Open loop: Φue = 0, so Bθ = 0: unbiased estimate.
- Servo: Φu = Φru + Φeu (the contributions from r and from e), so Bθ = (H0 - Hθ) γ0 Φeu / (Φru + Φeu): excitation in r through feedback!
- Disturbance rejection (r = 0): Φru = 0, so Φu = Φeu and Bθ = (H0 - Hθ) γ0 Φeu / Φu: the noise must affect the input!
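The role of the input-noise cross-spectrum can be illustrated directly: in open loop the input is uncorrelated with the noise, while under feedback with r = 0 the noise reaches the input through the controller. A sketch with a hypothetical one-sample-delay plant y(k) = u(k-1) + e(k) and proportional feedback u(k) = -K y(k):

```python
import numpy as np

rng = np.random.default_rng(4)
N = 20000
e = rng.standard_normal(N)

# open loop: input independent of the noise -> cross-correlation ~ 0
u_ol = rng.standard_normal(N)

# closed loop, r = 0: u(k) = -K y(k), y(k) = u(k-1) + e(k)
K = 0.5
u_cl = np.zeros(N)
y = np.zeros(N)
for k in range(1, N):
    y[k] = u_cl[k - 1] + e[k]
    u_cl[k] = -K * y[k]

def xcorr0(a, b):
    """Sample cross-correlation at lag 0."""
    return float(np.mean(a * b))

print(round(xcorr0(u_ol, e), 2))   # ≈ 0: B_theta vanishes, unbiased
print(round(xcorr0(u_cl, e), 2))   # ≈ -K: feedback correlates u with e
```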
RLS for linear regressions
Linear regression ((L-)ARX): ŷ(k|θ) = φ(k)^T θ.
Recursive solution:
θ̂_k = θ̂_{k-1} + R̄(k)^-1 φ(k) ε(k, θ̂_{k-1}),
R̄(k) = λ R̄(k-1) + φ(k) φ(k)^T,
V_k(θ̂_k) = λ V_{k-1}(θ̂_{k-1}) + ε(k, θ̂_{k-1}) ε(k, θ̂_k).
Asymptotics: as λ → 1 and k → ∞, sqrt(k) (θ̂_k - θ0) is asymptotically N(0, Pθ) with Pθ = γ0 [E φ(k)φ(k)^T]^-1; with forgetting, Σθ = (1 - λ) Pθ.
Finite-sample estimates: γ̂_k = (1 - λ) V_k(θ̂_k), R̂_k = (1 - λ) R̄(k), Σ̂_k = (1 - λ) V_k(θ̂_k) R̄(k)^-1.
Remarks: closed-form recursive solution; requires invertibility of R̄(k); a QR-RLS solution improves conditioning.
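The recursion above can be sketched directly in the information-matrix form. This is a minimal implementation; the regularization δ of R̄(0) and the first-order ARX test system are assumptions for illustration, and a production version would use the QR-RLS form mentioned in the remarks:

```python
import numpy as np

def rls(phi, y, lam=0.99, delta=1e-3):
    """RLS with forgetting factor lam, information-matrix form:
        Rbar(k)  = lam*Rbar(k-1) + phi(k) phi(k)^T
        theta(k) = theta(k-1) + Rbar(k)^{-1} phi(k) eps(k)
    delta regularizes Rbar(0) so the recursion is well defined."""
    n = phi.shape[1]
    theta = np.zeros(n)
    Rbar = delta * np.eye(n)
    for k in range(len(y)):
        p = phi[k]
        Rbar = lam * Rbar + np.outer(p, p)
        eps = y[k] - p @ theta                     # error with old estimate
        theta = theta + np.linalg.solve(Rbar, p * eps)
    return theta

# hypothetical ARX(1,1) data: y(k) = 0.7 y(k-1) + 0.5 u(k-1) + e(k)
rng = np.random.default_rng(2)
N = 5000
u = rng.standard_normal(N)
e = 0.05 * rng.standard_normal(N)
y = np.zeros(N)
for k in range(1, N):
    y[k] = 0.7 * y[k - 1] + 0.5 * u[k - 1] + e[k]

phi = np.column_stack([y[:-1], u[:-1]])            # phi(k) = [y(k-1), u(k-1)]
theta = rls(phi, y[1:])
print(np.round(theta, 2))  # ≈ [0.7, 0.5]
```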
Tests
T0: normalize the data.
T1: detect step changes in u(k) or r(k), triggered when ζ(k) > η1: that is how the plant is operated!
T2: variability in y(k): the output should vary after a step; monitor the variance of y(k), γy(k) > η2.
T3: conditioning of the information matrix: check that R̄(k) is invertible, κ^-1(R̄(k)) = σmin(R̄(k)) / σmax(R̄(k)) > η3.
T4: Granger causality test: can u(k) help predict y(k)? With ŷ(k|θ) = φu(k)^T θ_b + φy(k)^T θ_a, test H0: θ_b = 0 using s(k) = (θ̂_b)^T (Σ̂_b)^-1 θ̂_b, which is asymptotically χ²_nb under H0; s(k) > η4 rejects H0, and s(k) is also used as a quality measure!
T5: logical conditions: T_{i+1} is only computed if T_i passed; exit if any test fails; accept the interval if T4 passed; the interval goes from the T1 trigger to the exit.
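The Granger-type test T4 can be sketched as a Wald test on the input parameters of a least-squares fit. The orders, significance level, and test signals are hypothetical, and batch least squares stands in here for the recursive estimates of the slides:

```python
import numpy as np
from scipy.stats import chi2

def granger_wald(u, y, nb=2, na=2, level=0.05):
    """Wald test of H0: theta_b = 0 in
    yhat(k) = phi_u(k)^T theta_b + phi_y(k)^T theta_a.
    Rejecting H0 means past u helps predict y."""
    n = max(na, nb)
    N = len(y)
    Phi = np.column_stack(
        [u[n - i:N - i] for i in range(1, nb + 1)]
        + [y[n - i:N - i] for i in range(1, na + 1)])
    Y = y[n:]
    theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    eps = Y - Phi @ theta
    gamma = eps @ eps / (len(Y) - Phi.shape[1])      # noise variance estimate
    Sigma = gamma * np.linalg.inv(Phi.T @ Phi)       # parameter covariance
    tb, Sb = theta[:nb], Sigma[:nb, :nb]
    s = tb @ np.linalg.solve(Sb, tb)                 # ~ chi2(nb) under H0
    return s, s > chi2.ppf(1 - level, df=nb)

rng = np.random.default_rng(3)
u = rng.standard_normal(3000)
e = 0.1 * rng.standard_normal(3000)
y = np.zeros(3000)
for k in range(1, 3000):
    y[k] = 0.5 * y[k - 1] + 0.4 * u[k - 1] + e[k]

s, causal = granger_wald(u, y)
print(causal)          # u does help predict y here

y_noise = rng.standard_normal(3000)   # output unrelated to u
s2, causal2 = granger_wald(u, y_noise)
print(causal2)
```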
FIR example, Granger causality test
y(k) = B10(q) (u(k) + d(k)) + e(k), with SNRu = 10 and SNRd = 30; the input is active only for k > 1000, while the disturbance d(k) is nonzero for 500 < k < 1000 and 1500 < k < 2000.
Remarks: the statistic checks the statistical significance of (any) parameter, so it fails as a causality test, but it works well to select data!
Outline of the algorithm
[Figure: timeline from k0 to ke. A mode change m(k) or a step trigger ζ(k) > η1 (event E0, after a wait η0) opens a candidate interval; the tests then run in sequence over sub-intervals I2 to I4: output variance γy(k) > η2, conditioning κ^-1(R̄(k)) > η3, Granger statistic s(k) > η4; events E2 to E5 mark the possible exits.]
Mining data from an entire plant
Plant: 195 control loops, 37 months of data, 1.15G samples.
Evaluation: 170 minutes to run; selects 1.46% of all samples; finds all bump tests; every test is important; the quality measure supports further analysis.
[Table: count (%) and average interval length per mode of operation (open or closed) and loop type: density, flow, level, pressure, temperature.]
[Figure: two example accepted intervals, a level loop in open loop and a density loop in closed loop, showing r, u and y against k.]
Summary
Requisites: range of the variables (normalization); mode and operation type (change in u or r); integrating plant (finite-gain models); guess of the largest delay (tuning); guess of the largest time constant (tuning). In total, 7 tuning variables and 5 thresholds for the entire plant.
Extensions: Kautz polynomials (complex poles); finding the topology; the MIMO case.
Related article: "An Algorithm for Finding Process Identification Intervals from Normal Operating Data", Processes 2015, 3, 357-383; doi:10.3390/pr3020357.
RECURSIVE SUBSPACE IDENTIFICATION IN THE LEAST SQUARES FRAMEWORK TRNKA PAVEL AND HAVLENA VLADIMÍR Dept of Control Engineering, Czech Technical University, Technická 2, 166 27 Praha, Czech Republic mail:
More informationLecture 7: Discrete-time Models. Modeling of Physical Systems. Preprocessing Experimental Data.
ISS0031 Modeling and Identification Lecture 7: Discrete-time Models. Modeling of Physical Systems. Preprocessing Experimental Data. Aleksei Tepljakov, Ph.D. October 21, 2015 Discrete-time Transfer Functions
More informationCover page. : On-line damage identication using model based orthonormal. functions. Author : Raymond A. de Callafon
Cover page Title : On-line damage identication using model based orthonormal functions Author : Raymond A. de Callafon ABSTRACT In this paper, a new on-line damage identication method is proposed for monitoring
More informationGranger Causality and Dynamic Structural Systems 1
Granger Causality and Dynamic Structural Systems 1 Halbert White and Xun Lu Department of Economics, University of California, San Diego December 10, 2009 1 forthcoming, Journal of Financial Econometrics
More informationModelling and Control of Dynamic Systems. Stability of Linear Systems. Sven Laur University of Tartu
Modelling and Control of Dynamic Systems Stability of Linear Systems Sven Laur University of Tartu Motivating Example Naive open-loop control r[k] Controller Ĉ[z] u[k] ε 1 [k] System Ĝ[z] y[k] ε 2 [k]
More informationOn Consistency of Closed-loop Subspace Identifictaion with Innovation Estimation
Technical report from Automatic Control at Linköpings universitet On Consistency of Closed-loop Subspace Identictaion with Innovation Estimation Weilu Lin, S Joe Qin, Lennart Ljung Division of Automatic
More informationOptimal control and estimation
Automatic Control 2 Optimal control and estimation Prof. Alberto Bemporad University of Trento Academic year 2010-2011 Prof. Alberto Bemporad (University of Trento) Automatic Control 2 Academic year 2010-2011
More informationPANEL DATA RANDOM AND FIXED EFFECTS MODEL. Professor Menelaos Karanasos. December Panel Data (Institute) PANEL DATA December / 1
PANEL DATA RANDOM AND FIXED EFFECTS MODEL Professor Menelaos Karanasos December 2011 PANEL DATA Notation y it is the value of the dependent variable for cross-section unit i at time t where i = 1,...,
More informationMAE143a: Signals & Systems (& Control) Final Exam (2011) solutions
MAE143a: Signals & Systems (& Control) Final Exam (2011) solutions Question 1. SIGNALS: Design of a noise-cancelling headphone system. 1a. Based on the low-pass filter given, design a high-pass filter,
More informationNEW DEVELOPMENTS IN PREDICTIVE CONTROL FOR NONLINEAR SYSTEMS
NEW DEVELOPMENTS IN PREDICTIVE CONTROL FOR NONLINEAR SYSTEMS M. J. Grimble, A. Ordys, A. Dutka, P. Majecki University of Strathclyde Glasgow Scotland, U.K Introduction Model Predictive Control (MPC) is
More informationSequential Change-Point Approach for Online Community Detection
Sequential Change-Point Approach for Online Community Detection Yao Xie Joint work with David Marangoni-Simonsen H. Milton Stewart School of Industrial and Systems Engineering Georgia Institute of Technology
More informationChapter 7 Interconnected Systems and Feedback: Well-Posedness, Stability, and Performance 7. Introduction Feedback control is a powerful approach to o
Lectures on Dynamic Systems and Control Mohammed Dahleh Munther A. Dahleh George Verghese Department of Electrical Engineering and Computer Science Massachuasetts Institute of Technology c Chapter 7 Interconnected
More informationRobust Adaptive Attitude Control of a Spacecraft
Robust Adaptive Attitude Control of a Spacecraft AER1503 Spacecraft Dynamics and Controls II April 24, 2015 Christopher Au Agenda Introduction Model Formulation Controller Designs Simulation Results 2
More informationLifted approach to ILC/Repetitive Control
Lifted approach to ILC/Repetitive Control Okko H. Bosgra Maarten Steinbuch TUD Delft Centre for Systems and Control TU/e Control System Technology Dutch Institute of Systems and Control DISC winter semester
More informationQuis custodiet ipsos custodes?
Quis custodiet ipsos custodes? James B. Rawlings, Megan Zagrobelny, Luo Ji Dept. of Chemical and Biological Engineering, Univ. of Wisconsin-Madison, WI, USA IFAC Conference on Nonlinear Model Predictive
More informationControl Design. Lecture 9: State Feedback and Observers. Two Classes of Control Problems. State Feedback: Problem Formulation
Lecture 9: State Feedback and s [IFAC PB Ch 9] State Feedback s Disturbance Estimation & Integral Action Control Design Many factors to consider, for example: Attenuation of load disturbances Reduction
More informationLecture Stat Information Criterion
Lecture Stat 461-561 Information Criterion Arnaud Doucet February 2008 Arnaud Doucet () February 2008 1 / 34 Review of Maximum Likelihood Approach We have data X i i.i.d. g (x). We model the distribution
More informationKernel-Based Contrast Functions for Sufficient Dimension Reduction
Kernel-Based Contrast Functions for Sufficient Dimension Reduction Michael I. Jordan Departments of Statistics and EECS University of California, Berkeley Joint work with Kenji Fukumizu and Francis Bach
More informationThe Basic New Keynesian Model. Jordi Galí. November 2010
The Basic New Keynesian Model by Jordi Galí November 2 Motivation and Outline Evidence on Money, Output, and Prices: Short Run E ects of Monetary Policy Shocks (i) persistent e ects on real variables (ii)
More informationExercises Chapter 4 Statistical Hypothesis Testing
Exercises Chapter 4 Statistical Hypothesis Testing Advanced Econometrics - HEC Lausanne Christophe Hurlin University of Orléans December 5, 013 Christophe Hurlin (University of Orléans) Advanced Econometrics
More informationEstimation of electromechanical modes in power systems using system identification techniques
Estimation of electromechanical modes in power systems using system identification techniques Vedran S. Peric, Luigi Vanfretti, X. Bombois E-mail: vperic@kth.se, luigiv@kth.se, xavier.bombois@ec-lyon.fr
More informationProcess Modelling, Identification, and Control
Jan Mikles Miroslav Fikar 2008 AGI-Information Management Consultants May be used for personal purporses only or by libraries associated to dandelon.com network. Process Modelling, Identification, and
More informationRegression Estimation - Least Squares and Maximum Likelihood. Dr. Frank Wood
Regression Estimation - Least Squares and Maximum Likelihood Dr. Frank Wood Least Squares Max(min)imization Function to minimize w.r.t. β 0, β 1 Q = n (Y i (β 0 + β 1 X i )) 2 i=1 Minimize this by maximizing
More informationSystem Identification: From Data to Model
1 : From Data to Model With Applications to Aircraft Modeling Division of Automatic Control Linköping University Sweden Dynamic Systems 2 A Dynamic system has an output response y that depends on (all)
More informationMoving Average (MA) representations
Moving Average (MA) representations The moving average representation of order M has the following form v[k] = MX c n e[k n]+e[k] (16) n=1 whose transfer function operator form is MX v[k] =H(q 1 )e[k],
More informationEnvironmental Econometrics
Environmental Econometrics Syngjoo Choi Fall 2008 Environmental Econometrics (GR03) Fall 2008 1 / 37 Syllabus I This is an introductory econometrics course which assumes no prior knowledge on econometrics;
More informationLecture 13 and 14: Bayesian estimation theory
1 Lecture 13 and 14: Bayesian estimation theory Spring 2012 - EE 194 Networked estimation and control (Prof. Khan) March 26 2012 I. BAYESIAN ESTIMATORS Mother Nature conducts a random experiment that generates
More informationECON 4160, Spring term Lecture 12
ECON 4160, Spring term 2013. Lecture 12 Non-stationarity and co-integration 2/2 Ragnar Nymoen Department of Economics 13 Nov 2013 1 / 53 Introduction I So far we have considered: Stationary VAR, with deterministic
More informationPERFORMANCE ANALYSIS OF CLOSED LOOP SYSTEM WITH A TAILOR MADE PARAMETERIZATION. Jianhong Wang, Hong Jiang and Yonghong Zhu
International Journal of Innovative Computing, Information and Control ICIC International c 208 ISSN 349-498 Volume 4, Number, February 208 pp. 8 96 PERFORMANCE ANALYSIS OF CLOSED LOOP SYSTEM WITH A TAILOR
More informationTesting Linear Restrictions: cont.
Testing Linear Restrictions: cont. The F-statistic is closely connected with the R of the regression. In fact, if we are testing q linear restriction, can write the F-stastic as F = (R u R r)=q ( R u)=(n
More informationBlow-up Analysis of a Fractional Liouville Equation in dimension 1. Francesca Da Lio
Blow-up Analysis of a Fractional Liouville Equation in dimension 1 Francesca Da Lio Outline Liouville Equation: Local Case Geometric Motivation Brezis & Merle Result Nonlocal Framework The Half-Laplacian
More information1 x(k +1)=(Φ LH) x(k) = T 1 x 2 (k) x1 (0) 1 T x 2(0) T x 1 (0) x 2 (0) x(1) = x(2) = x(3) =
567 This is often referred to as Þnite settling time or deadbeat design because the dynamics will settle in a Þnite number of sample periods. This estimator always drives the error to zero in time 2T or
More informationSystem Identification for MPC
System Identification for MPC Conflict of Conflux? B. Erik Ydstie, Carnegie Mellon University Course Objectives: 1. The McNamara Program for MPC 2. The Feldbaum Program for MPC 3. From Optimal Control
More informationAdaptive Rejection of Periodic Disturbances Acting on Linear Systems with Unknown Dynamics
206 IEEE 55th Conference on Decision and Control (CDC) ARIA Resort & Casino December 2-4, 206, Las Vegas, USA Adaptive Rejection of Periodic Disturbances Acting on Linear Systems with Unknown Dynamics
More informationREGLERTEKNIK AUTOMATIC CONTROL LINKÖPING
Time-domain Identication of Dynamic Errors-in-variables Systems Using Periodic Excitation Signals Urban Forssell, Fredrik Gustafsson, Tomas McKelvey Department of Electrical Engineering Linkping University,
More informationRozwiązanie zagadnienia odwrotnego wyznaczania sił obciąŝających konstrukcje w czasie eksploatacji
Rozwiązanie zagadnienia odwrotnego wyznaczania sił obciąŝających konstrukcje w czasie eksploatacji Tadeusz Uhl Piotr Czop Krzysztof Mendrok Faculty of Mechanical Engineering and Robotics Department of
More informationsc Control Systems Design Q.1, Sem.1, Ac. Yr. 2010/11
sc46 - Control Systems Design Q Sem Ac Yr / Mock Exam originally given November 5 9 Notes: Please be reminded that only an A4 paper with formulas may be used during the exam no other material is to be
More informationIdentification in closed-loop, MISO identification, practical issues of identification
Identification in closed-loop, MISO identification, practical issues of identification CHEM-E7145 Advanced Process Control Methods Lecture 4 Contents Identification in practice Identification in closed-loop
More information3.1 Overview 3.2 Process and control-loop interactions
3. Multivariable 3.1 Overview 3.2 and control-loop interactions 3.2.1 Interaction analysis 3.2.2 Closed-loop stability 3.3 Decoupling control 3.3.1 Basic design principle 3.3.2 Complete decoupling 3.3.3
More informationLECTURE 13: TIME SERIES I
1 LECTURE 13: TIME SERIES I AUTOCORRELATION: Consider y = X + u where y is T 1, X is T K, is K 1 and u is T 1. We are using T and not N for sample size to emphasize that this is a time series. The natural
More informationThe Case Against JIVE
The Case Against JIVE Related literature, Two comments and One reply PhD. student Freddy Rojas Cama Econometrics Theory II Rutgers University November 14th, 2011 Literature 1 Literature 2 Key de nitions
More informationControl Systems II. ETH, MAVT, IDSC, Lecture 4 17/03/2017. G. Ducard
Control Systems II ETH, MAVT, IDSC, Lecture 4 17/03/2017 Lecture plan: Control Systems II, IDSC, 2017 SISO Control Design 24.02 Lecture 1 Recalls, Introductory case study 03.03 Lecture 2 Cascaded Control
More informationi) the probability of type I error; ii) the 95% con dence interval; iii) the p value; iv) the probability of type II error; v) the power of a test.
Problem Set 5. Questions:. Exlain what is: i) the robability of tye I error; ii) the 95% con dence interval; iii) the value; iv) the robability of tye II error; v) the ower of a test.. Solve exercise 3.
More informationTime Series Analysis
Time Series Analysis hm@imm.dtu.dk Informatics and Mathematical Modelling Technical University of Denmark DK-2800 Kgs. Lyngby 1 Outline of the lecture Regression based methods, 1st part: Introduction (Sec.
More informationBooth School of Business, University of Chicago Business 41914, Spring Quarter 2013, Mr. Ruey S. Tsay. Midterm
Booth School of Business, University of Chicago Business 41914, Spring Quarter 2013, Mr. Ruey S. Tsay Midterm Chicago Booth Honor Code: I pledge my honor that I have not violated the Honor Code during
More informationAdaptive Robust Control
Adaptive Robust Control Adaptive control: modifies the control law to cope with the fact that the system and environment are uncertain. Robust control: sacrifices performance by guaranteeing stability
More information2. Multivariate ARMA
2. Multivariate ARMA JEM 140: Quantitative Multivariate Finance IES, Charles University, Prague Summer 2018 JEM 140 () 2. Multivariate ARMA Summer 2018 1 / 19 Multivariate AR I Let r t = (r 1t,..., r kt
More informationAdaptive Harmonic Steady State Control for Disturbance Rejection
Adaptive Harmonic Steady State Control for Disturbance Rejection J. Chandrasekar L. Liu D. Patt P. P. Friedmann and D. S. Bernstein I. INTRODUCTION The use of feedback control for disturbance rejection
More informationGMM - Generalized method of moments
GMM - Generalized method of moments GMM Intuition: Matching moments You want to estimate properties of a data set {x t } T t=1. You assume that x t has a constant mean and variance. x t (µ 0, σ 2 ) Consider
More informationAdvanced Econometrics
Based on the textbook by Verbeek: A Guide to Modern Econometrics Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna May 2, 2013 Outline Univariate
More informationHandling parametric and non-parametric additive faults in LTV Systems
1 / 16 Handling parametric and non-parametric additive faults in LTV Systems Qinghua Zhang & Michèle Basseville INRIA & CNRS-IRISA, Rennes, France 9th IFAC SAFEPROCESS, Paris, France, Sept. 2-4, 2015 2
More information