SET MEMBERSHIP ESTIMATION THEORY


1. SET MEMBERSHIP ESTIMATION THEORY

Michele TARAGNA
Dipartimento di Elettronica e Telecomunicazioni, Politecnico di Torino
Master Course in Mechatronic Engineering / Master Course in Computer Engineering
0RKYQW / 0RKYOV Estimation, Filtering and System Identification, Academic Year 07/08
2. Example: estimation of a resistance value

N voltage-current measurements are performed on a real resistor, assuming that:
- its static characteristic is linear;
- the device model is given by Ohm's law: $v_R = R\, i_R$;
- the measurements are corrupted by an unknown noise $e = [e_1, \ldots, e_N]^T$.

The following system of linear equations is derived:
$$v_{R,1} = R\, i_{R,1} + e_1$$
$$v_{R,2} = R\, i_{R,2} + e_2$$
$$\vdots$$
$$v_{R,N} = R\, i_{R,N} + e_N$$
3. In matrix terms:
$$\underbrace{\begin{bmatrix} v_{R,1} \\ v_{R,2} \\ \vdots \\ v_{R,N} \end{bmatrix}}_{y} = \underbrace{\begin{bmatrix} i_{R,1} \\ i_{R,2} \\ \vdots \\ i_{R,N} \end{bmatrix}}_{\Phi} \underbrace{[R]}_{\theta^o} + \underbrace{\begin{bmatrix} e_1 \\ e_2 \\ \vdots \\ e_N \end{bmatrix}}_{e}$$
which is in the standard form:
$$\underbrace{y}_{\text{known data}} = \underbrace{F(\theta^o)}_{\text{known function}} + \underbrace{e}_{\text{unknown noise}}$$
where $F(\theta^o) = \Phi\,\theta^o$ is a linear function of the unknown parameter $\theta^o$.

Goal: find an estimate $\hat R$ of $R$ by means of an estimation algorithm (estimator) $\psi$ applied to the data vector $y$:
$$\hat R = \psi(y)$$
4. Least squares estimation errors

$\theta^o$: true parameters that generated the data vector $y$. Due to measurement noise, $y = \Phi\theta^o + e \neq \Phi\theta^o$. Using the least squares algorithm as estimator:
$$\hat\theta = (\Phi^T\Phi)^{-1}\Phi^T y = (\Phi^T\Phi)^{-1}\Phi^T(\Phi\theta^o + e) = \underbrace{(\Phi^T\Phi)^{-1}\Phi^T\Phi}_{I}\,\theta^o + (\Phi^T\Phi)^{-1}\Phi^T e = \theta^o + (\Phi^T\Phi)^{-1}\Phi^T e$$
$$\Rightarrow\quad \hat\theta - \theta^o = (\Phi^T\Phi)^{-1}\Phi^T e = \text{estimation error}$$
$e$ is not exactly known, but different assumptions may be made on it:
- random variable → statistical estimation;
- componentwise bounded → Set Membership estimation;
- energy bounded → Set Membership estimation.
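As a minimal numerical sketch of the derivation above (all values are illustrative assumptions, not data from the slides), the resistance example can be simulated and the identity $\hat\theta - \theta^o = (\Phi^T\Phi)^{-1}\Phi^T e$ checked directly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical resistance example: R is the true parameter theta_o,
# Phi collects the injected currents, e is the unknown measurement noise.
R_true = 100.0                      # assumed true resistance [ohm]
i_R = np.linspace(0.01, 0.1, 10)    # injected currents [A]
Phi = i_R.reshape(-1, 1)            # regressor matrix (N x 1)
e = rng.uniform(-0.05, 0.05, 10)    # componentwise bounded noise, |e_i| <= 0.05 V
y = Phi @ np.array([R_true]) + e    # measured voltages

# Least squares estimate: theta_hat = (Phi^T Phi)^{-1} Phi^T y
theta_hat = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)

# The estimation error equals (Phi^T Phi)^{-1} Phi^T e, as derived above
err_direct = theta_hat - np.array([R_true])
err_formula = np.linalg.solve(Phi.T @ Phi, Phi.T @ e)
assert np.allclose(err_direct, err_formula)
```

The check confirms that the error of the least squares estimate is a fixed linear function of the noise realization, which is the starting point for bounding it once a bound on $e$ is assumed.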
5. Unknown But Bounded (UBB) errors

$e \in B_e$ = uncertainty set:
$$B_{e,\infty} = \left\{\tilde e \in \mathbb{R}^N : |\tilde e_i| \le \varepsilon,\; i = 1,\ldots,N\right\} = \left\{\tilde e \in \mathbb{R}^N : \|\tilde e\|_\infty = \max_{i=1,\ldots,N}|\tilde e_i| \le \varepsilon\right\}$$
$$B_{e,2} = \left\{\tilde e \in \mathbb{R}^N : \tilde e^T\tilde e = \sum_{i=1}^N \tilde e_i^2 \le \varepsilon^2\right\} = \left\{\tilde e \in \mathbb{R}^N : \|\tilde e\|_2 = \sqrt{\textstyle\sum_{i=1}^N \tilde e_i^2} \le \varepsilon\right\}$$

[Figure: the $\infty$-norm ball (a cube of side $2\varepsilon$) and the 2-norm ball (a sphere of radius $\varepsilon$) in $\mathbb{R}^3$.]

Assumption: $B_e$ is symmetric with respect to the origin of $\mathbb{R}^N$.
6. Problem: how to evaluate the uncertainty on $\hat\theta$ induced by the uncertainty set $B_e$?

$$A = (\Phi^T\Phi)^{-1}\Phi^T = \text{least squares operator}: \underbrace{\mathbb{R}^N}_{\text{measurement space}} \to \underbrace{\mathbb{R}^n}_{\text{parameter space}}$$
$$\hat\theta - \theta^o = (\Phi^T\Phi)^{-1}\Phi^T e = Ae \;\Rightarrow\; \theta^o = \hat\theta - Ae$$
$$EUS = \hat\theta - A[B_e] = Ay - A[B_e] = A[y - B_e] = \text{Estimate Uncertainty Set}$$

[Figure: the set $y - B_e$ in the measurement space $\mathbb{R}^N$ is mapped by $A = (\Phi^T\Phi)^{-1}\Phi^T$ into the $EUS$ in the parameter space $\mathbb{R}^n$.]

Note that $\theta^o \in EUS$ and that the distance between $\Phi\theta^o$ and $y$ is not greater than $\varepsilon$.
7. The $EUS$ volume gives an idea of the estimation quality and, in particular, the Estimate Uncertainty Intervals $EUI_j$, $j = 1,\ldots,n$, provide this measure:
$$EUI_j = \Big[\underbrace{\min_{\theta\in EUS}\theta_j}_{\hat\theta_j^m},\; \underbrace{\max_{\theta\in EUS}\theta_j}_{\hat\theta_j^M}\Big] = \big[\hat\theta_j^m, \hat\theta_j^M\big] \subset \mathbb{R}$$
- the range of the $j$-th component of the estimate is such that: $\hat\theta_j^m \le [\theta^o]_j \le \hat\theta_j^M$
- an upper bound on the estimation error of the $j$-th component is: $\big|\hat\theta_j - [\theta^o]_j\big| \le \big(\hat\theta_j^M - \hat\theta_j^m\big)/2$
- $\hat\theta$ is the symmetry center of $EUS$, because $EUS$ is the image of a symmetric set under a linear mapping.

[Figure: the $EUS$ in the $(\theta_1,\theta_2)$ plane, with the true parameter $\theta^o$, the estimate $\hat\theta$ and the intervals $EUI_1$, $EUI_2$.]
8. Evaluation of $EUS$ (componentwise bounded noise)

The uncertainty set is a cube in $\mathbb{R}^N$ centered in the origin:
$$B_{e,\infty} = \{\tilde e \in \mathbb{R}^N : |\tilde e_i| \le \varepsilon,\; i = 1,\ldots,N\}$$
Since $y = \Phi\theta^o + e$, the set of any possible measurement (called Measurement Uncertainty Set) is a cube in $\mathbb{R}^N$ whose symmetry center is the data vector $y$:
$$MUS = y - B_{e,\infty} = \{\tilde y \in \mathbb{R}^N : |\tilde y_i - y_i| \le \varepsilon,\; i = 1,\ldots,N\} \subset \mathbb{R}^N$$
The vertices of $MUS$ are denoted by $\bar y^k$, $k = 1,\ldots,2^N$.

Theorem:
$$EUS = A[MUS] = \mathrm{conv}\big\{A\bar y^k,\; k = 1,\ldots,2^N\big\} \subset \mathbb{R}^n$$
where the convex hull $\mathrm{conv}\{\theta^1,\ldots,\theta^p\}$ of the set $\{\theta^1,\ldots,\theta^p\}$ is the smallest convex polyhedron (polytope) containing $\theta^1,\ldots,\theta^p$.

[Figure: the cube $MUS = y - B_e$ with vertices $\bar y^1,\ldots,\bar y^8$ in the measurement space $\mathbb{R}^N$ is mapped by $A$ into the polytope $EUS = \mathrm{conv}\{A\bar y^k\}$ in the parameter space $\mathbb{R}^n$.]
9. Evaluation of $EUI_j$

Theorem:
$$EUI_j = \big[\hat\theta_j^m, \hat\theta_j^M\big] \subset \mathbb{R}$$
where
$$\hat\theta_j^m = \sum_{k=1}^N a_{jk}\big[y_k - \varepsilon\,\mathrm{sign}(a_{jk})\big],\qquad \hat\theta_j^M = 2\hat\theta_j - \hat\theta_j^m,\qquad A = [a_{jk}] = (\Phi^T\Phi)^{-1}\Phi^T,\qquad \hat\theta = [\hat\theta_j] = Ay$$

Proof:
$$\hat\theta_j^m = \min_{\theta\in EUS}\theta_j = \min_{\tilde y\in MUS}(A\tilde y)_j = \min_{\tilde y:\,|\tilde y_i - y_i|\le\varepsilon}\;\sum_{k=1}^N a_{jk}\tilde y_k = \min_{\tilde y:\,-\varepsilon\le\tilde y_i - y_i\le\varepsilon}\;\sum_{k=1}^N a_{jk}\tilde y_k = \min_{\tilde y:\,y_i-\varepsilon\le\tilde y_i\le y_i+\varepsilon}\;\sum_{k=1}^N a_{jk}\tilde y_k$$
and such a minimum is achieved by $\tilde y_k = y_k - \varepsilon$ if $a_{jk} > 0$, or by $\tilde y_k = y_k + \varepsilon$ if $a_{jk} < 0$. Since $MUS = y - B_e$ is symmetric with respect to the data vector $y$, then $EUS = A[MUS]$ is symmetric with respect to the estimate $\hat\theta = Ay$ and then:
$$\hat\theta_j = \big(\hat\theta_j^m + \hat\theta_j^M\big)/2,\; j = 1,\ldots,n \;\Rightarrow\; \hat\theta_j^M = 2\hat\theta_j - \hat\theta_j^m,\; j = 1,\ldots,n$$
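The closed-form extremes can be cross-checked against a brute-force image of all $2^N$ vertices of $MUS$. The data below are an assumed small example (N = 4, n = 2), not taken from the slides:

```python
import itertools
import numpy as np

# Assumed example data: 4 measurements, 2 parameters, noise bound eps
Phi = np.array([[1.0, 0.5], [1.0, 1.0], [1.0, 1.5], [1.0, 2.0]])
y = np.array([1.1, 1.9, 3.2, 3.9])
eps = 0.3

A = np.linalg.solve(Phi.T @ Phi, Phi.T)   # least squares operator (n x N)
theta_hat = A @ y

# Closed form: theta_j^m = sum_k a_jk * (y_k - eps * sign(a_jk))
#            = (A y)_j - eps * sum_k |a_jk|
theta_m = theta_hat - eps * np.sum(np.abs(A), axis=1)
theta_M = 2 * theta_hat - theta_m         # symmetry of EUS about theta_hat

# Brute force: map every vertex of MUS = y + B_e into parameter space
vertices = [y + eps * np.array(s)
            for s in itertools.product([-1, 1], repeat=len(y))]
images = np.array([A @ v for v in vertices])
assert np.allclose(images.min(axis=0), theta_m)
assert np.allclose(images.max(axis=0), theta_M)
```

The vertex enumeration costs $2^N$ evaluations, while the sign formula is linear in $N$, which is the point of the theorem.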
10. Description of ellipsoids

Let $\Omega_x$ be an ellipsoid in $\mathbb{R}^N$ centered in $x^o$:
$$\Omega_x = \left\{x \in \mathbb{R}^N : (x - x^o)^T\,\Sigma_x^{-1}\,(x - x^o) \le 1\right\}$$
- The form matrix $\Sigma_x \in \mathbb{R}^{N\times N}$ is symmetric and positive definite → it is invertible.
- The directions of the main axes of $\Omega_x$ are given by the eigenvectors $u_i$ of $\Sigma_x$, which are orthogonal because $\Sigma_x$ is symmetric.
- The lengths of the semiaxes of $\Omega_x$ are given by $\sqrt{\lambda_i(\Sigma_x)}$, where $\lambda_i(\Sigma_x)$ is the $i$-th eigenvalue of $\Sigma_x$.

[Figure: an ellipsoid in $\mathbb{R}^2$ centered in $x^o$, with the main axes along the eigenvectors $u_1$, $u_2$.]
11. Linear transformation of ellipsoids

Let $\Omega_x$ be an ellipsoid in $\mathbb{R}^N$ centered in $x^o$:
$$\Omega_x = \left\{x \in \mathbb{R}^N : (x - x^o)^T\,\Sigma_x^{-1}\,(x - x^o) \le \varepsilon^2\right\}$$
and consider the linear transformation:
$$z = Px \in \mathbb{R}^n,\quad \text{with } P \in \mathbb{R}^{n\times N},\; n < N$$
Theorem: if $\mathrm{rank}(P) = n$, then
$$\Omega_z = P[\Omega_x] = \left\{z \in \mathbb{R}^n : (z - z^o)^T\,\Sigma_z^{-1}\,(z - z^o) \le \varepsilon^2\right\}$$
$$z^o = Px^o \in \mathbb{R}^n,\qquad \Sigma_z = P\,\Sigma_x\,P^T \in \mathbb{R}^{n\times n}$$

[Figure: the ellipsoid $\Omega_x$ in the space $\mathbb{R}^N$ is mapped by $P$ into the ellipsoid $\Omega_z = P[\Omega_x]$ in the space $\mathbb{R}^n$, $n < N$.]
12. Evaluation of $EUS$ (energy bounded noise)

The uncertainty set is a sphere in $\mathbb{R}^N$ centered in the origin:
$$B_{e,2} = \{\tilde e \in \mathbb{R}^N : \tilde e^T\tilde e \le \varepsilon^2\}$$
Since $y = \Phi\theta^o + e$, the set of any possible measurement (called Measurement Uncertainty Set) is a sphere in $\mathbb{R}^N$ whose symmetry center is the data vector $y$:
$$MUS = y - B_{e,2} = \left\{\tilde y \in \mathbb{R}^N : (\tilde y - y)^T(\tilde y - y) \le \varepsilon^2\right\} \subset \mathbb{R}^N$$
Theorem:
$$EUS = A\big[y - B_{e,2}\big] = \left\{\tilde\theta \in \mathbb{R}^n : (\tilde\theta - \hat\theta)^T\,\Phi^T\Phi\,(\tilde\theta - \hat\theta) \le \varepsilon^2\right\} \subset \mathbb{R}^n$$
i.e., $EUS$ is an ellipsoid in $\mathbb{R}^n$ with $\hat\theta = Ay$ as symmetry center and $(\Phi^T\Phi)^{-1}$ as form matrix.

[Figure: the sphere $MUS = y - B_e$ of radius $\varepsilon$ in the measurement space $\mathbb{R}^N$ is mapped by $A = (\Phi^T\Phi)^{-1}\Phi^T$ into the ellipsoidal $EUS$ in the parameter space $\mathbb{R}^n$, with the intervals $EUI_1$, $EUI_2$.]
13. Theorem:
$$EUS = A\big[y - B_{e,2}\big] = \left\{\tilde\theta \in \mathbb{R}^n : (\tilde\theta - \hat\theta)^T\,\Phi^T\Phi\,(\tilde\theta - \hat\theta) \le \varepsilon^2\right\} \subset \mathbb{R}^n$$
is an ellipsoid in $\mathbb{R}^n$ with $\hat\theta = Ay$ as symmetry center and $(\Phi^T\Phi)^{-1}$ as form matrix.

Proof: by definition, $EUS$ is the linear mapping of $MUS = y - B_{e,2}$ through the matrix $A$:
$$EUS = A\big[y - B_{e,2}\big] = \left\{\tilde\theta \in \mathbb{R}^n : (\tilde\theta - Ay)^T\big[AA^T\big]^{-1}(\tilde\theta - Ay) \le \varepsilon^2\right\}$$
But $Ay = \hat\theta$, $A = (\Phi^T\Phi)^{-1}\Phi^T$ and then:
$$AA^T = (\Phi^T\Phi)^{-1}\Phi^T\big[(\Phi^T\Phi)^{-1}\Phi^T\big]^T = (\Phi^T\Phi)^{-1}\Phi^T\,\Phi\big[(\Phi^T\Phi)^{-1}\big]^T = \underbrace{(\Phi^T\Phi)^{-1}\Phi^T\Phi}_{I}\big[(\Phi^T\Phi)^{-1}\big]^T = (\Phi^T\Phi)^{-1}$$
so $\big[AA^T\big]^{-1} = \Phi^T\Phi$.
14. Evaluation of $EUI_j$ (energy bounded noise)

Theorem:
$$EUI_j = \Big[\underbrace{\hat\theta_j - \varepsilon\,\sigma_j}_{\hat\theta_j^m},\; \underbrace{\hat\theta_j + \varepsilon\,\sigma_j}_{\hat\theta_j^M}\Big] = \big[\hat\theta_j^m, \hat\theta_j^M\big] \subset \mathbb{R},\qquad \sigma_j = \sqrt{\big[(\Phi^T\Phi)^{-1}\big]_{jj}}$$

[Figure: the ellipsoidal $EUS$ with the true parameter $\theta^o$ and the intervals $EUI_1$, $EUI_2$.]
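The interval formula can be verified numerically by sampling the boundary of the ellipsoidal $EUS$ and checking that no point leaves the box of the $EUI_j$, while the sampled extremes approach the closed-form bounds. The data are the same assumed toy example used before:

```python
import numpy as np

rng = np.random.default_rng(1)
Phi = np.array([[1.0, 0.5], [1.0, 1.0], [1.0, 1.5], [1.0, 2.0]])
y = np.array([1.1, 1.9, 3.2, 3.9])
eps = 0.3

G = Phi.T @ Phi
theta_hat = np.linalg.solve(G, Phi.T @ y)
sigma = np.sqrt(np.diag(np.linalg.inv(G)))   # sigma_j = sqrt([(Phi^T Phi)^{-1}]_jj)
theta_m = theta_hat - eps * sigma
theta_M = theta_hat + eps * sigma

# Boundary of EUS: theta = theta_hat + eps * L^{-T} u with ||u|| = 1 and
# G = L L^T (Cholesky), so (theta-theta_hat)^T G (theta-theta_hat) = eps^2
L = np.linalg.cholesky(G)
u = rng.normal(size=(20000, 2))
u /= np.linalg.norm(u, axis=1, keepdims=True)
pts = theta_hat + eps * np.linalg.solve(L.T, u.T).T

# Every boundary point respects the EUI_j bounds, and the sampled maximum
# of the first component gets close to theta_hat_1 + eps * sigma_1
assert np.all(pts <= theta_M + 1e-9) and np.all(pts >= theta_m - 1e-9)
assert np.max(pts[:, 0]) > theta_hat[0] + 0.99 * eps * sigma[0]
```

The Cholesky parameterization is a standard way to generate points exactly on a quadratic-form level set; the `0.99` tolerance only accounts for the finite angular resolution of the random directions.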
15. Optimal (with minimal uncertainty) estimates

- Is the $EUS$ the smallest set containing the true parameter $\theta^o$?
- Are the $EUI_j$ the smallest possible uncertainty intervals?
- Does the LS estimator provide the minimal uncertainty intervals?

To answer all these questions, it is necessary to analyze the set of all the parameters that are consistent with both the data and the available information on noise.

Definition: a parameter $\tilde\theta$ is said to be feasible (or consistent) if $(y - \Phi\tilde\theta) \in B_e$.
$$FPS = \left\{\tilde\theta \in \mathbb{R}^n : (y - \Phi\tilde\theta) \in B_e\right\} = \text{Feasible Parameter Set} =$$
set of all the parameters consistent with both the data and the information on noise and on the estimation problem.

- $FPS$ is independent of the estimation algorithm.
- If data are generated by the true parameter $\theta^o$, then $\theta^o$ is feasible; in fact:
$$y = \Phi\theta^o + e,\; e \in B_e \;\Rightarrow\; y - \Phi\theta^o = e \in B_e \;\Rightarrow\; \theta^o \in FPS$$
16. Relationship between FPS and EUS

Theorem: $FPS \subseteq EUS$

Proof: if $\tilde\theta \in FPS$, then
$$(y - \Phi\tilde\theta) \in B_e \;\Rightarrow\; \Phi\tilde\theta \in y - B_e \;\Rightarrow\; A\big[\Phi\tilde\theta\big] \in A[y - B_e] = EUS$$
But $A\big[\Phi\tilde\theta\big] = (\Phi^T\Phi)^{-1}\Phi^T\Phi\,\tilde\theta = \tilde\theta$ and then $\tilde\theta \in EUS$.

The Parameter Uncertainty Intervals $PUI_j$, $j = 1,\ldots,n$, are defined as:
$$PUI_j = \Big[\underbrace{\min_{\tilde\theta\in FPS}\tilde\theta_j}_{\theta_j^m},\; \underbrace{\max_{\tilde\theta\in FPS}\tilde\theta_j}_{\theta_j^M}\Big] = \big[\theta_j^m, \theta_j^M\big] \subset \mathbb{R}$$
From the above theorem:
$$PUI_j \subseteq EUI_j,\; j = 1,\ldots,n \;\Rightarrow\; \hat\theta_j^m \le \theta_j^m \le [\theta^o]_j \le \theta_j^M \le \hat\theta_j^M$$
17. Evaluation of $FPS$ and $PUI_j$ (componentwise bounded noise)

If $\tilde\theta \in FPS$, then $(y - \Phi\tilde\theta) \in B_{e,\infty} = \{\tilde e \in \mathbb{R}^N : |\tilde e_i| \le \varepsilon,\; i = 1,\ldots,N\}$, i.e., $\big|(y - \Phi\tilde\theta)_i\big| = \big|y_i - \varphi_i^T\tilde\theta\big| \le \varepsilon$, where $\varphi_i^T$ is the $i$-th row of $\Phi$:
$$FPS = \left\{\tilde\theta \in \mathbb{R}^n : \big|y_i - \varphi_i^T\tilde\theta\big| \le \varepsilon,\; i = 1,\ldots,N\right\}$$
i.e., $FPS$ is a polytope (a convex polyhedron) generated by linear inequalities:
$$\big|y_i - \varphi_i^T\tilde\theta\big| \le \varepsilon \;\Leftrightarrow\; -\varepsilon \le y_i - \varphi_i^T\tilde\theta \le \varepsilon \;\Leftrightarrow\; y_i - \varepsilon \le \varphi_i^T\tilde\theta \le y_i + \varepsilon$$
Moreover, $PUI_j = \Big[\min_{\tilde\theta\in FPS}\tilde\theta_j,\; \max_{\tilde\theta\in FPS}\tilde\theta_j\Big] = \big[\theta_j^m, \theta_j^M\big] \subset \mathbb{R}$, with $\theta_j^m$ and $\theta_j^M$ solutions of linear programming problems of the standard form:
$$\min_x c^T x \quad \text{with the constraint: } Ax \le b$$
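These linear programs are directly solvable with an off-the-shelf LP solver. The sketch below (same assumed toy data as before) computes the $PUI_j$ with `scipy.optimize.linprog` and confirms the inclusion $PUI_j \subseteq EUI_j$:

```python
import numpy as np
from scipy.optimize import linprog

Phi = np.array([[1.0, 0.5], [1.0, 1.0], [1.0, 1.5], [1.0, 2.0]])
y = np.array([1.1, 1.9, 3.2, 3.9])
eps = 0.3
N, n = Phi.shape

# FPS as linear inequalities: Phi theta <= y + eps and -Phi theta <= -(y - eps)
A_ub = np.vstack([Phi, -Phi])
b_ub = np.concatenate([y + eps, -(y - eps)])

PUI = []
for j in range(n):
    c = np.zeros(n); c[j] = 1.0
    lo = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * n)
    hi = linprog(-c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * n)
    PUI.append((lo.x[j], hi.x[j]))

# EUI_j from the sign formula, for comparison: PUI_j must lie inside EUI_j
A = np.linalg.solve(Phi.T @ Phi, Phi.T)
theta_hat = A @ y
eui_m = theta_hat - eps * np.sum(np.abs(A), axis=1)
eui_M = 2 * theta_hat - eui_m
for j, (m, M) in enumerate(PUI):
    assert eui_m[j] - 1e-7 <= m <= M <= eui_M[j] + 1e-7
```

Each extreme costs one LP in $n$ variables with $2N$ constraints, so the exact $PUI_j$ are cheap even for fairly large data sets.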
18. Evaluation of $FPS$ and $PUI_j$ (energy bounded noise)

Theorem:
$$FPS = \left\{\tilde\theta \in \mathbb{R}^n : (\tilde\theta - \hat\theta)^T\,\Phi^T\Phi\,(\tilde\theta - \hat\theta) \le \varepsilon^2 - \alpha\right\}$$
$$\alpha = (y - \Phi\hat\theta)^T(y - \Phi\hat\theta) = \|y - \Phi\hat\theta\|_2^2 \le \varepsilon^2 = \text{squared fitting error between measured and estimated outputs}$$
→ a greater fitting error ⇒ a smaller $FPS$ ⇒ a lower uncertainty on parameters.

Moreover,
$$PUI_j = \Big[\underbrace{\hat\theta_j - \sigma_j\sqrt{\varepsilon^2 - \alpha}}_{\theta_j^m},\; \underbrace{\hat\theta_j + \sigma_j\sqrt{\varepsilon^2 - \alpha}}_{\theta_j^M}\Big] = \big[\theta_j^m, \theta_j^M\big] \subset \mathbb{R},\qquad \sigma_j = \sqrt{\big[(\Phi^T\Phi)^{-1}\big]_{jj}}$$

[Figure: the ellipsoidal $FPS$ inside the $EUS$, with the true parameter $\theta^o$ and the intervals $PUI_j \subseteq EUI_j$.]
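In the energy-bounded case the $PUI_j$ are available in closed form, shrinking the $EUI_j$ half-widths by the factor $\sqrt{\varepsilon^2-\alpha}/\varepsilon$. A short check on the same assumed toy data:

```python
import numpy as np

Phi = np.array([[1.0, 0.5], [1.0, 1.0], [1.0, 1.5], [1.0, 2.0]])
y = np.array([1.1, 1.9, 3.2, 3.9])
eps = 0.3

G = Phi.T @ Phi
theta_hat = np.linalg.solve(G, Phi.T @ y)
res = y - Phi @ theta_hat
alpha = float(res @ res)          # squared fitting error of the LS estimate
assert alpha <= eps**2            # otherwise the FPS would be empty

sigma = np.sqrt(np.diag(np.linalg.inv(G)))
pui_half = sigma * np.sqrt(eps**2 - alpha)   # PUI_j half-width
eui_half = sigma * eps                        # EUI_j half-width
assert np.all(pui_half <= eui_half)           # PUI_j is contained in EUI_j
```

The assertion `alpha <= eps**2` mirrors the consistency condition of the theorem: a residual larger than the assumed noise energy would mean no parameter is feasible.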
19. Optimal estimates

Definition: given an estimate $\hat\theta$, the estimation error $E(\hat\theta)$ is given by:
$$E(\hat\theta) = \sup_{\tilde\theta\in FPS}\big\|\tilde\theta - \hat\theta\big\|$$
Definition: an estimate $\hat\theta^{opt}$ is optimal if:
$$E(\hat\theta^{opt}) \le E(\hat\theta),\quad \forall\hat\theta \in \mathbb{R}^n$$
Central estimate: $\hat\theta^C = [\hat\theta_j^C]$, where $\hat\theta_j^C = (\theta_j^m + \theta_j^M)/2$, $j = 1,\ldots,n$.
- The central estimate is optimal both if $B_e = B_{e,\infty}$ and if $B_e = B_{e,2}$, since:
$$\big|[\theta^o]_j - \hat\theta_j^C\big| \le (\theta_j^M - \theta_j^m)/2,\quad j = 1,\ldots,n$$
- If $B_e = B_{e,2}$, the least squares estimate $\hat\theta^{LS} = (\Phi^T\Phi)^{-1}\Phi^T y$ is central → $\hat\theta^{LS}$ is optimal if $B_e = B_{e,2}$, but in general it is not optimal if $B_e = B_{e,\infty}$.
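Under componentwise bounded noise the central estimate must be built from the $PUI_j$ extremes. The sketch below (assumed toy data, LPs as in the previous slide) computes it and checks that its componentwise worst-case error over the $PUI_j$ never exceeds that of the least squares estimate, which is the sense in which LS can be suboptimal here:

```python
import numpy as np
from scipy.optimize import linprog

Phi = np.array([[1.0, 0.5], [1.0, 1.0], [1.0, 1.5], [1.0, 2.0]])
y = np.array([1.1, 1.9, 3.2, 3.9])
eps = 0.3
n = Phi.shape[1]
A_ub = np.vstack([Phi, -Phi])
b_ub = np.concatenate([y + eps, -(y - eps)])

# Extremes of the PUI_j on the FPS polytope, then the central estimate
mv, Mv = np.empty(n), np.empty(n)
for j in range(n):
    c = np.zeros(n); c[j] = 1.0
    mv[j] = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * n).fun
    Mv[j] = -linprog(-c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * n).fun
theta_C = 0.5 * (mv + Mv)

theta_LS = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)

# The midpoint of [m_j, M_j] minimises the worst-case distance to the
# interval, so the central estimate is never worse than least squares
wc_C = 0.5 * (Mv - mv)
wc_LS = np.maximum(Mv - theta_LS, theta_LS - mv)
assert np.all(wc_C <= wc_LS + 1e-9)
```

When the LS estimate happens to sit at the midpoint of every $PUI_j$ the two worst-case errors coincide; otherwise the central estimate is strictly better on at least one component.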
20. Example: parametric estimation of a position transducer model

[Figure: measured static characteristic of the position transducer, voltage $V_z$ in V versus position $z$ in m.]

The static characteristic of the position-voltage transducer is nearly linear in the range between .3 and 3.5 cm → the characteristic can be linearly approximated by:
$$V_z = K_t\,z + V_o$$
21. In the linearity interval between .3 and 3.5 cm:
$$V_z = \underbrace{K_t}_{\text{unknown}} z + \underbrace{V_o}_{\text{unknown}}$$
The most relevant error occurs in the position $z$ measurement and it is not greater than 0.5 mm → to account for this error, the model equation can be rewritten as:
$$z = \frac{1}{K_t}\,V_z - \frac{V_o}{K_t} + e$$
where the unknown parameters are:
$$\theta_1 = \frac{1}{K_t},\qquad \theta_2 = -\frac{V_o}{K_t}$$
The $N$ measurements taken in the linearity interval form a system of equations:
$$z_1 = V_{z,1}\,\theta_1 + \theta_2 + e_1$$
$$z_2 = V_{z,2}\,\theta_1 + \theta_2 + e_2$$
$$\vdots$$
$$z_N = V_{z,N}\,\theta_1 + \theta_2 + e_N$$
$V_{z,i}$: voltage provided by the transducer when the position value is $z_i$.
22. In matrix form:
$$\begin{bmatrix} z_1 \\ z_2 \\ \vdots \\ z_N \end{bmatrix} = \begin{bmatrix} V_{z,1} & 1 \\ V_{z,2} & 1 \\ \vdots & \vdots \\ V_{z,N} & 1 \end{bmatrix}\begin{bmatrix} \theta_1 \\ \theta_2 \end{bmatrix} + \begin{bmatrix} e_1 \\ e_2 \\ \vdots \\ e_N \end{bmatrix}$$
i.e., the estimation problem is in the standard form:
$$y = \Phi\,\theta + e$$
where $y \in \mathbb{R}^N$, $\Phi \in \mathbb{R}^{N\times 2}$, $e \in \mathbb{R}^N$ and the unknown is $\theta \in \mathbb{R}^2$.

Using the Least Squares estimation algorithm:
$$\hat\theta = A\,y,\quad \text{with } A = (\Phi^T\Phi)^{-1}\Phi^T$$
$$\hat\theta = \begin{bmatrix} \hat\theta_1 \\ \hat\theta_2 \end{bmatrix} \;\Rightarrow\; \hat K_t = 1/\hat\theta_1 = 549.6\ \text{V/m},\qquad \hat V_o = -\hat\theta_2/\hat\theta_1$$
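The slide's numerical results depend on the original measurement set, which is not reproduced here; the sketch below regenerates the whole pipeline on hypothetical data (assumed $K_t = 550$ V/m, $V_o = -2.5$ V, positions in an assumed 2.3 to 3.5 cm interval, noise bound 0.5 mm as in the slides):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical transducer: V_z = K_t*z + V_o in the linearity interval,
# inverted as z = theta_1*V_z + theta_2 with theta_1 = 1/K_t, theta_2 = -V_o/K_t
K_t_true, V_o_true = 550.0, -2.5          # assumed values [V/m], [V]
z = np.linspace(0.023, 0.035, 20)          # assumed positions [m]
V_z = K_t_true * z + V_o_true              # transducer voltages [V]
e = rng.uniform(-5e-4, 5e-4, z.size)       # position error, |e_i| <= 0.5 mm
y = z + e                                  # measured positions

Phi = np.column_stack([V_z, np.ones_like(V_z)])
theta_hat = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)

# Back-transform the estimated parameters into the physical quantities
K_t_hat = 1.0 / theta_hat[0]
V_o_hat = -theta_hat[1] / theta_hat[0]
print(f"K_t ~ {K_t_hat:.1f} V/m, V_o ~ {V_o_hat:.3f} V")
```

With a 0.5 mm position error over a roughly 1.2 cm span, the recovered $K_t$ and $V_o$ stay within a few percent of the assumed true values, matching the order of uncertainty the slides report.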
23. Evaluation of the Estimate Uncertainty Intervals $EUI_j$

$$e \in B_{e,\infty} = \{\tilde e \in \mathbb{R}^N : |\tilde e_i| \le \varepsilon,\; i = 1,\ldots,N\},\qquad \varepsilon = 0.5\ \text{mm}$$
$$EUI_j = \Big[\hat\theta_j^m = \min_{\theta\in EUS}\theta_j,\; \hat\theta_j^M = \max_{\theta\in EUS}\theta_j\Big],\quad j = 1,2$$
$$\hat\theta_j^m = \sum_{k=1}^N a_{jk}\big[y_k - \varepsilon\,\mathrm{sign}(a_{jk})\big],\qquad \hat\theta_j^M = \sum_{k=1}^N a_{jk}\big[y_k + \varepsilon\,\mathrm{sign}(a_{jk})\big] = 2\hat\theta_j - \hat\theta_j^m$$
The physical quantities are recovered from the extremes of the intervals:
$$\big[\hat K_t^m, \hat K_t^M\big] = \big[1/\hat\theta_1^M,\; 1/\hat\theta_1^m\big]\ \text{V/m},\qquad \big[\hat V_o^m, \hat V_o^M\big] = \big[-\hat\theta_2^M/\hat\theta_1^m,\; -\hat\theta_2^m/\hat\theta_1^M\big]\ \text{V}$$
24. Envelope of the static characteristics of the models whose parameters $\theta$ are taken as the extremes of the Estimate Uncertainty Intervals $EUI_j$, $j = 1,2$.

[Figure: experimental data $(z, V_z)$, the least squares characteristic $z = \hat\theta_1 V_z + \hat\theta_2$, and the envelope curves $z_{min} = \min(\hat\theta_1^M V_z,\, \hat\theta_1^m V_z) + \hat\theta_2^m$ and $z_{max} = \max(\hat\theta_1^M V_z,\, \hat\theta_1^m V_z) + \hat\theta_2^M$ over the linearity interval; voltage $V_z$ in V versus position $z$ in m.]
25. Evaluation of the Parameter Uncertainty Intervals $PUI_j$

$$FPS = \left\{\tilde\theta \in \mathbb{R}^{\dim(\tilde\theta)} : \big|y_i - [\Phi\tilde\theta]_i\big| \le \varepsilon,\; i = 1,\ldots,N\right\}$$
$$PUI_j = \Big[\min_{\tilde\theta\in FPS}\tilde\theta_j,\; \max_{\tilde\theta\in FPS}\tilde\theta_j\Big] \subseteq EUI_j,\quad j = 1,2$$
The extremes of $PUI_j$, $j = 1,2$, are solutions of the linear programming problems:
$$\min_{\tilde\theta\in FPS}\tilde\theta_j = \min_{M\tilde\theta\le b} c^T\tilde\theta,\qquad \max_{\tilde\theta\in FPS}\tilde\theta_j = -\min_{M\tilde\theta\le b} (-c)^T\tilde\theta$$
$$M = \begin{bmatrix} \Phi \\ -\Phi \end{bmatrix},\qquad b = \begin{bmatrix} y + \varepsilon\mathbf{1} \\ -y + \varepsilon\mathbf{1} \end{bmatrix},\qquad c = j\text{-th column of } I$$
$$\theta_1^m = \min_{\tilde\theta\in FPS}\tilde\theta_1,\quad \theta_1^M = \max_{\tilde\theta\in FPS}\tilde\theta_1,\qquad \theta_2^m = \min_{\tilde\theta\in FPS}\tilde\theta_2,\quad \theta_2^M = \max_{\tilde\theta\in FPS}\tilde\theta_2$$
$$\big[K_t^m, K_t^M\big] = \big[1/\theta_1^M,\; 1/\theta_1^m\big]\ \text{V/m},\qquad \big[V_o^m, V_o^M\big] = \big[-\theta_2^M/\theta_1^m,\; -\theta_2^m/\theta_1^M\big]\ \text{V}$$
26. Envelope of the static characteristics of the models whose parameters $\tilde\theta$ belong to the Feasible Parameter Set $FPS$.

[Figure: experimental data $(z, V_z)$ and the characteristics $z = \tilde\theta_1 V_z + \tilde\theta_2$, $\tilde\theta \in FPS$, over the linearity interval; voltage $V_z$ in V versus position $z$ in m.]
27. Feasible Parameter Set $FPS$ (continuous line) and set of estimates given by the extremes of the Parameter Uncertainty Intervals $PUI_j$, $j = 1,2$.

[Figure: the polytopic $FPS$ in the $(\theta_1, \theta_2)$ plane, together with the least squares estimate $\varphi_{LS}(y)$.]
28. Essential references

- F. C. Schweppe, Uncertain Dynamic Systems. Englewood Cliffs, NJ: Prentice-Hall, 1973.
- M. Milanese, R. Tempo, A. Vicino (editors), Robustness in Identification and Control. New York: Plenum Press, 1989.
- M. Milanese, A. Vicino, "Optimal estimation theory for dynamic systems with set membership uncertainty: an overview," Automatica, vol. 27, no. 6, pp. 997-1009, 1991.
- Special Issue on System Identification for Robust Control Design, IEEE Transactions on Automatic Control, vol. AC-37, no. 7, 1992.
- R. S. Smith, M. Dahleh (editors), The Modeling of Uncertainty in Control Systems, vol. 192 of Lecture Notes in Control and Information Sciences. London, UK: Springer-Verlag, 1994.
- M. Milanese, J. Norton, H. Piet-Lahanier, É. Walter (editors), Bounding Approaches to System Identification. New York: Plenum Press, 1996.
- J. R. Partington, Interpolation, Identification, and Sampling, vol. 17 of London Mathematical Society Monographs New Series. New York: Clarendon Press - Oxford, 1997.
- A. Garulli, A. Tesi, A. Vicino (editors), Robustness in Identification and Control, vol. 245 of Lecture Notes in Control and Information Sciences. Godalming, UK: Springer-Verlag, 1999.
- J. Chen, G. Gu, Control-Oriented System Identification: An H-infinity Approach. New York: John Wiley & Sons, Inc., 2000.