Information Weighted Consensus


A. T. Kamal, J. A. Farrell and A. K. Roy-Chowdhury
University of California, Riverside, CA

This work was partially supported by an ONR award titled "Distributed Dynamic Scene Analysis in a Self-Configuring Multimodal Sensor Network."

Abstract: Consensus-based distributed estimation schemes are becoming increasingly popular in sensor networks due to their scalability and fault-tolerance capabilities. In a consensus-based state estimation framework, multiple neighboring nodes iteratively communicate with each other, exchanging their own local estimates of a target's state with the goal of converging to a single state estimate over the entire network. However, the state estimation problem becomes challenging when a node has limited observability of the state. In addition, the consensus estimate is sub-optimal when the cross-covariances between the individual state estimates across different nodes are not incorporated in the distributed estimation framework. The cross-covariance is usually neglected because the computational and bandwidth requirements for its computation grow exponentially with the number of nodes. These limitations can be overcome by noting that, as the state estimates at different nodes converge, the information at each node becomes redundant. This fact can be utilized to compute the optimal estimate by proper weighting of the prior state and measurement information. Motivated by this idea, we propose information-weighted consensus algorithms for distributed maximum a posteriori parameter estimation, and their extension to the information-weighted consensus filter (ICF) for state estimation. We show both theoretically and experimentally that the proposed methods asymptotically approach the optimal centralized performance. Simulation results show that ICF is robust even when the optimality conditions are not met and has low communication requirements.

I. INTRODUCTION

Distributed estimation schemes are becoming increasingly popular in the sensor networks community due to their scalability for large networks and high fault tolerance. Unlike centralized schemes, distributed schemes usually rely on peer-to-peer communication between sensor nodes, and the task of information fusion is distributed across multiple nodes. In a sensor network, each sensor may get multiple measurements of a target's state. The objective of a distributed estimation framework is to maintain an accurate estimate of the target's state using all the measurements in the network without requiring a centralized node for information fusion.

Among many types of distributed estimation schemes, consensus algorithms [1] are schemes where each node, by iteratively communicating with its network neighbors, can compute a function of the measurements at each node (e.g., the average). The consensus estimates asymptotically converge to the global result. In practice, only a limited number of iterations can be performed due to limited bandwidth and target dynamics. Thus, true convergence may not always be reached. In the presence of state dynamics, usually a predictor-corrector model is used for state estimation, where a state prediction is made from the prior information and corrected using new measurements. The Kalman Consensus Filter (KCF) [2] is a popular distributed state estimation framework based on the average consensus algorithm. KCF works well in situations where each node gets a measurement of the target. In a sensor network, a node might have limited observability when it does not have any measurement of a target available in its local neighborhood (consisting of the node and its immediate network neighbors). Due to limited observability and the limited number of iterations, the node becomes naive about the target's state.
A naive node contains less information about the state. If a naive node's estimate is given an equal weight in the information fusion scheme, as in KCF, the performance of the overall state estimation framework may decrease. The effect of naivety is severe in sparse networks, where the total number of edges is much smaller than the maximum possible number of edges. The Generalized Kalman Consensus Filter (GKCF) [3] was proposed to overcome this issue by utilizing a weighted-averaging consensus scheme where the priors of each node were weighted by their covariance matrices.

The reason that these distributed schemes are usually sub-optimal is that the cross-covariances between the priors across different nodes are not incorporated in the estimation framework. As the consensus progresses, the errors in the information at each node become highly correlated with each other. Thus, to compute the optimal state estimate, the error cross-covariances cannot be neglected. However, it is difficult to compute the cross-covariance in a distributed framework. We note that in a consensus-based framework, the state estimates at different nodes achieve reasonable convergence over multiple iterations. At this point, each node contains almost identical (redundant) information. This fact can be utilized to compute the optimal estimate in a distributed framework without explicitly computing the cross-covariances. Motivated by this idea, we propose information-weighted consensus algorithms for distributed state and parameter estimation which are guaranteed to converge to the optimal centralized estimates as the prior state estimates become equal at different nodes, i.e., as the total number of consensus iterations at the previous time step approaches infinity. We also show experimentally that even with a limited number of iterations, the proposed algorithms achieve near-optimal performance.

The issue of naivety and optimality is handled by proper information weighting of the prior and measurement information. The communication bandwidth requirement is also low for the proposed methods.

Related works: Consensus algorithms [1] are one of the many types of distributed information fusion schemes. Consensus algorithms are protocols that are run individually by each agent, where each agent communicates with just its network neighbors and corrects its own information iteratively using the information sent by its neighbors. The protocol, over multiple iterations, ensures the convergence of all the agents in the network to a single consensus. Consensus algorithms have been extended to perform various tasks in a network of agents, such as linear algebraic operations like SVD, least squares, PCA and GPCA [4], and distributed state and parameter estimation frameworks such as the KCF [2], the GKCF [3] and the distributed maximum likelihood estimator (DMLE) [5]. A detailed review of distributed state estimation methods and comparisons with centralized and decentralized approaches can be found in [6]. These distributed state and parameter estimation frameworks have been applied in various fields, including camera networks, for distributed implementations of 3-D point triangulation, pose estimation [4], tracking [7], action recognition [7], [8], and collaborative tracking and camera control [9]. The issue of limited observability of the individual nodes has been considered previously in distributed estimation frameworks. In [10], the authors proposed a hybrid peer-to-peer/hierarchical framework for state estimation requiring fusion centers. Thus, the solution was not fully distributed. In this paper, we propose a fully distributed framework without the requirement of fusion centers.

Average consensus: Review. Average consensus [1] is a popular distributed algorithm for computing the arithmetic mean of some values. Suppose there are N nodes and each node C_i has the state a_i. Using average consensus, the average value of these states, i.e., (1/N) Σ_{i=1}^{N} a_i, can be computed in a distributed manner. Here, a_i can be a scalar, a vector or a matrix. In the average consensus algorithm, each node initializes its consensus state as a_i(0) = a_i. At the beginning of iteration k, a node C_i sends its previous state a_i(k-1) to its immediate network neighbors C_j ∈ N_i and similarly receives the neighbors' previous states a_j(k-1). Then it updates its own state as

    a_i(k) = a_i(k-1) + ε Σ_{j ∈ N_i} ( a_j(k-1) - a_i(k-1) ).    (1)

By iteratively doing so, the values of the states at all the nodes converge to the average of the initial values. Here ε is the rate parameter, which should be chosen between 0 and 1/Δmax, where Δmax is the maximum degree of the network graph G. Using a higher value of ε gives a higher rate of convergence; however, choosing a value greater than or equal to 1/Δmax would render the system unstable.
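The update in (1) is simple to simulate. Below is a minimal Python/NumPy sketch of average consensus on a fixed undirected graph; the adjacency matrix, rate parameter and initial values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def average_consensus(a0, A, eps, K):
    """Run K iterations of average consensus, eq. (1).

    a0  : (N, d) array, initial state a_i(0) at each of the N nodes
    A   : (N, N) symmetric 0/1 adjacency matrix of the network graph
    eps : rate parameter, 0 < eps < 1/Delta_max
    K   : number of consensus iterations
    """
    a = a0.astype(float).copy()
    for _ in range(K):
        # each node adds eps * sum over its neighbors of (a_j - a_i)
        a = a + eps * (A @ a - A.sum(axis=1, keepdims=True) * a)
    return a

if __name__ == "__main__":
    # illustrative 5-node line graph with scalar states
    A = np.array([[0, 1, 0, 0, 0],
                  [1, 0, 1, 0, 0],
                  [0, 1, 0, 1, 0],
                  [0, 0, 1, 0, 1],
                  [0, 0, 0, 1, 0]])
    a0 = np.array([[1.0], [4.0], [2.0], [0.0], [3.0]])
    eps = 0.65 / 2                       # Delta_max = 2 for this graph
    print(average_consensus(a0, A, eps, K=100))  # every row approaches 2.0
```

After enough iterations each node holds (approximately) the network-wide average, which is the property the rest of the paper builds on.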
II. PROBLEM FORMULATION

Consider a sensor network with N sensors. The communication in the network can be represented using an undirected connected graph G = (C, E). The set C = {C_1, ..., C_N} contains the vertices of the graph and represents the sensor nodes. The set E contains the edges of the graph, which represent the available communication channels between different nodes. The set of nodes having a direct communication channel with node C_i (i.e., sharing an edge with C_i) is represented by N_i. The true state of the targets is represented by x(t) ∈ R^p. For multiple targets, x(t) is the concatenation of the individual state vectors. A data association scheme might be necessary for multiple targets; as our focus is on the distributed state estimation problem, we assume that the data association is given.

For simplicity of notation, the time index t will be dropped where the issue under consideration can be understood without it. Each node has a prior estimate of x, denoted x_i^- ∈ R^p. The error in the prior estimate at C_i is η_i = x - x_i^- ∈ R^p, with covariance P_i^- ∈ R^{p×p}. The information form of the estimators will be used throughout this paper; thus, quantities will appear in inverse-covariance form, also known as the information (precision) matrix. We denote the prior information matrix of node C_i as W_i^- ∈ R^{p×p}, where

    W_i^- = (P_i^-)^{-1}.    (2)

The observation of node C_i is denoted by z_i ∈ R^{m_i} with noise covariance R_i ∈ R^{m_i×m_i}, where m_i is the length of the measurement vector at node C_i. The observations from all the nodes are modeled as

    Z = Hx + ν.    (3)

Here, Z = [z_1^T, z_2^T, ..., z_N^T]^T ∈ R^m and the observation matrix is H = [H_1^T, H_2^T, ..., H_N^T]^T ∈ R^{m×p}, where H_i ∈ R^{m_i×p} and m = Σ_i m_i. The observation noise ν ∈ R^m is assumed to be Gaussian, ν ~ N(0, R), with R ∈ R^{m×m}. The inverse of R is denoted by B ∈ R^{m×m}. The measurements are assumed to be uncorrelated across nodes. Thus, the measurement information matrix is block diagonal and can be expressed as

    B = blkdiag(B_1, B_2, ..., B_N).    (4)

Here, B_i = R_i^{-1} ∈ R^{m_i×m_i}.
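As a concrete illustration of the measurement model (3)-(4), the sketch below stacks per-node observation matrices and builds the block-diagonal information matrix B. The state dimension, observation matrices and noise levels here are illustrative assumptions, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(0)

p = 4                                      # state dimension (2-D position + 2-D velocity), illustrative
x = rng.normal(size=p)                     # true state (illustrative)

# per-node observation matrices H_i and noise covariances R_i (illustrative values)
H_list = [np.hstack([np.eye(2), np.zeros((2, 2))]) for _ in range(3)]   # each node observes position only
R_list = [100.0 * np.eye(2) for _ in range(3)]

H = np.vstack(H_list)                      # stacked observation matrix of eq. (3)
m = H.shape[0]

# block-diagonal measurement information matrix B of eq. (4): B_i = inv(R_i) on the diagonal
B = np.zeros((m, m))
row = 0
for R_i in R_list:
    k = R_i.shape[0]
    B[row:row + k, row:row + k] = np.linalg.inv(R_i)
    row += k

# noisy stacked measurement Z = Hx + nu, with per-node noise drawn from N(0, R_i)
Z = H @ x + np.concatenate([rng.multivariate_normal(np.zeros(R_i.shape[0]), R_i) for R_i in R_list])

# because B is block diagonal, H^T B H equals the sum of the per-node terms H_i^T B_i H_i,
# a fact exploited in the next section
assert np.allclose(H.T @ B @ H, sum(Hi.T @ np.linalg.inv(Ri) @ Hi for Hi, Ri in zip(H_list, R_list)))
```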

III. DISTRIBUTED MAP ESTIMATION (DMAP)

In this section, we first present the centralized solution for our problem and then derive its distributed implementation.

A. Centralized case

The task in a centralized a posteriori estimation process is to estimate the state x from the measurements Z and the prior state x_c^- with information matrix W_c^-. The centralized maximum a posteriori (MAP) [11] estimate x_c^+ and its information matrix W_c^+ can be expressed in information form as

    x_c^+ = ( W_c^- + H^T B H )^{-1} ( W_c^- x_c^- + H^T B Z ),    (5)
    W_c^+ = W_c^- + H^T B H.    (6)

Let us define U_i = H_i^T B_i H_i and u_i = H_i^T B_i z_i. Due to the block-diagonal structure of B, we have

    H^T B H = Σ_{i=1}^{N} H_i^T B_i H_i = Σ_{i=1}^{N} U_i,    (7)
    H^T B Z = Σ_{i=1}^{N} H_i^T B_i z_i = Σ_{i=1}^{N} u_i.    (8)

Thus, we have the centralized MAP estimate as

    x_c^+ = ( W_c^- + Σ_i U_i )^{-1} ( W_c^- x_c^- + Σ_i u_i )
          = ( Σ_{i=1}^{N} ( W_c^-/N + U_i ) )^{-1} ( Σ_{i=1}^{N} ( (W_c^-/N) x_c^- + u_i ) ),    (9)
    W_c^+ = Σ_{i=1}^{N} ( W_c^-/N + U_i ).    (10)

B. Derivation of Distributed MAP

Now we derive the distributed implementation of the centralized MAP estimates of (9)-(10). In the centralized case, after the estimation at time t-1, we have the state estimate x_c^+(t-1) that is used as the prior x_c^-(t) for the estimation at time t. In the distributed case, each node has its own prior x_i^-(t). Ideally, for all i, x_i^-(t) should be equal to x_c^-(t). However, in practice, due to the limited number of consensus iterations at previous time steps, there may be some discrepancies among the priors in the distributed case. Here, we derive the distributed MAP estimation framework for the case where the priors in the distributed framework have converged to the prior in the centralized framework. Later, in Sec. IV, we discuss the importance of this case for consensus-based estimation frameworks. Under this condition, for all i, we have x_i^- = x_c^- and W_i^- = W_c^-. Thus, from (9) and (10) we have

    x_c^+ = ( Σ_{i=1}^{N} ( W_i^-/N + U_i ) )^{-1} ( Σ_{i=1}^{N} ( (W_i^-/N) x_i^- + u_i ) ),    (11)
    W_c^+ = Σ_{i=1}^{N} ( W_i^-/N + U_i ).    (12)

Intuitively, this division by the number of nodes N is very important because, when all the nodes have the same prior state, from the centralized perspective the state information matrix should only be used once in the calculation of the MAP estimate. However, in the distributed case, if this division is not performed, the prior information gets N times more weight than it should. As a result, the estimator becomes more biased towards the prior states and gives less weight to the new measurement information. Let

    V_i(0) = W_i^-/N + U_i,    (13)
    v_i(0) = (W_i^-/N) x_i^- + u_i.    (14)

Each node can compute V_i(0) and v_i(0) from the information available to it, i.e., x_i^-, W_i^-, u_i, U_i and N. Then, each node communicates with its neighbors, exchanging its information matrix V_i(k) ∈ R^{p×p} and information vector v_i(k) ∈ R^p, using the average consensus algorithm described in Sec. I to asymptotically compute the global averages of these two quantities:

    lim_{k→∞} V_i(k) = (1/N) Σ_{i=1}^{N} V_i(0),    (15)
    lim_{k→∞} v_i(k) = (1/N) Σ_{i=1}^{N} v_i(0).    (16)

Therefore, from (11)-(16) we have

    x_c^+ = lim_{k→∞} ( N V_i(k) )^{-1} ( N v_i(k) ) = lim_{k→∞} V_i(k)^{-1} v_i(k),    (17)
    W_c^+ = lim_{k→∞} N V_i(k).    (18)

From (17) and (18) we can see that, as k → ∞, the state estimate and information matrix at each node converge to the optimal centralized MAP estimate. The DMAP framework is summarized in Algorithm 1.

Algorithm 1: Distributed Maximum A Posteriori (DMAP) at node C_i
Input: prior state estimate x_i^-, information matrix W_i^-, observation matrix H_i, measurement z_i, measurement information matrix B_i, consensus rate parameter ε and total consensus iterations K.
1) Compute the initial information matrix and vector:
       V_i(0) ← (1/N) W_i^- + H_i^T B_i H_i    (19)
       v_i(0) ← (1/N) W_i^- x_i^- + H_i^T B_i z_i    (20)
2) Perform average consensus on V_i(0) and v_i(0) independently:
   for k = 1 to K do
       a) Send V_i(k-1) and v_i(k-1) to all neighbors j ∈ N_i
       b) Receive V_j(k-1) and v_j(k-1) from all neighbors j ∈ N_i
       c) Update:
              V_i(k) ← V_i(k-1) + ε Σ_{j ∈ N_i} ( V_j(k-1) - V_i(k-1) )    (21)
              v_i(k) ← v_i(k-1) + ε Σ_{j ∈ N_i} ( v_j(k-1) - v_i(k-1) )    (22)
   end for
3) Compute the MAP estimate and information matrix:
       x_i^+ ← V_i(K)^{-1} v_i(K)    (23)
       W_i^+ ← N V_i(K)    (24)
Output: MAP estimate x_i^+ and information matrix W_i^+.
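To make Algorithm 1 concrete, here is a minimal Python/NumPy sketch of one DMAP round across all nodes, simulated centrally for clarity. The network, measurement model and parameter values are illustrative assumptions.

```python
import numpy as np

def dmap_round(x_prior, W_prior, H, B, z, A, eps, K):
    """One DMAP round (Algorithm 1) simulated over all N nodes at once.

    x_prior : (N, p)      prior state estimate x_i^- at each node
    W_prior : (N, p, p)   prior information matrix W_i^- at each node
    H, B, z : length-N lists of per-node observation matrices, measurement
              information matrices (None if node i has no measurement) and measurements
    A       : (N, N) adjacency matrix; eps: consensus rate; K: iterations
    """
    N, p = x_prior.shape
    V = np.zeros((N, p, p))
    v = np.zeros((N, p))
    for i in range(N):
        U_i = H[i].T @ B[i] @ H[i] if B[i] is not None else np.zeros((p, p))
        u_i = H[i].T @ B[i] @ z[i] if B[i] is not None else np.zeros(p)
        V[i] = W_prior[i] / N + U_i                      # eq. (19)
        v[i] = W_prior[i] @ x_prior[i] / N + u_i         # eq. (20)
    for _ in range(K):                                   # average consensus, eqs. (21)-(22)
        dV = np.einsum('ij,jkl->ikl', A, V) - A.sum(1)[:, None, None] * V
        dv = A @ v - A.sum(1)[:, None] * v
        V, v = V + eps * dV, v + eps * dv
    x_post = np.stack([np.linalg.solve(V[i], v[i]) for i in range(N)])   # eq. (23)
    W_post = N * V                                                       # eq. (24)
    return x_post, W_post
```

With K large enough, every row of x_post approaches the centralized MAP estimate of (9), which is the convergence statement of (17)-(18).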

IV. INFORMATION-WEIGHTED CONSENSUS FILTER (ICF)

In the previous section, we derived a distributed MAP estimator for the case where each node has prior information that equals the prior in the centralized framework. In this section, we extend the DMAP algorithm to account for state dynamics. The state evolution is modeled using the following linear dynamical model:

    x(t+1) = Φ x(t) + γ(t).    (25)

Here Φ is the state propagation matrix and the process noise is γ(t) ~ N(0, Q). For the centralized case, where the state estimate at time t is x_c^+(t) with information matrix W_c^+(t), the Kalman filter [12] state propagation equations are

    W_c^-(t+1) = ( Φ (W_c^+(t))^{-1} Φ^T + Q )^{-1},    (26)
    x_c^-(t+1) = Φ x_c^+(t).    (27)

Combining these with the distributed MAP estimator of Sec. III-B, we get the information-weighted consensus filter (ICF). The approach is summarized in Algorithm 2.

Algorithm 2: ICF at node C_i at time step t
Input: prior state estimate x_i^-(t), prior information matrix W_i^-(t), observation matrix H_i, consensus rate parameter ε, total consensus iterations K, state transition matrix Φ and process covariance Q.
1) Get measurement z_i and measurement information matrix B_i
2) Compute the initial information matrix and vector:
       V_i(0) ← (1/N) W_i^-(t) + H_i^T B_i H_i    (28)
       v_i(0) ← (1/N) W_i^-(t) x_i^-(t) + H_i^T B_i z_i    (29)
3) Perform average consensus on V_i(0) and v_i(0) independently:
   for k = 1 to K do
       a) Send V_i(k-1) and v_i(k-1) to all neighbors j ∈ N_i
       b) Receive V_j(k-1) and v_j(k-1) from all neighbors j ∈ N_i
       c) Update:
              V_i(k) ← V_i(k-1) + ε Σ_{j ∈ N_i} ( V_j(k-1) - V_i(k-1) )    (30)
              v_i(k) ← v_i(k-1) + ε Σ_{j ∈ N_i} ( v_j(k-1) - v_i(k-1) )    (31)
   end for
4) Compute the a posteriori state estimate and information matrix for time t:
       x_i^+(t) ← V_i(K)^{-1} v_i(K)    (32)
       W_i^+(t) ← N V_i(K)    (33)
5) Predict for the next time step t+1:
       W_i^-(t+1) ← ( Φ (W_i^+(t))^{-1} Φ^T + Q )^{-1}    (34)
       x_i^-(t+1) ← Φ x_i^+(t)    (35)
Output: State estimate x_i^+(t) and information matrix W_i^+(t).

At each time step, with k → ∞, the DMAP estimator in Algorithm 1 guarantees that the priors for the next time step at each node will be equal to the optimal centralized one. This in turn establishes the optimality condition for the next time step, which guarantees the optimality of Algorithm 2 with k → ∞ at each time step. In reality, reaching true convergence may not be possible due to the limited number of consensus iterations. The number of iterations needed to reach reasonable convergence depends on the network size and on the number and position of naive nodes in the network graph. In Sec. V, we show experimentally that, even with only one or a few iterations, ICF is robust to small discrepancies between the state estimates across the nodes and achieves near-optimal performance.

In a practical implementation scenario, at system startup or for the first few iterations at a naive node, V_i(K) in (32) can be 0 (and thus not invertible) if there is no prior or measurement information available in the local neighborhood. In that situation, a node does not perform steps 4 and 5 of Algorithm 2 until it receives non-zero information from its neighbors through step 3, or gets a measurement itself through step 1, yielding a non-zero V_i(K).
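Building on the dmap_round sketch above, the fragment below shows the ICF prediction step (34)-(35) that each node runs after the consensus update. It is a minimal sketch under the same illustrative assumptions; Φ and Q are supplied by the caller.

```python
import numpy as np

def icf_predict(x_post, W_post, Phi, Q):
    """ICF prediction step, eqs. (34)-(35), applied at every node.

    x_post : (N, p)      a posteriori state estimates x_i^+(t)
    W_post : (N, p, p)   a posteriori information matrices W_i^+(t)
    Phi    : (p, p)      state transition matrix
    Q      : (p, p)      process noise covariance
    """
    N, p = x_post.shape
    x_prior = x_post @ Phi.T                                   # x_i^-(t+1) = Phi x_i^+(t)
    W_prior = np.zeros_like(W_post)
    for i in range(N):
        P_post = np.linalg.inv(W_post[i])                      # back to covariance form
        W_prior[i] = np.linalg.inv(Phi @ P_post @ Phi.T + Q)   # eq. (34)
    return x_prior, W_prior
```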
V. EXPERIMENTAL EVALUATION

In this section, we evaluate the performance of the proposed ICF algorithm in a simulated environment and compare it with other methods: the Centralized Kalman Filter (CKF) [12], the Kalman Consensus Filter (KCF) [2] and the Generalized Kalman Consensus Filter (GKCF) [3]. We simulate a camera network in this experiment, where the state estimation algorithms are used for tracking a target roaming within a space. The target's initial state vector is random. The target's state vector is a 4-D vector, with 2-D position and 2-D velocity components. The initial speed is uniformly picked from 10 to 20 units per time step, with a random direction uniformly chosen from 0 to 2π. The targets evolve for 40 time steps using the target dynamical model of (25). The state transition matrix Φ and the process covariance Q are set to fixed values. The target randomly changes its direction and is reflected back when it reaches the grid boundary. A set of N = 5 camera sensors monitors the area. The observations are generated using (3). The observation matrix H_i and the communication adjacency matrix A are also set to fixed values. If a camera has a measurement, the measurement information matrix B_i = 0.01 I_2 is used; otherwise, B_i is set to 0 I_2. The consensus rate parameter ε is set to 0.65/Δmax, where Δmax = 2. For the first experiment, the initial prior state x_i^-(1) and prior covariance P_i^-(1) are set equal at each node. A diagonal matrix is used for P_i^-(1), with main diagonal elements {100, 100, 10, 10}. The initial prior state x_i^-(1) is generated by adding zero-mean Gaussian noise of covariance P_i^-(1) to the ground-truth state.

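The experimental setup is straightforward to reproduce. The sketch below simulates the target dynamics (25) and one camera's observations (3); since the numeric Φ, Q and H_i matrices did not survive this transcription, the matrices used here are conventional constant-velocity and position-only choices and should be treated as illustrative assumptions rather than the paper's exact values.

```python
import numpy as np

rng = np.random.default_rng(1)

# assumed constant-velocity model: state = [x, y, vx, vy]
Phi = np.block([[np.eye(2), np.eye(2)],
                [np.zeros((2, 2)), np.eye(2)]])      # assumed state transition matrix
Q = np.diag([10.0, 10.0, 1.0, 1.0])                  # assumed process covariance
H_i = np.hstack([np.eye(2), np.zeros((2, 2))])       # assumed: a camera observes 2-D position
B_i = 0.01 * np.eye(2)                               # measurement information matrix (from the paper)
R_i = np.linalg.inv(B_i)

# simulate 40 time steps of the target and one camera's measurements
T, p = 40, 4
x = np.zeros((T, p))
speed, theta = rng.uniform(10, 20), rng.uniform(0, 2 * np.pi)
x[0] = [0.0, 0.0, speed * np.cos(theta), speed * np.sin(theta)]
z = np.zeros((T, 2))
for t in range(T - 1):
    x[t + 1] = Phi @ x[t] + rng.multivariate_normal(np.zeros(p), Q)        # eq. (25)
    z[t + 1] = H_i @ x[t + 1] + rng.multivariate_normal(np.zeros(2), R_i)  # eq. (3), one node
```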
The tracking results for KCF and ICF are shown in Fig. 1. In this experiment, the total number of consensus iterations K is set to 2.

Fig. 1: The simulation framework is shown along with the tracking results for KCF and ICF for K = 2, with one panel per camera (labeled "KCF at C_i" and "ICF at C_i" for i = 1, ..., 5). The rectangular boxes represent the same simulation area from each camera's perspective. Within each rectangle, the blue triangles represent a camera's field of view (FOV). The green line represents the ground-truth track and the blue dots represent the observations at the individual cameras. The state estimates of KCF and ICF are shown as black and red lines, respectively. The gray ellipses depict the covariances of the estimates. Even for K = 2 and in the presence of naivety, ICF performs significantly better than KCF.

The cameras C_3, C_4 and C_5 are naive about the target's state for most of the time steps. Compared to the state estimates of KCF, in all the cameras, and especially in the naive nodes, the state estimates of ICF are much closer to the ground truth. As a measure of performance, we compute the estimation error e, defined as the Euclidean distance between the ground-truth position and the estimated posterior position. The mean error ē is computed by averaging the errors over all cameras and time steps. In this experiment, the mean error ē of ICF is substantially lower than that of KCF.

Next, we compare the performance of KCF, GKCF and ICF with CKF after convergence. The simulation is run 15 times with different randomly generated tracks. Convergence was assumed after 100 consensus iterations. The results of this experiment are shown in Fig. 2. It is apparent from this figure that ICF performs better than KCF and GKCF and achieves the optimal centralized performance with a high number of consensus iterations. In Table I, the mean and standard deviation of the errors for each method in this experiment are shown.

Fig. 2: Mean error of the converged estimates of the different algorithms (KCF, GKCF, ICF, CKF) over multiple independent simulation runs. It supports the theory that, with a high number of consensus iterations (e.g., K = 100), ICF approaches the optimal centralized performance.

TABLE I: Mean and standard deviation of the errors of the different methods (KCF, GKCF, ICF for several values of K, and CKF) for different total numbers of consensus iterations K.
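The error metric used above is simple to compute; below is a minimal sketch of it, where the array shapes and variable names are assumptions for illustration.

```python
import numpy as np

def mean_error(x_est, x_true):
    """Mean tracking error: Euclidean distance between the estimated and
    ground-truth positions, averaged over all cameras and time steps.

    x_est  : (N_cameras, T, p) posterior state estimates (first two components are position)
    x_true : (T, p)            ground-truth states
    """
    pos_err = np.linalg.norm(x_est[:, :, :2] - x_true[None, :, :2], axis=-1)  # (N_cameras, T)
    return pos_err.mean()
```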

Finally, to show the robustness of ICF, we conduct an experiment that relaxes the optimality condition: the initial prior states and covariances are different at different nodes. The prior states are initialized by adding Gaussian noise, generated using the corresponding prior covariance matrices, to the initial ground-truth states. The initial prior errors across different cameras are correlated with correlation coefficient ρ = 0.5. The total number of consensus iterations K is varied from 1 to 20 in increments of 1. A total of 15 cameras are used, and the camera locations, orientations, network topologies and ground-truth tracks are generated randomly. The results of this experiment are shown in Fig. 3. The simulation results are averaged over 400 independent simulation runs. The mean error (solid lines) and the standard deviation ±0.2σ (dotted lines) for the different methods are shown using different colors.

Fig. 3: Performance comparison of the different approaches (KCF, GKCF, ICF, CKF) as the total number of consensus iterations K is varied. Each solid line represents the mean error ē for a method; the dotted lines represent the standard deviation ±0.2σ. The priors at t = 1 were set to be different and correlated with ρ = 0.5. The figure shows that ICF is robust and achieves near-optimal performance even when the optimality conditions are not met.

The results show that ICF achieves near-optimal performance even when the optimality conditions are not met. This is because ICF is a consensus-based approach: irrespective of the initial condition, after several time steps or consensus iterations, the states reach reasonable convergence. ICF was proved to be optimal with converged prior states; thus, after a few time steps it achieves near-optimal performance as the system approaches the optimality conditions. Comparing Figs. 2 and 3, we can see that the performance of KCF deteriorated in the latter, while the performance of GKCF and ICF was not affected much. As the number of cameras was increased from 5 to 15 in Fig. 3, with the same number of neighbors per node, the number of naive nodes increased. This shows that, unlike KCF, ICF handles the issue of naivety well. The issue of naivety in distributed frameworks was one of the main motivations for the derivation of the ICF approach.

ICF requires low communication bandwidth: half the required bandwidth of GKCF and comparable with that of KCF. The information sent from each node to a neighbor at each iteration for the various methods is shown in Table II.

TABLE II: Information sent at each consensus step.
  KCF:  for the 1st consensus step: u_i ∈ R^p, U_i ∈ R^{p×p}, x_i ∈ R^p; for additional consensus steps: x_i ∈ R^p
  GKCF: u_i ∈ R^p, U_i ∈ R^{p×p}, x_i ∈ R^p, W_i ∈ R^{p×p}
  ICF:  v_i ∈ R^p, V_i ∈ R^{p×p}

VI. CONCLUSION

In this paper, we proposed information-weighted consensus algorithms, i.e., a distributed maximum a posteriori (DMAP) estimation framework for parameter estimation and its extension to an information-weighted consensus filter (ICF) for state estimation. We showed both theoretically and experimentally that ICF approaches the optimal centralized performance, even in the presence of naive nodes, as the total number of consensus iterations K increases. Simulation results showed that ICF is robust even when the optimality conditions were not met and has near-optimal performance while requiring low communication resources.

REFERENCES

[1] R. Olfati-Saber, J. A. Fax, and R. M. Murray, "Consensus and cooperation in networked multi-agent systems," Proceedings of the IEEE, 2007.
[2] R. Olfati-Saber, "Kalman-consensus filter: Optimality, stability, and performance," in IEEE Conference on Decision and Control, Dec. 2009.
[3] A. T. Kamal, C. Ding, B. Song, J. A. Farrell, and A. K. Roy-Chowdhury, "A generalized Kalman consensus filter for wide-area video networks," in IEEE Conference on Decision and Control, Dec. 2011.
[4] R. Tron and R. Vidal, "Distributed computer vision algorithms," IEEE Signal Processing Magazine, vol. 28, no. 3, May 2011.
[5] A. T. Kamal, J. A. Farrell, and A. Roy-Chowdhury, "Optimal distributed maximum a posteriori estimation," in Intl. Conf. on Image Processing.
[6] M. Taj and A. Cavallaro, "Distributed and decentralized multicamera tracking," IEEE Signal Processing Magazine, vol. 28, no. 3, May 2011.
[7] B. Song, A. T. Kamal, C. Soto, C. Ding, J. A. Farrell, and A. K. Roy-Chowdhury, "Tracking and activity recognition through consensus in distributed camera networks," IEEE Trans. on Image Processing, vol. 19, no. 10, Oct. 2010.
[8] A. T. Kamal, B. Song, and A. K. Roy-Chowdhury, "Belief consensus for distributed action recognition," in Intl. Conf. on Image Processing, Sept. 2011.
[9] B. Song, C. Ding, A. T. Kamal, J. A. Farrell, and A. K. Roy-Chowdhury, "Distributed camera networks: Integrated sensing and analysis for wide area scene understanding," IEEE Signal Processing Magazine, vol. 28, no. 3, May 2011.
[10] R. Olfati-Saber and N. F. Sandell, "Distributed tracking in sensor networks with limited sensing range," in American Control Conference, June 2008.
[11] S. M. Kay, Fundamentals of Statistical Signal Processing: Estimation Theory. Upper Saddle River, NJ, USA: Prentice-Hall, Inc., 1993.
[12] R. E. Kalman, "A new approach to linear filtering and prediction problems," Transactions of the ASME, Journal of Basic Engineering, vol. 82, no. Series D, pp. 35-45, 1960.
