FastSLAM: An Efficient Solution to the Simultaneous Localization And Mapping Problem with Unknown Data Association


FastSLAM: An Efficient Solution to the Simultaneous Localization And Mapping Problem with Unknown Data Association

Sebastian Thrun 1, Michael Montemerlo 1, Daphne Koller 1, Ben Wegbreit 1, Juan Nieto 2, and Eduardo Nebot 2

1 Computer Science Department, Stanford University
2 Australian Centre for Field Robotics, The University of Sydney, Australia

Abstract

This article provides a comprehensive description of FastSLAM, a new family of algorithms for the simultaneous localization and mapping problem, which specifically address hard data association problems. The algorithm uses a particle filter for sampling robot paths, and extended Kalman filters for representing maps acquired by the vehicle. This article presents two variants of this algorithm: the original algorithm, along with a more recent variant that provides improved performance in certain operating regimes. In addition to a mathematical derivation of the new algorithm, we present a proof of convergence and experimental results on its performance on real-world data.

1 Introduction

The simultaneous localization and mapping (SLAM) problem has received tremendous attention in the robotics literature. The SLAM problem involves a moving vehicle attempting to recover a spatial map of its environment, while simultaneously estimating its own pose (location and orientation) relative to the map. SLAM problems arise in the navigation of mobile robots through unknown environments for which no accurate map is available. Since robot motion is subject to error, the mapping problem necessarily induces a robot localization problem, hence the name SLAM. Applications of SLAM include indoor [8, 68], outdoor [3], underwater [73], underground [69, 62], and planetary exploration [36, 71]. The list of robots that use maps for navigation is long [7, 10, 13, 32, 33]. Mapping problems come at varying degrees of difficulty. In the most basic case, the vehicle has access to a global positioning system (GPS) which provides it with accurate pose information. The problem of acquiring a map with known robot poses [49, 66] is significantly easier than the general SLAM problem.
When GPS is unavailable, as is the case indoors, underground, or underwater, the vehicle will inevitably accrue pose errors during mapping. Such pose errors have the displeasing side effect that they induce systematic errors in the map. SLAM addresses this very problem of acquiring a map without an external source of vehicle pose information. The dominant approach to the SLAM problem was introduced in a seminal paper by Smith, Self, and Cheeseman [64]. It was first developed into an implemented system by Moutarlier and Chatila [50, 51]. This approach uses an extended Kalman filter (EKF) for estimating the posterior distribution over the map and the robot pose. The EKF approach represents the vehicle's internal map (and the robot pose estimate) by a high-dimensional Gaussian over all features in the map and the vehicle pose. The off-diagonal elements in the covariance matrix of this multivariate Gaussian represent the correlations between errors in the vehicle pose and the features in the map. As a result, the EKF can accommodate the correlated

nature of errors in the map. The EKF approach has been the basis of many recent developments in the field [17, 34]. One limitation of the EKF approach is computational in nature. Maintaining a multivariate Gaussian requires time quadratic in the number of features in the map. This limitation has been recognized, and a number of more efficient approaches have been proposed [3, 11, 25, 35, 56, 57, 70]. The common idea underlying most of these approaches is to decompose the problem of building one large map into a collection of smaller maps, which can be updated more efficiently. Depending on the nature of the local maps and the mechanics of tracing dependencies among them, the resulting savings range from a much reduced constant factor to implementations that require constant update time [35, 56, 70]. A second and more important limitation of the EKF approach is related to the data association problem, also known as the correspondence problem [4, 14]. The data association problem arises when different features in the environment look alike. In such cases, different data association hypotheses induce multiple, distinct-looking maps. Gaussians cannot represent such multi-modal distributions. The standard approach in the SLAM literature is to restrict the inference to the most plausible of these map hypotheses, incorporating only the most likely data association given the robot's current map. The determination of the most likely data association may be performed on a per-measurement basis [17], or it may incorporate multiple measurements at a time [3, 53]. The latter approach is more robust; however, both approaches tend to fail catastrophically when the alleged data association is incorrect. Alternative approaches exist that interleave data association decisions with map building in a way that enables them to revise past data association decisions, such as the RANSAC algorithm [22], the expectation maximization approach [63, 68], or approaches based on MCMC techniques [1]. However, such techniques cannot be executed in real time and are therefore of lesser relevance to the problems studied here.
This article describes a family of algorithms called FastSLAM [27, 46]. FastSLAM is a SLAM algorithm that integrates particle filters [18, 37] and extended Kalman filters. It exploits a structural property of the SLAM problem first pointed out by Murphy [52]: feature estimates are conditionally independent given the robot path. More specifically, correlations in the uncertainty among different map features arise only through robot pose uncertainty. If the robot were told its correct path, the errors in its feature estimates would be independent of each other. This observation allows us to define a factored representation of the posterior over poses and maps. FastSLAM implements such a factored representation, using particle filters for estimating the robot path. Conditioned on these particles, the individual map errors are independent, hence the mapping problem can be factored into separate problems, one for each feature in the map. FastSLAM estimates these feature locations by EKFs. The basic algorithm can be implemented in time logarithmic in the number of landmarks, using efficient tree representations of the map [45]. Hence, FastSLAM offers computational advantages over plain EKF implementations and many of their descendants. The key advantage of FastSLAM, however, is the fact that data association decisions can be made on a per-particle basis, similar to multi-hypothesis tracking algorithms [60]. As a result, the filter maintains posteriors over multiple data associations, not just the most likely one. As shown empirically, this feature makes FastSLAM significantly more robust to data association problems than algorithms based on maximum likelihood data association. A final advantage of FastSLAM over EKF-style approaches arises from the fact that particle filters can cope with non-linear robot motion models, whereas EKF-style techniques approximate such models via linear functions. This article describes two instantiations of the FastSLAM algorithm, referred to here as FastSLAM 1.0 and FastSLAM 2.0. FastSLAM 1.0 is the original FastSLAM algorithm [45], which is conceptually simple and easy to implement.
In certain situations, however, the particle filter component of FastSLAM 1.0 generates samples inefficiently. The algorithm FastSLAM 2.0 [45] overcomes this problem through an improved proposal distribution, but at the expense of an implementation that is significantly more involved (as is the mathematical derivation). For both algorithms, we provide techniques for estimating data association in SLAM [44, 55]. The derivation of all algorithms is carried out using probabilistic notation; however,

the resulting expressions will be provided using linear algebraic equations familiar from the filtering literature. We offer a proof of convergence in expectation for FastSLAM 2.0 in linear-Gaussian SLAM. Further, we provide extensive experimental results using real-world data. We show empirically that FastSLAM 2.0 outperforms EKFs in situations plagued with hard data association problems, thanks to its ability to pursue multiple data association hypotheses simultaneously. We also provide experimental results for learning maps with as many as 10^6 features, which is orders of magnitude larger than the largest maps ever built with EKFs.

2 The SLAM Problem

The SLAM problem is defined as the problem of recovering a map and a robot pose (location and orientation) from data acquired by a mobile robot. The robot gathers information about nearby landmarks, and it also measures its own motion. Both types of measurements are subject to noise. They are compiled into a probabilistic estimate of the map along with the robot's momentary pose (location and orientation). Figure 1 illustrates the SLAM problem graphically. Panel (a) shows the uncertainty accrued along a robot's path, along with the uncertainty in the location of all features seen thus far. As this graphic illustrates, the robot's pose uncertainty increases over time, as does the uncertainty in its estimate of the absolute location of individual features. A key characteristic of the SLAM problem is highlighted in Figure 1b: here the robot senses a previously observed landmark whose position is relatively well known. This observation provides the robot with information about its momentary position. It also increases its knowledge of other feature locations in the map, which leads to a reduction of map uncertainty as indicated in Figure 1b. Notice that while in principle the robot could also improve its estimate of past poses, it is common in SLAM not to consider past poses so as to keep the amount of computation independent of the length of the robot's history. To describe SLAM more formally, let us denote the map by Θ.
The map consists of a collection of features, each of which will be denoted θ_n. The total number of stationary features will be denoted N. The robot pose is denoted s_t, where t is a discrete time index. Poses of robots operating on the plane typically comprise the robot's two-dimensional Cartesian coordinates, along with its angular orientation. The sequence s^t = s_1, s_2, ..., s_t denotes the path of the robot up to time t. Throughout this article, we will use the superscript t to denote a sequence of variables from time 1 up to time t. To acquire a map, the robot can sense. Sensor measurements convey information about the range, distance, appearance, etc. of nearby features. This is illustrated in Figure 2, in which a robot measures the range and bearing to a nearby landmark. Without loss of generality, we assume that the robot observes exactly one landmark at a time. The measurement at time t, denoted z_t, may be the range and bearing of a nearby feature. The assumption of observing a single feature at a time is adopted for convenience; multiple feature sightings are easily processed sequentially. Highly restrictive, however, is an assumption that we will initially adopt but eventually drop in later sections of this paper, namely that the robot can determine the identity of each feature. For each measurement z_t, the correspondence variable n_t specifies the identity of the observed feature. The range of n_t is the finite set {1, ..., N}. At the core of our SLAM algorithm is a generative model of sensor measurements, that is, a probabilistic law that specifies the process according to which measurements are generated. This model will be referred to as the measurement model and is of the following form:

p(z_t | s_t, θ_{n_t}, n_t) = g(θ_{n_t}, s_t) + ε_t    (1)

The measurement model is conditioned on the robot pose s_t, the landmark identity n_t, and the specific feature θ_{n_t} that is being observed. It is governed by a (deterministic) function g distorted by noise. The noise at time t is modeled by the random variable ε_t, which will be assumed to be normally distributed with mean zero and covariance R_t. The Gaussian noise assumption is usually just an approximation,
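The measurement model (1) can be sketched concretely for the range-and-bearing case discussed in the text. The sketch below assumes a planar robot pose (x, y, heading) and a 2-D landmark; the function names and the noise magnitudes are illustrative choices, not taken from the article:

```python
import math
import random

def g(theta_n, s_t):
    """Range-bearing measurement function g(theta_n, s_t) for a planar robot.

    theta_n: (x, y) landmark position; s_t: (x, y, heading) robot pose.
    Returns (range, bearing), with the bearing measured relative to the
    robot's heading, as in Figure 2.
    """
    dx = theta_n[0] - s_t[0]
    dy = theta_n[1] - s_t[1]
    r = math.hypot(dx, dy)
    phi = math.atan2(dy, dx) - s_t[2]
    # Normalize the bearing to (-pi, pi].
    phi = math.atan2(math.sin(phi), math.cos(phi))
    return (r, phi)

def measure(theta_n, s_t, sigma_r=0.1, sigma_phi=0.02):
    """Generative model (1): z_t = g(theta_n, s_t) + eps_t, eps_t ~ N(0, R_t).

    Here R_t is taken to be diagonal with illustrative standard deviations
    sigma_r (meters) and sigma_phi (radians)."""
    r, phi = g(theta_n, s_t)
    return (r + random.gauss(0.0, sigma_r), phi + random.gauss(0.0, sigma_phi))
```

Note that g is nonlinear in both the robot pose and the landmark position, which is precisely why the EKF-style linearizations used later in the article are needed.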

Figure 1: The SLAM problem: (a) A robot navigates through unknown terrain; as it progresses, its own pose uncertainty increases, as indicated by the shaded ellipses along the robot's path, and so does the uncertainty in the map (red ellipses). (b) Loop closure: revisiting a previously seen landmark leads to a reduction in the uncertainty of the momentary pose and all landmarks. In online SLAM algorithms, this reduction is usually only applied to the momentary pose.

but one that tends to work well across a range of sensors [64, 17]. The measurement function g is generally nonlinear in its arguments. A common example is that of a range and bearing measurement, as discussed above. The range (distance) and bearing (angle) to a landmark are easily calculated through simple trigonometric functions that are non-linear in the coordinate variables of the robot and the sensed feature. A second source of information for solving SLAM problems are the controls of the vehicle. Controls are denoted u_t, and refer to the collective motor commands carried out in the time interval [t-1, t). The probabilistic law governing the evolution of poses is commonly referred to as the kinematic motion model,

Figure 2: Vehicle observing the range r and bearing φ to a nearby landmark.

and will be assumed to be of the following form:

p(s_t | u_t, s_{t-1}) = h(u_t, s_{t-1}) + δ_t    (2)

As this expression suggests, the pose at time t is a function h of the robot's pose one time step earlier, distorted by Gaussian noise. The latter is captured by the random variable δ_t, whose mean is zero and whose covariance will be denoted by P_t. As was the case in the measurement model, the function h is usually nonlinear in its arguments. The goal of SLAM is the recovery of the map from sensor measurements z^t and controls u^t. Most SLAM algorithms are instances of Bayes filters [29], and as such recover at any instant in time a probability distribution over the map Θ and the momentary robot pose s_t:

p(s_t, Θ | z^t, u^t, n^t)    (3)

If this probability is calculated recursively from earlier probabilities of the same kind, the estimation algorithm is a filter. Most SLAM algorithms are instantiations of the Bayes filter, which computes this posterior from the one calculated one time step earlier (see [65] for a derivation):

p(s_t, Θ | z^t, u^t, n^t) = η p(z_t | s_t, θ_{n_t}, n_t) ∫ p(s_t | s_{t-1}, u_t) p(s_{t-1}, Θ | z^{t-1}, u^{t-1}, n^{t-1}) ds_{t-1}    (4)

Here η is a normalization constant (which in this equation is the inverse of p(z_t | z^{t-1}, u^t, n^t)). The normalizer η does not depend on any of the variables over which the posterior is being computed. Throughout this article, we will adopt the common notation of using the letter η for generic normalization constants, regardless of their actual values. The Bayes filter (4) is at the core of many contemporary SLAM algorithms. In cases where both g and h are linear, (4) is equivalent to the well-known Kalman filter [31, 40]. Extended Kalman filters (EKFs) allow for nonlinear functions g and h, but approximate those using a linear function obtained through a first-degree Taylor expansion. Taylor expansions are used by the seminal EKF algorithm for SLAM [64]. Other, less explored options for linearization include the unscented filter [30, 72] and moments matching [42].
At first glance, one might consider that the posterior (3) captures all relevant information, hence it should be the gold standard for robotic SLAM. However, there are other, more elaborate distributions that can be estimated in SLAM. The algorithm FastSLAM, in particular, estimates a posterior over robot paths, not just momentary poses, along with the map:

p(s^t, Θ | z^t, u^t, n^t)    (5)

At first glance, estimating the entire path posterior might appear to be a questionable choice. As the path length increases, so does the space over which the posterior (5) is defined. Such a property seems to be at odds with the real-time execution of a filter. However, as we will see below, specific types of filters calculate posteriors over paths just as efficiently as over momentary poses. This alone, however, would barely serve as a motivation to prefer path posteriors over pose posteriors. The true motivation behind (5) arises from the fact that it can be decomposed into a product of much smaller terms, a topic that will be discussed in a separate section below. The filter for calculating the posterior (5) is as follows:

p(s^t, Θ | n^t, z^t, u^t) = η p(z_t | s_t, θ_{n_t}, n_t) p(s_t | s_{t-1}, u_t) p(s^{t-1}, Θ | n^{t-1}, z^{t-1}, u^{t-1})    (6)

This update equation differs from the standard Bayes filter (4) in the absence of an integral sign: in particular, the pose at time t-1, s_{t-1}, is not integrated out. Its derivation is mostly analogous to that of the regular Bayes filter (4), as provided in [65]. Bayes rule enables us to transform the left-hand side of (6) into the following product:

p(s^t, Θ | n^t, z^t, u^t) = η p(z_t | s^t, Θ, n^t, z^{t-1}, u^t) p(s^t, Θ | n^t, z^{t-1}, u^t)    (7)

We now exploit the fact that the measurement z_t depends only on three variables: the robot pose s_t at the time the measurement was taken, and the identity n_t and location θ_{n_t} of the observed feature. Put into equations, we have

p(z_t | s^t, Θ, n^t, z^{t-1}, u^t) = p(z_t | s_t, θ_{n_t}, n_t)    (8)

Furthermore, the probability p(s^t, Θ | n^t, z^{t-1}, u^t) in (7) can be factored as follows:

p(s^t, Θ | n^t, z^{t-1}, u^t) = p(s_t | s^{t-1}, Θ, n^t, z^{t-1}, u^t) p(s^{t-1}, Θ | n^t, z^{t-1}, u^t)    (9)

Both terms can be greatly simplified by dropping variables that convey no information for the specific probability. In particular, knowledge of s_{t-1} and u_t is sufficient to predict s_t; all other variables in the first term on the right-hand side of (9) carry no additional information and can therefore be omitted. Similarly, n_t and u_t carry no information about the posterior over s^{t-1} and Θ.
Hence, we can rewrite (9) as follows:

p(s^t, Θ | n^t, z^{t-1}, u^t) = p(s_t | s_{t-1}, u_t) p(s^{t-1}, Θ | n^{t-1}, z^{t-1}, u^{t-1})    (10)

As the reader may easily verify, substituting this equation and (8) back into (7) yields the desired filter (6). This filter, and the posterior it represents, form the core of all FastSLAM algorithms.

3 Factoring the SLAM Posterior

A key mathematical insight pertains to the fact that the posterior (5) possesses an important characteristic. This characteristic was first reported in [52] and later exploited in [2, 47] and various FastSLAM 2.0 papers [45, 44, 55]. It was also used in an earlier mapping algorithm [67], but was not made explicit at that time. The insight is that the SLAM posterior can be written in the factored form given by the following product:

p(s^t, Θ | n^t, z^t, u^t) = p(s^t | n^t, z^t, u^t) ∏_{n=1}^{N} p(θ_n | s^t, n^t, z^t)    (11)

This factorization states that the calculation of the posterior over paths and maps can be decomposed into N + 1 recursive estimators: an estimator over robot paths, p(s^t | n^t, z^t, u^t), and N separate estimators

Figure 3: The SLAM problem as a dynamic Bayesian network: the robot moves from pose s_1 through a sequence of controls, u_1, u_2, .... As it moves, it measures nearby landmarks. At time t = 1, it observes landmark θ_1 out of two landmarks, {θ_1, θ_2}. The measurement is denoted z_1 (range and bearing). At time t = 2, it observes the other landmark, θ_2, and at time t = 3, it observes θ_1 again. The SLAM problem is concerned with estimating the locations of the landmarks and the robot's path from the controls u^t and the measurements z^t. The gray shading illustrates a conditional independence relation.

over feature locations p(θ_n | s^t, n^t, z^t), conditioned on the path estimate, one for each n = 1, ..., N. The product of these probabilities represents the desired posterior in a factored way. This factored representation is exact, not just an approximation. It is a generic property of the SLAM problem. To illustrate the correctness of this factorization, Figure 3 depicts the data acquisition process graphically, in the form of a dynamic Bayesian network [24]. As this graph suggests, each measurement z_1, ..., z_t is a function of the position of the corresponding feature, along with the robot pose at the time the measurement was taken. Knowledge of the robot path d-separates [58] the individual feature estimation problems and renders them independent of each other. Knowledge of the exact location of one feature will therefore tell us nothing about the locations of other features. The same observation is easily derived mathematically. The stated independence is given by the following product form:

p(Θ | s^t, n^t, z^t) = ∏_{n=1}^{N} p(θ_n | s^t, n^t, z^t)    (12)

Notice that all probabilities are conditioned on the robot path s^t. Our derivation of (12) requires the distinction of two possible cases, depending on whether or not the feature θ_n was observed in the most recent measurement. In particular, if n ≠ n_t, the most recent measurement z_t has no effect on the posterior, and neither has the robot pose s_t nor the correspondence n_t.
Thus, we obtain:

p(θ_n | s^t, n^t, z^t) = p(θ_n | s^{t-1}, n^{t-1}, z^{t-1})    (13)

If n = n_t, that is, if θ_n = θ_{n_t} was observed by the most recent measurement z_t, the situation calls for applying Bayes rule, followed by some standard simplifications:

p(θ_{n_t} | s^t, n^t, z^t) = p(z_t | θ_{n_t}, s^t, n^t, z^{t-1}) p(θ_{n_t} | s^t, n^t, z^{t-1}) / p(z_t | s^t, n^t, z^{t-1})
                          = p(z_t | s_t, θ_{n_t}, n_t) p(θ_{n_t} | s^{t-1}, n^{t-1}, z^{t-1}) / p(z_t | s^t, n^t, z^{t-1})    (14)

This gives us the following expression for the probability p(θ_{n_t} | s^{t-1}, n^{t-1}, z^{t-1}):

p(θ_{n_t} | s^{t-1}, n^{t-1}, z^{t-1}) = p(θ_{n_t} | s^t, n^t, z^t) p(z_t | s^t, n^t, z^{t-1}) / p(z_t | s_t, θ_{n_t}, n_t)    (15)

The proof of the correctness of (12) is now carried out by mathematical induction. Let us assume that the posterior at time t-1 is already factored:

p(Θ | s^{t-1}, n^{t-1}, z^{t-1}) = ∏_{n=1}^{N} p(θ_n | s^{t-1}, n^{t-1}, z^{t-1})    (16)

This statement is trivially true at t = 1, since in the beginning the robot has no knowledge about any feature, and hence all estimates are independent. At time t, the posterior is of the following form:

p(Θ | s^t, n^t, z^t) = p(z_t | Θ, s^t, n^t, z^{t-1}) p(Θ | s^t, n^t, z^{t-1}) / p(z_t | s^t, n^t, z^{t-1})
                     = p(z_t | s_t, θ_{n_t}, n_t) p(Θ | s^{t-1}, n^{t-1}, z^{t-1}) / p(z_t | s^t, n^t, z^{t-1})    (17)

Plugging in our inductive hypothesis (16) gives us:

p(Θ | s^t, n^t, z^t) = [p(z_t | s_t, θ_{n_t}, n_t) / p(z_t | s^t, n^t, z^{t-1})] ∏_{n=1}^{N} p(θ_n | s^{t-1}, n^{t-1}, z^{t-1})
   = [p(z_t | s_t, θ_{n_t}, n_t) p(θ_{n_t} | s^{t-1}, n^{t-1}, z^{t-1}) / p(z_t | s^t, n^t, z^{t-1})] ∏_{n ≠ n_t} p(θ_n | s^{t-1}, n^{t-1}, z^{t-1})
   = p(θ_{n_t} | s^t, n^t, z^t) ∏_{n ≠ n_t} p(θ_n | s^t, n^t, z^t)        [by Eqs. (15) and (13)]
   = ∏_{n=1}^{N} p(θ_n | s^t, n^t, z^t)    (18)

Notice that we have substituted Equations (13) and (15) as indicated. This shows the correctness of Equation (12). The correctness of the main form (11) now follows directly from this result and the following generic transformation:

p(s^t, Θ | n^t, z^t, u^t) = p(s^t | n^t, z^t, u^t) p(Θ | s^t, n^t, z^t, u^t)
                          = p(s^t | n^t, z^t, u^t) p(Θ | s^t, n^t, z^t)
                          = p(s^t | n^t, z^t, u^t) ∏_{n=1}^{N} p(θ_n | s^t, n^t, z^t)    (19)

We note that conditioning on the entire path s^t is indeed essential for this result. The most recent pose s_t would be insufficient as a conditioning variable, as dependencies may arise through previous poses. This observation provides the motivation for our choice of the posterior over paths and maps (5), in place of the much more common form stated in Equation (3).

4 FastSLAM with Known Data Association

Historically, FastSLAM 1.0 was the earliest version of the FastSLAM family of algorithms, and it is also the easiest to implement [45]. We will therefore begin our description of FastSLAM with version 1.0, although most of the observations in this section apply equally to FastSLAM 2.0. Both FastSLAM algorithms exploit the factored posterior derived in the previous section.
The factorial nature of the posterior provides us with significant computational advantages over SLAM algorithms that estimate an unstructured posterior distribution. FastSLAM exploits the factored representation by maintaining N + 1 filters, one for each factor in (11). By doing so, all N + 1 filters are low-dimensional.

Figure 4: Particles in FastSLAM. Each particle holds a robot pose and one Gaussian (mean µ_n, covariance Σ_n) per landmark:

  Particle 1:  x y θ | µ_1, Σ_1 | µ_2, Σ_2 | ... | µ_N, Σ_N
  Particle 2:  x y θ | µ_1, Σ_1 | µ_2, Σ_2 | ... | µ_N, Σ_N
  ...
  Particle M:  x y θ | µ_1, Σ_1 | µ_2, Σ_2 | ... | µ_N, Σ_N

More specifically, both FastSLAM versions calculate the posterior over robot paths p(s^t | n^t, z^t, u^t) by a particle filter [18, 37], similar to previous work in mobile robot localization [23], mapping [65], and visual tracking [28]. The particle filter has the pleasing property that the amount of computation needed for each incremental update stays constant, regardless of the path length. Additionally, it can cope gracefully with non-linear robot motion models. The remaining N (conditional) posteriors over feature locations p(θ_n | s^t, n^t, z^t) are calculated by extended Kalman filters (EKFs). Each EKF estimates a single landmark location, hence it is low-dimensional. The individual EKFs are conditioned on robot paths. Hence, each particle possesses its own set of EKFs. In total there are NM EKFs, one for each feature in the map in each of the M particles.¹ Figure 4 illustrates the structure of the M particles in FastSLAM. Put into equations, each particle is of the form

S_t^[m] = ⟨ s^{t,[m]}, µ_{1,t}^[m], Σ_{1,t}^[m], ..., µ_{N,t}^[m], Σ_{N,t}^[m] ⟩    (20)

The bracketed superscript [m] indicates the index of the particle; s^{t,[m]} is its path estimate, and µ_{n,t}^[m] and Σ_{n,t}^[m] are the mean and covariance of the Gaussian representing the n-th feature location. Together, all these quantities form the m-th particle S_t^[m], of which there are a total of M in the FastSLAM posterior. Filtering, that is, calculating the posterior at time t from the one at time t-1, involves generating a new particle set S_t from S_{t-1}, the particle set one time step earlier. This new particle set incorporates a new control u_t and a measurement z_t (with associated correspondence n_t). This update is performed in the following steps:

1. Extending the path posterior by sampling new poses. FastSLAM 1.0 uses the control u_t to sample a new robot pose s_t for each particle in S_{t-1}. More specifically, consider the m-th particle in S_{t-1}, denoted S_{t-1}^[m].
FastSLAM 1.0 samples the pose s_t in accordance with the m-th particle, by drawing a sample according to the motion posterior

s_t^[m] ~ p(s_t | s_{t-1}^[m], u_t)    (21)

Here s_{t-1}^[m] is the posterior estimate for the robot location at time t-1, residing in the m-th particle. The resulting sample s_t^[m] is then added to a temporary set of particles, along with the path of previous poses, s^{t-1,[m]}. This operation requires constant time per particle, independent of the map size N. The sampling step is graphically depicted in Figure 5, which illustrates a set of pose particles drawn from a single initial pose.

¹ Readers familiar with the statistical literature may want to note that both FastSLAM versions are instances of so-called Rao-Blackwellized particle filters [19, 52], by virtue of the fact that they combine particle representations with closed-form representations of certain marginals.
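Step 1 can be sketched as follows. The article does not prescribe a particular motion model h, so the sketch below assumes an odometry-style control u_t = (rot1, trans, rot2) with independent Gaussian noise on each component; the function name and noise values are illustrative:

```python
import math
import random

def sample_motion(s_prev, u_t, noise=(0.05, 0.05, 0.01)):
    """Draw s_t ~ p(s_t | s_{t-1}, u_t), as in Eq. (21).

    s_prev: pose (x, y, heading). u_t: assumed odometry control
    (rot1, trans, rot2). noise: per-component standard deviations."""
    rot1, trans, rot2 = (c + random.gauss(0.0, sd) for c, sd in zip(u_t, noise))
    x, y, th = s_prev
    th1 = th + rot1
    return (x + trans * math.cos(th1), y + trans * math.sin(th1), th1 + rot2)

# Extending each particle's path (one sample per particle, constant time each):
paths = [[(0.0, 0.0, 0.0)] for _ in range(100)]   # M = 100 particles
u_t = (0.0, 1.0, 0.0)                             # drive 1 m straight ahead
for path in paths:
    path.append(sample_motion(path[-1], u_t))
```

With the noise set to zero the sample reduces to the deterministic prediction h(u_t, s_{t-1}); with nonzero noise the particles spread out as in Figure 5.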

Figure 5: Samples drawn from the probabilistic motion model.

2. Updating the observed landmark estimate. Next, FastSLAM 1.0 updates the posterior over the landmark estimates, represented by the mean µ_{n,t-1}^[m] and the covariance Σ_{n,t-1}^[m]. The updated values are then added to the temporary particle set, along with the new pose. The update depends on whether or not landmark n was observed at time t. For n ≠ n_t, we already established in Equation (13) that the posterior over the landmark remains unchanged. This implies the simple update

⟨ µ_{n,t}^[m], Σ_{n,t}^[m] ⟩ = ⟨ µ_{n,t-1}^[m], Σ_{n,t-1}^[m] ⟩    (22)

For the observed feature n = n_t, the update is specified through Equation (14), restated here with the normalizer denoted by η:

p(θ_{n_t} | s^t, n^t, z^t) = η p(z_t | s_t, θ_{n_t}, n_t) p(θ_{n_t} | s^{t-1}, n^{t-1}, z^{t-1})    (23)

The probability p(θ_{n_t} | s^{t-1}, n^{t-1}, z^{t-1}) at time t-1 is represented by a Gaussian with mean µ_{n_t,t-1}^[m] and covariance Σ_{n_t,t-1}^[m]. For the new estimate at time t to also be Gaussian, FastSLAM linearizes the perceptual model p(z_t | s_t, θ_{n_t}, n_t) in the same way as EKFs [40]. In particular, FastSLAM approximates the measurement function g by the following first-degree Taylor expansion:

g(θ_{n_t}, s_t) ≈ g(µ_{n_t,t-1}^[m], s_t^[m]) + g'(s_t^[m], µ_{n_t,t-1}^[m]) (θ_{n_t} - µ_{n_t,t-1}^[m])
              = ẑ_t + G_t^T (θ_{n_t} - µ_{n_t,t-1}^[m])    (24)

with the abbreviations ẑ_t := g(µ_{n_t,t-1}^[m], s_t^[m]) and G_t^T := g'(s_t^[m], µ_{n_t,t-1}^[m]). Here the derivative g' is taken with respect to the feature coordinates θ_{n_t}. This linear approximation is tangent to g at s_t^[m] and µ_{n_t,t-1}^[m]. Under this approximation, the posterior for the location of feature n_t is indeed Gaussian. The new mean and covariance are obtained using the standard EKF measurement update [40]:

K_t = Σ_{n_t,t-1}^[m] G_t (G_t^T Σ_{n_t,t-1}^[m] G_t + R_t)^{-1}
µ_{n_t,t}^[m] = µ_{n_t,t-1}^[m] + K_t (z_t - ẑ_t)
Σ_{n_t,t}^[m] = (I - K_t G_t^T) Σ_{n_t,t-1}^[m]    (25)
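The per-landmark EKF update (25) can be sketched for the 2-D landmark case (2-D measurement, 2x2 matrices). The helper name and the nested-list matrix representation are implementation choices, not the article's; the Jacobian G is laid out so that the innovation covariance is G^T Σ G + R, matching Eq. (32):

```python
def ekf_landmark_update(mu, Sigma, z, z_hat, G, R):
    """EKF measurement update, Eqs. (24)-(25), for one 2-D landmark.

    mu: [x, y] prior mean. Sigma, G, R: 2x2 matrices as nested lists.
    z: actual measurement; z_hat: predicted measurement g(mu, s_t)."""
    def mat_mul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
                for i in range(2)]
    def mat_T(A):
        return [[A[j][i] for j in range(2)] for i in range(2)]
    def mat_inv(A):
        d = A[0][0] * A[1][1] - A[0][1] * A[1][0]
        return [[A[1][1] / d, -A[0][1] / d], [-A[1][0] / d, A[0][0] / d]]

    # Innovation covariance Q = G^T Sigma G + R  (cf. Eq. (32)).
    GtSG = mat_mul(mat_mul(mat_T(G), Sigma), G)
    Q = [[GtSG[i][j] + R[i][j] for j in range(2)] for i in range(2)]
    # Kalman gain K = Sigma G Q^{-1}  (Eq. (25)).
    K = mat_mul(mat_mul(Sigma, G), mat_inv(Q))
    # Mean update: mu + K (z - z_hat).
    innov = [z[0] - z_hat[0], z[1] - z_hat[1]]
    mu_new = [mu[i] + K[i][0] * innov[0] + K[i][1] * innov[1] for i in range(2)]
    # Covariance update: (I - K G^T) Sigma.
    KGT = mat_mul(K, mat_T(G))
    IminusKGT = [[(1.0 if i == j else 0.0) - KGT[i][j] for j in range(2)]
                 for i in range(2)]
    return mu_new, mat_mul(IminusKGT, Sigma)
```

Because the update touches only the single observed landmark's 2x2 Gaussian, it runs in constant time per particle, as claimed in the text.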

Figure 6: Samples cannot be drawn conveniently from the target distribution (shown as a solid line). Instead, the importance sampler draws samples from the proposal distribution (dashed line), which has a simpler form. Below, samples drawn from the proposal distribution are drawn with lengths proportional to their importance weights.

Steps 1 and 2 are repeated M times, resulting in a temporary set of M particles.

3. Resampling. In a final step, FastSLAM resamples this set of particles; that is, FastSLAM draws from this temporary set M particles (with replacement), which then form the new particle set S_t. The necessity to resample arises from the fact that the particles in the temporary set are not distributed according to the desired posterior: Step 1 generates poses s_t only in accordance with the most recent control u_t, paying no attention to the measurement z_t. Resampling is a common technique in particle filtering to correct for such mismatches. This situation is illustrated for a simplified 1-D example in Figure 6. Here the dashed line symbolizes the proposal distribution, which is the distribution at which particles are generated, and the solid line is the target distribution [41]. In FastSLAM, the proposal distribution does not depend on z_t, but the target distribution does. By weighting particles as shown at the bottom of this figure, and resampling according to those weights, the resulting particle set indeed approximates the target distribution. The weight of each sample used in the resampling step is called the importance factor [61]. To determine the importance factor of each particle, it will prove useful to calculate the actual proposal distribution of the path particles in the temporary set. Under the assumption that the set of path particles in S_{t-1} is distributed according to p(s^{t-1} | z^{t-1}, u^{t-1}, n^{t-1}) (which is an asymptotically correct approximation), path particles in the temporary set are distributed according to:

p(s^t | z^{t-1}, u^t, n^{t-1}) = p(s_t | s_{t-1}, u_t) p(s^{t-1} | z^{t-1}, u^{t-1}, n^{t-1})    (26)

The factor p(s_t | s_{t-1}, u_t) is the sampling distribution used in Equation (21).
The target distribution takes into account the measurement z_t, along with the correspondence n_t:

p(s^t | z^t, u^t, n^t)    (27)

The resampling process accounts for the difference between the target and the proposal distribution. The importance factor for resampling is given by the quotient of the target and the proposal distribution [41]:

w_t^[m] = target distribution / proposal distribution
        = p(s^{t,[m]} | z^t, u^t, n^t) / p(s^{t,[m]} | z^{t-1}, u^t, n^{t-1})
        = η p(z_t | s^{t,[m]}, z^{t-1}, n^t)    (28)

The last transformation is a direct consequence of the following transformation of the numerator in (28):

p(s^{t,[m]} | z^t, u^t, n^t) = η p(z_t | s^{t,[m]}, z^{t-1}, u^t, n^t) p(s^{t,[m]} | z^{t-1}, u^t, n^t)
                             = η p(z_t | s^{t,[m]}, z^{t-1}, n^t) p(s^{t,[m]} | z^{t-1}, u^t, n^{t-1})    (29)

To calculate the probability p(z_t | s^{t,[m]}, z^{t-1}, n^t) in (28), it will be necessary to transform it further. In particular, it is equivalent to the following integration, where we once again omit variables irrelevant to the prediction of sensor measurements:

w_t^[m] = η ∫ p(z_t | θ_{n_t}, s^{t,[m]}, z^{t-1}, n^t) p(θ_{n_t} | s^{t,[m]}, z^{t-1}, n^t) dθ_{n_t}
        = η ∫ p(z_t | θ_{n_t}, n_t, s_t^[m]) p(θ_{n_t} | s^{t-1,[m]}, z^{t-1}, n^{t-1}) dθ_{n_t}    (30)

where the second factor in the integrand is the Gaussian N(θ_{n_t}; µ_{n_t,t-1}^[m], Σ_{n_t,t-1}^[m]). Here N(x; µ, Σ) denotes a Gaussian distribution over the variable x with mean µ and covariance Σ. The integration in (30) involves the estimate of the observed landmark location at time t, and the measurement model. To calculate (30) in closed form, FastSLAM employs the very same linear approximation used in the measurement update in Step 2. In particular, the importance factor is given by

w_t^[m] ≈ η |2π Q_t|^{-1/2} exp{ -(1/2) (z_t - ẑ_t)^T Q_t^{-1} (z_t - ẑ_t) }    (31)

with the covariance

Q_t = G_t^T Σ_{n_t,t-1}^[m] G_t + R_t    (32)

This expression is the probability of the actual measurement z_t under the Gaussian that results from the convolution of the distributions in (30), exploiting our linear approximation of g. The resulting importance weights are used to draw (with replacement) M new samples from the temporary sample set. Through this resampling process, particles survive in proportion to their measurement probability. Unfortunately, resampling may take time linear in the number of features N, since entire maps may have to be duplicated when a particle is drawn more than once. These three steps together constitute the update rule of the FastSLAM 1.0 algorithm for SLAM problems with known data association. We note that the execution time of the update does not depend on the total path length. In fact, only the most recent pose s_{t-1} is used in the process of generating a new particle at time t. Consequently, past poses can safely be discarded.
This has the pleasing consequence that neither the time requirements nor the memory requirements of FastSLAM depend on t.
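The importance weighting (31)-(32) and the resampling of Step 3 can be sketched together. The function names are illustrative; the weight routine evaluates the Gaussian likelihood of the actual measurement under the predicted measurement and a 2x2 innovation covariance Q_t, and the resampler draws with replacement in proportion to the weights:

```python
import math
import random

def importance_weight(z, z_hat, Q):
    """Unnormalized importance factor, Eqs. (31)-(32): the Gaussian density
    of z with mean z_hat and 2x2 covariance Q (nested lists)."""
    det = Q[0][0] * Q[1][1] - Q[0][1] * Q[1][0]
    inv = [[Q[1][1] / det, -Q[0][1] / det], [-Q[1][0] / det, Q[0][0] / det]]
    d = [z[0] - z_hat[0], z[1] - z_hat[1]]
    maha = sum(d[i] * inv[i][j] * d[j] for i in range(2) for j in range(2))
    return math.exp(-0.5 * maha) / (2.0 * math.pi * math.sqrt(det))

def resample(particles, weights):
    """Step 3: draw M particles with replacement, in proportion to their
    importance weights; particles with high measurement probability survive."""
    return random.choices(particles, weights=weights, k=len(particles))
```

Note that in an actual implementation each surviving particle's map must be copied when it is drawn more than once, which is the source of the linear-time cost mentioned in the text.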

Figure 7: Mismatch between proposal and posterior distributions: (a) illustrates the forward samples generated by FastSLAM 1.0, and the posterior induced by the measurement (ellipse). Diagram (b) shows the sample set after the resampling step.

5 FastSLAM 2.0: Improved Proposal Distribution

FastSLAM 2.0 [46] is largely equivalent to FastSLAM 1.0, with one important exception: its proposal distribution takes the measurement z_t into consideration. By doing so, it can avoid some important problems that can arise in FastSLAM 1.0. In particular, FastSLAM 1.0 samples poses based on the control u_t only, and then uses the measurement z_t to resample those poses. This is problematic when the accuracy of the control is low relative to the accuracy of the robot's sensors. Such a situation is illustrated in Figure 7: here the proposal generates a large spectrum of samples, shown in Figure 7a, but only a small subset of these samples has high likelihood, as indicated by the ellipsoid. After resampling, only particles within the ellipsoid survive with reasonably high likelihood. Clearly, it would be advantageous to take the measurement into consideration when generating particles, which FastSLAM 1.0 fails to do. FastSLAM 2.0 achieves this by sampling poses based on the measurement z_t, in addition to the control u_t. As a result, FastSLAM 2.0 is less wasteful with its particles than FastSLAM 1.0. We will quantify the effect of this change in detail in the experimental results section of this paper. Unfortunately, FastSLAM 2.0 is more difficult to implement than FastSLAM 1.0, and its mathematical derivation is more involved. In the remainder of this section we will discuss the individual update steps in FastSLAM 2.0, which parallel the corresponding steps in FastSLAM 1.0 as described in the previous section.

5.1 Extending the Path Posterior by Sampling a New Pose

In FastSLAM 2.0, the pose s_t is drawn from the posterior

s_t^[m] ~ p(s_t | s^{t-1,[m]}, u^t, z^t, n^t)    (33)

which differs from the proposal distribution provided in (21) in that (33) takes the measurement z_t into consideration, along with the correspondence n_t.
The reader may recall that s^{t-1,[m]} is the path up to time t-1 of the m-th particle. The mechanism for sampling from (33) requires further analysis. First, we rewrite (33) in terms of known distributions, such as the measurement and motion models, and the Gaussian feature estimates in the m-th particle:

 p(s_t | s^{t-1,[m]}, u^t, z^t, n^t)
  [Bayes]  = p(z_t | s_t, s^{t-1,[m]}, u^t, z^{t-1}, n^t) p(s_t | s^{t-1,[m]}, u^t, z^{t-1}, n^t) / p(z_t | s^{t-1,[m]}, u^t, z^{t-1}, n^t)
           = η^{[m]} p(z_t | s_t, s^{t-1,[m]}, u^t, z^{t-1}, n^t) p(s_t | s^{t-1,[m]}, u^t, z^{t-1}, n^t)
  [Markov] = η^{[m]} p(z_t | s_t, s^{t-1,[m]}, u^t, z^{t-1}, n^t) p(s_t | s_{t-1}^{[m]}, u_t)
           = η^{[m]} [∫ p(z_t | θ_{n_t}, s_t, n_t) p(θ_{n_t} | s_t, s^{t-1,[m]}, u^t, z^{t-1}, n^t) dθ_{n_t}] p(s_t | s_{t-1}^{[m]}, u_t)
  [Markov] = η^{[m]} [∫ p(z_t | θ_{n_t}, s_t, n_t) p(θ_{n_t} | s^{t-1,[m]}, z^{t-1}, n^{t-1}) dθ_{n_t}] p(s_t | s_{t-1}^{[m]}, u_t)   (34)

Here the three model distributions are the Gaussians N(z_t; g(θ_{n_t}, s_t), R_t), N(θ_{n_t}; μ_{n_t,t-1}^{[m]}, Σ_{n_t,t-1}^{[m]}), and N(s_t; h(s_{t-1}^{[m]}, u_t), P_t), respectively.

This expression makes apparent that our sampling distribution is truly the convolution of two Gaussians, multiplied by a third. Unfortunately, in the general case the sampling distribution possesses no closed form from which we could easily sample. The culprit is the function g: if it were linear, this probability would be Gaussian, a fact that shall become more obvious below. In the general case, not even the integral in (34) possesses a closed-form solution. For this reason, sampling from the probability (34) is difficult.

This observation motivates the replacement of g by a linear approximation. As in FastSLAM 1.0, this approximation is obtained through a first-order Taylor expansion, given by the following linear function:

 g(θ_{n_t}, s_t) ≈ ẑ_t + G_θ (θ_{n_t} − μ_{n_t,t-1}^{[m]}) + G_s (s_t − ŝ_t^{[m]})   (35)

Here we use the following abbreviations:

 ẑ_t = g(μ_{n_t,t-1}^{[m]}, ŝ_t^{[m]})   (36)
 ŝ_t^{[m]} = h(s_{t-1}^{[m]}, u_t)   (37)

The matrices G_θ and G_s are the Jacobians of g, that is, the derivatives of g with respect to θ_{n_t} and s_t, respectively, evaluated at the expected values of their arguments:

 G_θ = ∇_{θ_{n_t}} g(θ_{n_t}, s_t) |_{s_t = ŝ_t^{[m]}; θ_{n_t} = μ_{n_t,t-1}^{[m]}}   (38)
 G_s = ∇_{s_t} g(θ_{n_t}, s_t) |_{s_t = ŝ_t^{[m]}; θ_{n_t} = μ_{n_t,t-1}^{[m]}}   (39)

Under this approximation, the desired sampling distribution (34) is a Gaussian with the following parameters:

 Σ_{s_t}^{[m]} = [G_s^T Q_t^{-1} G_s + P_t^{-1}]^{-1}   (40)
 μ_{s_t}^{[m]} = Σ_{s_t}^{[m]} G_s^T Q_t^{-1} (z_t − ẑ_t) + ŝ_t^{[m]}   (41)

where the matrix Q_t is defined as follows:

 Q_t = R_t + G_θ Σ_{n_t,t-1}^{[m]} G_θ^T   (42)

To see this, we note that under our linear approximation the convolution theorem provides us with a closed form for the integral term in (34):

 N(z_t; ẑ_t + G_s s_t − G_s ŝ_t^{[m]}, Q_t)   (43)

The sampling distribution (34) is now given by the product of this normal distribution and the rightmost term in (34), the normal N(s_t; ŝ_t^{[m]}, P_t). Written in Gaussian form, we have

 p(s_t | s^{t-1,[m]}, u^t, z^t, n^t) = η exp{ −y_t^{[m]} }   (44)

with

 y_t^{[m]} = 1/2 (z_t − ẑ_t − G_s s_t + G_s ŝ_t^{[m]})^T Q_t^{-1} (z_t − ẑ_t − G_s s_t + G_s ŝ_t^{[m]}) + 1/2 (s_t − ŝ_t^{[m]})^T P_t^{-1} (s_t − ŝ_t^{[m]})   (45)

This expression is obviously quadratic in our target variable s_t; hence p(s_t | s^{t-1,[m]}, u^t, z^t, n^t) is Gaussian. The mean and covariance of this Gaussian are equivalent to the minimum of y_t^{[m]} and its curvature. These are identified by calculating the first and second derivatives of y_t^{[m]} with respect to s_t:

 ∂y_t^{[m]}/∂s_t = −G_s^T Q_t^{-1} (z_t − ẑ_t − G_s s_t + G_s ŝ_t^{[m]}) + P_t^{-1} (s_t − ŝ_t^{[m]})
                 = (G_s^T Q_t^{-1} G_s + P_t^{-1}) s_t − G_s^T Q_t^{-1} (z_t − ẑ_t + G_s ŝ_t^{[m]}) − P_t^{-1} ŝ_t^{[m]}   (46)
 ∂²y_t^{[m]}/∂s_t² = G_s^T Q_t^{-1} G_s + P_t^{-1}   (47)

The covariance Σ_{s_t}^{[m]} of the sampling distribution is now obtained as the inverse of the second derivative:

 Σ_{s_t}^{[m]} = [G_s^T Q_t^{-1} G_s + P_t^{-1}]^{-1}   (48)

The mean μ_{s_t}^{[m]} of the sampling distribution is obtained by setting the first derivative (46) to zero, which gives us:

 μ_{s_t}^{[m]} = Σ_{s_t}^{[m]} G_s^T Q_t^{-1} (z_t − ẑ_t + G_s ŝ_t^{[m]}) + Σ_{s_t}^{[m]} P_t^{-1} ŝ_t^{[m]}
              = Σ_{s_t}^{[m]} G_s^T Q_t^{-1} (z_t − ẑ_t) + Σ_{s_t}^{[m]} [G_s^T Q_t^{-1} G_s + P_t^{-1}] ŝ_t^{[m]}
              = Σ_{s_t}^{[m]} G_s^T Q_t^{-1} (z_t − ẑ_t) + ŝ_t^{[m]}   (49)

This Gaussian is the approximation of the desired sampling distribution (33) in FastSLAM 2.0. Obviously, this proposal distribution is quite a bit more involved than the much simpler one for FastSLAM 1.0 in Equation (21). Its advantage will be characterized below, when we empirically compare both FastSLAM algorithms.

5.2 Updating The Observed Landmark Estimate

Just like FastSLAM 1.0, FastSLAM 2.0 updates the posterior over the landmark estimates based on the measurement z_t and the sampled pose s_t^{[m]}. The estimates at time t−1 are once again represented by the mean μ_{n,t-1}^{[m]} and the covariance Σ_{n,t-1}^{[m]}, and the updated estimates are the mean μ_{n,t}^{[m]} and the covariance Σ_{n,t}^{[m]}. The nature of the update depends on whether or not the landmark n was observed at time t. For n ≠ n_t, we already established in Equation (13) that the posterior over the landmark remains unchanged. This implies that instead of updating the estimate, we merely have to copy it. For the observed feature n = n_t, the situation is more intricate.
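The pose-sampling computation of Section 5.1, Equations (36) through (42) together with (48) and (49), can be exercised numerically. The planar range-bearing model, its Jacobians, and all numeric values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def g(theta, s):
    """Measurement model: range and bearing from pose s=(x,y,phi) to landmark theta."""
    dx, dy = theta[0] - s[0], theta[1] - s[1]
    return np.array([np.hypot(dx, dy), np.arctan2(dy, dx) - s[2]])

def jacobians(theta, s):
    """G_theta = dg/dtheta and G_s = dg/ds, evaluated at (theta, s)."""
    dx, dy = theta[0] - s[0], theta[1] - s[1]
    q = dx**2 + dy**2
    r = np.sqrt(q)
    G_theta = np.array([[dx / r, dy / r],
                        [-dy / q, dx / q]])
    G_s = np.array([[-dx / r, -dy / r, 0.0],
                    [dy / q, -dx / q, -1.0]])
    return G_theta, G_s

mu = np.array([5.0, 3.0])                   # landmark mean mu_{n,t-1}
Sigma = 0.01 * np.eye(2)                    # landmark covariance Sigma_{n,t-1}
s_hat = np.array([1.0, 1.0, 0.2])           # predicted pose h(s_{t-1}, u_t), Eq. (37)
P = np.diag([0.2, 0.2, 0.05])               # pose (control) noise P_t
R = np.diag([0.1, 0.01])                    # measurement noise R_t
z = g(mu, s_hat) + np.array([0.05, -0.01])  # a measurement near the prediction

z_hat = g(mu, s_hat)                                             # (36)
G_theta, G_s = jacobians(mu, s_hat)                              # (38), (39)
Q = R + G_theta @ Sigma @ G_theta.T                              # (42)
Sigma_s = np.linalg.inv(G_s.T @ np.linalg.inv(Q) @ G_s
                        + np.linalg.inv(P))                      # (40)/(48)
mu_s = Sigma_s @ G_s.T @ np.linalg.inv(Q) @ (z - z_hat) + s_hat  # (41)/(49)
# (mu_s, Sigma_s) parameterize the Gaussian from which the new pose is drawn.
```

Because Σ_s^{-1} = G_s^T Q^{-1} G_s + P^{-1} dominates P^{-1}, the proposal is never wider than the motion model alone; incorporating the measurement can only tighten it.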
Equation (14) already specified the posterior over observed features, here written with the particle index m:

 p(θ_{n_t} | s^{t,[m]}, n^t, z^t) = η p(z_t | θ_{n_t}, s_t^{[m]}, n_t) p(θ_{n_t} | s^{t-1,[m]}, z^{t-1}, n^{t-1})   (50)

where the first term on the right-hand side is the Gaussian N(z_t; g(θ_{n_t}, s_t^{[m]}), R_t) and the second is the Gaussian N(θ_{n_t}; μ_{n_t,t-1}^{[m]}, Σ_{n_t,t-1}^{[m]}). As in (34), the nonlinearity of g causes this posterior to be non-Gaussian, which is at odds with FastSLAM 2.0's Gaussian representation for feature estimates. Luckily, the exact same linearization as above provides the solution:

 g(θ_{n_t}, s_t) ≈ ẑ_t + G_θ (θ_{n_t} − μ_{n_t,t-1}^{[m]})   (51)

(Notice that s_t is not a free variable here; hence we can omit the third term in (35).) This approximation renders the probability (50) Gaussian in the target variable θ_{n_t}:

 p(θ_{n_t} | s^{t,[m]}, n^t, z^t) = η exp{ −1/2 (z_t − ẑ_t − G_θ (θ_{n_t} − μ_{n_t,t-1}^{[m]}))^T R_t^{-1} (z_t − ẑ_t − G_θ (θ_{n_t} − μ_{n_t,t-1}^{[m]})) − 1/2 (θ_{n_t} − μ_{n_t,t-1}^{[m]})^T (Σ_{n_t,t-1}^{[m]})^{-1} (θ_{n_t} − μ_{n_t,t-1}^{[m]}) }   (52)

The new mean and covariance are obtained using the standard EKF measurement update equations [29, 40], whose derivation can be found in Appendix A:

 K_t = Σ_{n_t,t-1}^{[m]} G_θ^T Q_t^{-1}   (53)
 μ_{n_t,t}^{[m]} = μ_{n_t,t-1}^{[m]} + K_t (z_t − ẑ_t)   (54)
 Σ_{n_t,t}^{[m]} = (I − K_t G_θ) Σ_{n_t,t-1}^{[m]}   (55)

5.3 Calculating Importance Factors

The particles generated thus far do not yet match the desired posterior. In FastSLAM 2.0, the culprit is the normalizer η^{[m]} in (34), which may be different for different particles m. These differences are not yet accounted for in the resampling process. As in FastSLAM 1.0, the importance factor is given by the following quotient:

 w_t^{[m]} = target distribution / proposal distribution   (56)

Once again, the target distribution that we would like our particles to assume is the path posterior p(s^{t,[m]} | z^t, u^t, n^t). Under the (asymptotically correct) assumption that the paths in s^{t-1,[m]} were generated according to the target distribution one time step earlier, p(s^{t-1,[m]} | z^{t-1}, u^{t-1}, n^{t-1}), we note that the proposal distribution is now given by the product

 p(s^{t-1,[m]} | z^{t-1}, u^{t-1}, n^{t-1}) p(s_t^{[m]} | s^{t-1,[m]}, u^t, z^t, n^t)   (57)

The second term in this product is the pose sampling distribution (34). The importance weight is obtained as follows:

 w_t^{[m]} = p(s^{t,[m]} | u^t, z^t, n^t) / [ p(s_t^{[m]} | s^{t-1,[m]}, u^t, z^t, n^t) p(s^{t-1,[m]} | u^{t-1}, z^{t-1}, n^{t-1}) ]
           = [ p(s_t^{[m]} | s^{t-1,[m]}, u^t, z^t, n^t) p(s^{t-1,[m]} | u^t, z^t, n^t) ] / [ p(s_t^{[m]} | s^{t-1,[m]}, u^t, z^t, n^t) p(s^{t-1,[m]} | u^{t-1}, z^{t-1}, n^{t-1}) ]
           = p(s^{t-1,[m]} | u^t, z^t, n^t) / p(s^{t-1,[m]} | u^{t-1}, z^{t-1}, n^{t-1})
  [Bayes]  = η p(z_t | s^{t-1,[m]}, u^t, z^{t-1}, n^t) p(s^{t-1,[m]} | u^t, z^{t-1}, n^t) / p(s^{t-1,[m]} | u^{t-1}, z^{t-1}, n^{t-1})
  [Markov] = η p(z_t | s^{t-1,[m]}, u^t, z^{t-1}, n^t) p(s^{t-1,[m]} | u^{t-1}, z^{t-1}, n^{t-1}) / p(s^{t-1,[m]} | u^{t-1}, z^{t-1}, n^{t-1})
           = η p(z_t | s^{t-1,[m]}, u^t, z^{t-1}, n^t)   (58)

The reader may notice that this expression is in essence the inverse of the normalization constant η^{[m]} in (34). Further transformations give us the following form:

 w_t^{[m]} = η ∫ p(z_t | s_t, s^{t-1,[m]}, u^t, z^{t-1}, n^t) p(s_t | s^{t-1,[m]}, u^t, z^{t-1}, n^t) ds_t
  [Markov] = η ∫ p(z_t | s_t, s^{t-1,[m]}, u^t, z^{t-1}, n^t) p(s_t | s_{t-1}^{[m]}, u_t) ds_t
           = η ∫ [∫ p(z_t | θ_{n_t}, s_t, n_t) p(θ_{n_t} | s_t, s^{t-1,[m]}, u^t, z^{t-1}, n^t) dθ_{n_t}] p(s_t | s_{t-1}^{[m]}, u_t) ds_t
  [Markov] = η ∫ [∫ p(z_t | θ_{n_t}, s_t, n_t) p(θ_{n_t} | s^{t-1,[m]}, u^{t-1}, z^{t-1}, n^{t-1}) dθ_{n_t}] p(s_t | s_{t-1}^{[m]}, u_t) ds_t   (59)

where the three factors are the Gaussians N(z_t; g(θ_{n_t}, s_t), R_t), N(θ_{n_t}; μ_{n_t,t-1}^{[m]}, Σ_{n_t,t-1}^{[m]}), and N(s_t; ŝ_t^{[m]}, P_t), respectively. We find that this expression can again be approximated by a Gaussian over measurements z_t by linearizing g. As is easily shown, the mean of the resulting Gaussian is ẑ_t, and its covariance is

 L_t^{[m]} = G_s P_t G_s^T + G_θ Σ_{n_t,t-1}^{[m]} G_θ^T + R_t   (60)

Put differently, the (non-normalized) importance factor of the m-th particle is given by the following expression:

 w_t^{[m]} = |2π L_t^{[m]}|^{-1/2} exp{ −1/2 (z_t − ẑ_t)^T (L_t^{[m]})^{-1} (z_t − ẑ_t) }   (61)

As in FastSLAM 1.0, the particles generated in Steps 1 and 2, along with their importance factors calculated in Step 3, are collected in a temporary particle set. The final step of the FastSLAM 2.0 update is a resampling step. Just like FastSLAM 1.0, FastSLAM 2.0 draws (with replacement) M particles from the temporary particle set. Each particle is drawn with a probability proportional to its importance factor w_t^{[m]}. The resulting particle set represents (asymptotically) the desired factored posterior at time t.

6 Unknown Data Association

6.1 Data Association in SLAM

The biggest limitation of both FastSLAM algorithms, as described so far, has been the assumption of known data association. Real-world features are usually ambiguous. This section extends the FastSLAM algorithms to cases where the correspondence variables n_t are unknown.

Figure 8: The data association problem in SLAM.
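For a one-dimensional linear measurement model g(θ, s) = θ − s, the Jacobians are exact (G_θ = 1, G_s = −1), and the landmark update (53)-(55) together with the importance factor (60)-(61) reduces to a few lines. All numeric values are illustrative assumptions:

```python
import numpy as np

mu_prev, Sigma_prev = 4.0, 0.5     # landmark estimate at t-1
s, s_hat, P = 1.1, 1.0, 0.2        # sampled pose, predicted pose, pose noise
R = 0.1                            # measurement noise
z = 3.05                           # observed distance to the landmark
G_theta, G_s = 1.0, -1.0           # exact Jacobians of g(theta, s) = theta - s

z_hat = mu_prev - s_hat                          # predicted measurement
Q = R + G_theta * Sigma_prev * G_theta           # (42)
K = Sigma_prev * G_theta / Q                     # (53) Kalman gain
mu_new = mu_prev + K * (z - z_hat)               # (54) updated landmark mean
Sigma_new = (1.0 - K * G_theta) * Sigma_prev     # (55) updated landmark covariance

L = G_s * P * G_s + G_theta * Sigma_prev * G_theta + R             # (60)
w = (2 * np.pi * L) ** -0.5 * np.exp(-0.5 * (z - z_hat) ** 2 / L)  # (61)
```

The update shrinks the landmark covariance (here from 0.5 to 1/12) while the weight w scores how plausible the measurement was for this particle before the update.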

Formally, the data association problem at time t is the problem of determining the variable n_t based on the available data. This problem is illustrated in Figure 8: here, a robot observes two landmarks. Depending on its actual pose relative to these landmarks, these measurements correspond to different landmarks in the map (depicted as stars in Figure 8). The classical solution to the data association problem [4, 14, 17] is to choose n_t so that it maximizes the likelihood of the sensor measurement z_t:

 n̂_t = argmax_{n_t} p(z_t | n_t, n̂^{t-1}, s^t, z^{t-1}, u^t)   (62)

Such an estimator is called a maximum likelihood (ML) estimator, and the term p(z_t | n_t, n̂^{t-1}, s^t, z^{t-1}, u^t) is usually referred to as the likelihood. ML data association is often referred to as the nearest neighbor method, interpreting the negative log likelihood as a distance function. For Gaussians, the negative log likelihood is a Mahalanobis distance, and ML selects data associations by minimizing this Mahalanobis distance. An alternative to the ML method is data association sampling (DAS):

 n̂_t ~ η p(z_t | n_t, n̂^{t-1}, s^t, z^{t-1}, u^t)   (63)

DAS samples the data association variable according to the likelihood function, instead of deterministically selecting its most likely value. Both techniques, ML and DAS, make it possible to estimate the number of features in the map. SLAM techniques using ML create a new feature in the map if the likelihood falls below a threshold p_0 for all known features in the map. DAS associates an observed measurement with a new, previously unobserved feature stochastically, with probability proportional to η p_0, where η is the normalizer defined in (63). In EKF-style approaches to the SLAM problem, ML is usually given preference over DAS, since the number of data association errors in ML is smaller. Because only a single data association decision is made for each measurement in most EKF-based implementations, these approaches tend to be brittle with regard to data association errors. A single data association error can induce significant errors in the map, which in turn cause new data association errors, often with fatal consequences.
Therefore, the corresponding SLAM algorithms tend to work well only when ambiguous features in the environment are spaced sufficiently far apart from each other to make confusions unlikely. For this reason, many implementations of SLAM extract sparse features from otherwise rich sensor measurements.

6.2 Data Association in FastSLAM

The key advantage of FastSLAM over EKF-style approaches is its ability to pursue multiple data association hypotheses at the same time. This is due to the fact that the posterior is represented by multiple particles. In particular, FastSLAM estimates the correspondences on a per-particle basis, not on a per-filter basis as is the case for the EKF. This enables FastSLAM to use ML or even DAS for generating particle-specific data association decisions. As long as a small subset of the particles is based on the correct data association, data association errors are not as fatal as in EKF approaches. This is because particles subject to such errors tend to possess inconsistent maps, which increases the probability that they are simply sampled away in future resampling steps.

The mathematical definition of the per-particle data association is straightforward. Each particle maintains a local set of data association variables, denoted n̂_t^{[m]}. In ML data association, each n̂_t^{[m]} is determined by maximizing the likelihood of the measurement z_t:

 n̂_t^{[m]} = argmax_{n_t} p(z_t | n_t, n̂^{t-1,[m]}, s^{t,[m]}, z^{t-1}, u^t)   (64)

DAS samples from the likelihood:

 n̂_t^{[m]} ~ η p(z_t | n_t, n̂^{t-1,[m]}, s^{t,[m]}, z^{t-1}, u^t)   (65)

For both techniques, the likelihood is calculated as follows:

 p(z_t | n_t, n̂^{t-1,[m]}, s^{t,[m]}, z^{t-1}, u^t)
  = ∫ p(z_t | θ_{n_t}, n_t, n̂^{t-1,[m]}, s^{t,[m]}, z^{t-1}, u^t) p(θ_{n_t} | n_t, n̂^{t-1,[m]}, s^{t,[m]}, z^{t-1}, u^t) dθ_{n_t}
  = ∫ p(z_t | θ_{n_t}, n_t, s_t^{[m]}) p(θ_{n_t} | n̂^{t-1,[m]}, s^{t-1,[m]}, z^{t-1}) dθ_{n_t}   (66)

where the two factors are the Gaussians N(z_t; g(θ_{n_t}, s_t^{[m]}), R_t) and N(θ_{n_t}; μ_{n_t,t-1}^{[m]}, Σ_{n_t,t-1}^{[m]}), respectively. Linearization of g enables us to obtain this likelihood in closed form:

 p(z_t | n_t, n̂^{t-1,[m]}, s^{t,[m]}, z^{t-1}, u^t) = |2πQ_t|^{-1/2} exp{ −1/2 (z_t − g(μ_{n_t,t-1}^{[m]}, s_t^{[m]}))^T Q_t^{-1} (z_t − g(μ_{n_t,t-1}^{[m]}, s_t^{[m]})) }   (67)

The variable Q_t was defined in Equation (42), as a function of the data association variable n_t. New features are added to the map in exactly the same way as outlined above: in the ML approach, a new feature is added when the probability p(z_t | n_t, n̂^{t-1,[m]}, s^{t,[m]}, z^{t-1}, u^t) falls below the threshold p_0 for all known features in the map; DAS includes the hypothesis that an observation corresponds to a previously unobserved feature in its set of hypotheses, and samples it with probability proportional to η p_0. To accommodate the particle-specific map size, each particle carries its own feature count, denoted N_t^{[m]}.

6.3 Feature Initialization

As in EKF implementations of SLAM, initializing a newly added Kalman filter can be tricky, especially when individual measurements are insufficient to constrain the feature's location in all dimensions [15]. In many SLAM problems, the measurement function g is invertible. This is the case, for example, for robots measuring range and bearing to landmarks in the plane, in which a single measurement suffices to produce a (non-degenerate) estimate of the feature location. The initialization of the EKF is then straightforward:

 s_t^{[m]} ~ p(s_t | s_{t-1}^{[m]}, u_t)   (68)
 μ_{n_t,t}^{[m]} = g^{-1}(z_t, s_t^{[m]})   (69)
 Σ_{n_t,t}^{[m]} = (G_{n̂_t}^T R_t^{-1} G_{n̂_t})^{-1}   (70)
 w_t^{[m]} = p_0   (71)

Notice that for newly observed features, the pose s_t^{[m]} is sampled according to the motion model p(s_t | s_{t-1}^{[m]}, u_t). This distribution is equivalent to the FastSLAM 2.0 sampling distribution (33) in situations where no previous location estimate for the observed feature is available. Initialization techniques for situations in which g is not invertible are discussed in [15].
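Per-particle ML data association, Equations (64) and (67), together with the new-feature threshold p_0 and the initialization (69), can be sketched for the 1-D model g(θ, s) = θ − s. The landmark values and the threshold below are made up for illustration:

```python
import numpy as np

def likelihood(z, mu, Sigma, s, R):
    """Gaussian likelihood (67) of z for landmark (mu, Sigma) seen from pose s."""
    Q = R + Sigma                 # (42), with G_theta = 1 for this linear model
    innov = z - (mu - s)
    return (2 * np.pi * Q) ** -0.5 * np.exp(-0.5 * innov**2 / Q)

landmarks = [(4.0, 0.3), (9.0, 0.3)]   # (mu, Sigma) per feature in this particle
s, R, p0 = 1.0, 0.1, 1e-3              # pose, sensor noise, new-feature threshold

def associate(z):
    """Return the ML feature index, or len(landmarks) for 'new feature'."""
    p = [likelihood(z, mu, Sig, s, R) for mu, Sig in landmarks]
    p.append(p0)                        # hypothesis: previously unobserved feature
    return int(np.argmax(p))

n1 = associate(3.1)    # close to landmark 0 (its predicted measurement is 3.0)
n2 = associate(20.0)   # far from every known feature -> new feature
mu_new = 20.0 + s      # initialization (69): mu = g^{-1}(z, s) = z + s
```

Each particle runs this test against its own map, so different particles may commit to different correspondences, which is the source of the robustness discussed in Section 6.2.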
In general, such situations require the accumulation of multiple measurements to obtain a good estimate for the linearization of g.

6.4 Feature Elimination and Negative Information

To accommodate features introduced erroneously into the map, FastSLAM features a mechanism for eliminating features that are not supported by sufficient evidence. In particular, our approach keeps track of the probabilities of the actual existence of individual features in the map, a technique commonly used in EKF-style algorithms [17]. Let i_n ∈ {0, 1} be a binary variable that indicates the existence of the feature θ_n. Our approach exploits the fact that each sensor measurement z_t carries evidence with regard to the physical existence of nearby features θ_n: observing the feature provides positive evidence for its existence, whereas not observing it when μ_n^{[m]} falls within the robot's perceptual range provides negative evidence. The resulting posterior probability

 p(i_n | n̂^{t,[m]}, s^{t,[m]}, z^{t-1})   (72)

is estimated by a binary Bayes filter, familiar from the literature on occupancy grid maps [49]. FastSLAM represents this posterior in its log-odds form:

 τ_n^{[m]} = ln [ p(i_n | n̂^{t,[m]}, s^{t,[m]}, z^{t-1}) / (1 − p(i_n | n̂^{t,[m]}, s^{t,[m]}, z^{t-1})) ]   (73)

The advantage of this representation lies in the fact that updates are additive (see [66] for a derivation). In the simplest implementation, observing a feature leads to the addition of a positive value

 ρ+ = ln [ p(i_{n̂_t} | s_t, z_t, n̂_t) / (1 − p(i_{n̂_t} | s_t, z_t, n̂_t)) ]   (74)

to the log-odds value, and not observing it leads to the addition of a negative value

 ρ− = ln [ p(i_n | s_t, z_t, n̂_t) / (1 − p(i_n | s_t, z_t, n̂_t)) ]   (75)

To implement this approach in real time, the variable τ_n^{[m]} is initialized at the time a feature is first introduced into the map. Features are terminated when their log-odds of existence falls below a certain bound. This mechanism enables FastSLAM's particles to free themselves of spurious features.

6.5 The FastSLAM Algorithms

Tables 1 and 2 summarize both FastSLAM algorithms. In both algorithms, particles are of the form

 S_t^{[m]} = ⟨ s_t^{[m]}, N_t^{[m]}, μ_{1,t}^{[m]}, Σ_{1,t}^{[m]}, τ_1^{[m]}, …, μ_{N_t,t}^{[m]}, Σ_{N_t,t}^{[m]}, τ_{N_t}^{[m]} ⟩   (76)

In addition to the pose s_t^{[m]} and the feature estimates μ_{n,t}^{[m]} and Σ_{n,t}^{[m]}, each particle maintains the number of features N_t^{[m]} in its local map, and each feature carries a probabilistic estimate of its existence, τ_n^{[m]}. Iterating the filter requires time linear in the maximum number of features max_m N_t^{[m]} in the maps; it is also linear in the number of particles M. Further below, we will discuss advanced data structures that yield more efficient implementations. We note that both versions of FastSLAM, as described here, consider a single measurement at a time. As discussed above, this choice has been made for notational convenience.
Most competitive SLAM implementations (including ours) consider multiple features in the data association step [3, 26, 53, 65]. Doing so tends to decrease the data association error rate, and the resulting maps become more accurate. This follows from a mutual exclusion property, which states that no landmark can be observed at two different locations at the same time [16]. Like many other implementations before, our implementation considers all features observed in a single sensor scan when calculating the measurement likelihood. The necessary modification of FastSLAM is straightforward but will not be further elaborated here. Below, we will provide an empirical comparison of both variants of this algorithm, highlighting the advantages of FastSLAM 2.0 over 1.0.
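The log-odds existence filter of Section 6.4, Equations (73) through (75), is additive per observation. A minimal sketch, with made-up evidence probabilities (0.8 for a hit, 0.3 for a miss):

```python
import math

def log_odds(p):
    return math.log(p / (1.0 - p))

rho_plus = log_odds(0.8)    # (74): positive evidence from observing the feature
rho_minus = log_odds(0.3)   # (75): negative evidence from a missed observation

tau = 0.0                   # log-odds of existence, set when the feature is added
for observed in [True, True, False, True]:
    tau += rho_plus if observed else rho_minus

# Convert back to a probability; features whose tau drops below a bound
# are discarded, as described in Section 6.4.
p_exists = 1.0 / (1.0 + math.exp(-tau))
```

A feature seen three times and missed once remains very likely to exist; a long run of misses drives τ below the termination bound, and the feature is removed from the particle's map.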

Algorithm FastSLAM 1.0(z_t, u_t, S_{t-1}):
  for m = 1 to M do                                                  // loop over all particles
    retrieve ⟨s_{t-1}^{[m]}, N_{t-1}^{[m]}, μ_{1,t-1}^{[m]}, Σ_{1,t-1}^{[m]}, i_1^{[m]}, …, μ_{N,t-1}^{[m]}, Σ_{N,t-1}^{[m]}, i_N^{[m]}⟩ from S_{t-1}
    s_t^{[m]} ~ p(s_t | s_{t-1}^{[m]}, u_t)                          // sample new pose
    for n = 1 to N_{t-1}^{[m]} do                                    // calculate measurement likelihoods
      ẑ_n = g(μ_{n,t-1}^{[m]}, s_t^{[m]})                            // measurement prediction
      G_n = ∇g(s_t^{[m]}, μ_{n,t-1}^{[m]})                           // calculate Jacobian
      Q_n = G_n^T Σ_{n,t-1}^{[m]} G_n + R_t                          // measurement covariance
      w_n = |2πQ_n|^{-1/2} exp{−1/2 (z_t − ẑ_n)^T Q_n^{-1} (z_t − ẑ_n)}   // likelihood of correspondence
    endfor
    w_{N_{t-1}+1} = p_0                                              // importance factor of new landmark
    n̂ = argmax_{n=1,…,N_{t-1}+1} w_n                                 // max likelihood correspondence
    N_t^{[m]} = max{N_{t-1}^{[m]}, n̂}                                // new number of features in map
    w^{[m]} = w_{n̂}                                                  // importance factor of particle
    for n = 1 to N_t^{[m]} do                                        // update Kalman filters
      if n = n̂ = N_{t-1}^{[m]} + 1 then                              // is new feature?
        μ_{n,t}^{[m]} = g^{-1}(z_t, s_t^{[m]})                       // initialize mean
        Σ_{n,t}^{[m]} = G_{n̂}^{-1} R_t (G_{n̂}^{-1})^T               // initialize covariance
        i_{n,t}^{[m]} = 1                                            // initialize counter
      else if n = n̂ then                                             // is observed feature?
        K = Σ_{n,t-1}^{[m]} G_{n̂} Q_{n̂}^{-1}                        // calculate Kalman gain
        μ_{n,t}^{[m]} = μ_{n,t-1}^{[m]} + K (z_t − ẑ_{n̂})            // update mean
        Σ_{n,t}^{[m]} = (I − K G_{n̂}^T) Σ_{n,t-1}^{[m]}              // update covariance
        i_{n,t}^{[m]} = i_{n,t-1}^{[m]} + 1                          // increment counter
      else                                                           // all other features
        μ_{n,t}^{[m]} = μ_{n,t-1}^{[m]}                              // copy old mean
        Σ_{n,t}^{[m]} = Σ_{n,t-1}^{[m]}                              // copy old covariance
        if μ_{n,t-1}^{[m]} outside perceptual range of s_t^{[m]} then   // should feature have been observed?
          i_{n,t}^{[m]} = i_{n,t-1}^{[m]}                            // no, do not change
        else
          i_{n,t}^{[m]} = i_{n,t-1}^{[m]} − 1                        // yes, decrement counter
          if i_{n,t}^{[m]} < 0 then discard feature n endif          // discard dubious features
        endif
      endif
    endfor
    add ⟨s_t^{[m]}, N_t^{[m]}, μ_{1,t}^{[m]}, Σ_{1,t}^{[m]}, i_{1,t}^{[m]}, …, μ_{N,t}^{[m]}, Σ_{N,t}^{[m]}, i_{N,t}^{[m]}⟩ to S_aux
  endfor
  S_t = ∅                                                            // construct new particle set
  for m = 1 to M do                                                  // resample M particles
    draw random index m with probability ∝ w^{[m]}                   // resample
    add ⟨s_t^{[m]}, N_t^{[m]}, μ_{1,t}^{[m]}, Σ_{1,t}^{[m]}, i_{1,t}^{[m]}, …, μ_{N,t}^{[m]}, Σ_{N,t}^{[m]}, i_{N,t}^{[m]}⟩ to S_t
  endfor
  return S_t
end algorithm

Table 1: Summary of the algorithm FastSLAM 1.0 with unknown data association, as published in [45]. This version does not implement any of the efficient tree representations discussed in the paper, and it relies on an inferior proposal distribution. Its chief advantage is that it is easier to implement than FastSLAM 2.0.

Algorithm FastSLAM 2.0(z_t, u_t, S_{t-1}):
  for m = 1 to M do                                                  // loop over all particles
    retrieve ⟨s_{t-1}^{[m]}, N_{t-1}^{[m]}, μ_{1,t-1}^{[m]}, Σ_{1,t-1}^{[m]}, τ_1^{[m]}, …, μ_{N,t-1}^{[m]}, Σ_{N,t-1}^{[m]}, τ_N^{[m]}⟩ from S_{t-1}
    for n = 1 to N_{t-1}^{[m]} do                                    // calculate sampling distribution
      ŝ_t = h(s_{t-1}^{[m]}, u_t);  ẑ_{t,n} = g(μ_{n,t-1}^{[m]}, ŝ_t)
      G_{θ,n} = ∇_θ g(θ_n, s_t)|_{s_t=ŝ_t; θ_n=μ_{n,t-1}^{[m]}};  G_{s,n} = ∇_s g(θ_n, s_t)|_{s_t=ŝ_t; θ_n=μ_{n,t-1}^{[m]}}
      Q_{t,n} = R_t + G_{θ,n} Σ_{n,t-1}^{[m]} G_{θ,n}^T
      Σ_{s,n} = [G_{s,n}^T Q_{t,n}^{-1} G_{s,n} + P_t^{-1}]^{-1};  μ_{s,n}^{[m]} = Σ_{s,n} G_{s,n}^T Q_{t,n}^{-1} (z_t − ẑ_{t,n}) + ŝ_t
      s_{t,n}^{[m]} ~ N(μ_{s,n}^{[m]}, Σ_{s,n})                      // sample pose
      p_n = |2πQ_{t,n}|^{-1/2} exp{−1/2 (z_t − g(μ_{n,t-1}^{[m]}, s_{t,n}^{[m]}))^T Q_{t,n}^{-1} (z_t − g(μ_{n,t-1}^{[m]}, s_{t,n}^{[m]}))}
    endfor
    p_{N_{t-1}+1} = p_0                                              // likelihood of new feature
    n̂ = argmax_n p_n, or draw random n̂ with probability ∝ p_n        // data association (ML or DAS)
    for n = 1 to N_t^{[m]} do                                        // process measurement
      if n = n̂ ≤ N_{t-1}^{[m]} then                                  // known feature?
        N_t^{[m]} = N_{t-1}^{[m]};  τ_{n,t}^{[m]} = τ_{n,t-1}^{[m]} + ρ+;  K = Σ_{n̂,t-1}^{[m]} G_{θ,n̂}^T Q_{t,n̂}^{-1}
        μ_{n̂,t}^{[m]} = μ_{n̂,t-1}^{[m]} + K (z_t − ẑ_{t,n̂});  Σ_{n̂,t}^{[m]} = (I − K G_{θ,n̂}) Σ_{n̂,t-1}^{[m]}
        L^{[m]} = G_{s,n̂} P_t G_{s,n̂}^T + G_{θ,n̂} Σ_{n̂,t-1}^{[m]} G_{θ,n̂}^T + R_t
        w^{[m]} = |2πL^{[m]}|^{-1/2} exp{−1/2 (z_t − ẑ_{t,n̂})^T (L^{[m]})^{-1} (z_t − ẑ_{t,n̂})}
      else if n = n̂ = N_{t-1}^{[m]} + 1 then                         // new feature?
        N_t^{[m]} = N_{t-1}^{[m]} + 1;  τ_{n,t}^{[m]} = ρ+;  w^{[m]} = p_0
        s_{t,n}^{[m]} ~ p(s_t | s_{t-1}^{[m]}, u_t);  G_{θ,n} = ∇_θ g(θ_n, s_t)|_{s_t=s_{t,n}^{[m]}; θ_n=μ_n}
        μ_{n,t}^{[m]} = g^{-1}(z_t, s_{t,n}^{[m]});  Σ_{n,t}^{[m]} = (G_{θ,n}^T R_t^{-1} G_{θ,n})^{-1}
      else if n ≠ n̂ and n ≤ N_{t-1}^{[m]} then                       // handle unobserved features
        μ_{n,t}^{[m]} = μ_{n,t-1}^{[m]};  Σ_{n,t}^{[m]} = Σ_{n,t-1}^{[m]}
        if μ_{n,t}^{[m]} outside range(s_{t,n̂}^{[m]}) then           // outside sensor range?
          τ_{n,t}^{[m]} = τ_{n,t-1}^{[m]}
        else                                                         // inside sensor range
          τ_{n,t}^{[m]} = τ_{n,t-1}^{[m]} + ρ−                       // add negative evidence (ρ− < 0)
          if τ_{n,t}^{[m]} < 0 then remove feature n endif           // discontinue feature?
        endif
      endif
    endfor
    add ⟨s_{t,n̂}^{[m]}, N_t^{[m]}, μ_{1,t}^{[m]}, Σ_{1,t}^{[m]}, τ_1^{[m]}, …, μ_{N,t}^{[m]}, Σ_{N,t}^{[m]}, τ_N^{[m]}⟩ to S_aux
  endfor                                                             // end loop over all particles
  S_t = ∅                                                            // construct new particle set
  for m = 1 to M do                                                  // generate M particles
    draw random index m with probability ∝ w^{[m]}                   // resample
    add ⟨s_t^{[m]}, N_t^{[m]}, μ_{1,t}^{[m]}, Σ_{1,t}^{[m]}, τ_1^{[m]}, …, μ_{N,t}^{[m]}, Σ_{N,t}^{[m]}, τ_N^{[m]}⟩ to S_t
  endfor
  return S_t
end algorithm

Table 2: The FastSLAM 2.0 algorithm, stated here for unknown data association.
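Table 2 condenses to very little code in the linear-Gaussian setting of Section 7, where g(θ, s) = θ − s and h(u, s) = u + s. The following toy run uses known data association, a single landmark anchored by a tight prior, and M = 1 particle; all noise values are our assumptions. It illustrates the update cycle and foreshadows the convergence theorem of Section 7:

```python
import numpy as np

rng = np.random.default_rng(1)
P, R = 0.05, 0.02                    # control and measurement noise variances
true_lm, s_true = 10.0, 0.0
mu, Sigma = 10.0, 1e-6               # landmark "known in advance" (tight prior)
s_est = 0.0

for _ in range(200):
    u = 0.1                                           # commanded motion
    s_true += u + rng.normal(0, np.sqrt(P))           # true pose drifts
    z = true_lm - s_true + rng.normal(0, np.sqrt(R))  # measured distance, (77)

    # FastSLAM 2.0 proposal (40)-(41) with exact Jacobians G_theta=1, G_s=-1:
    s_hat = s_est + u                                 # motion model, (78)
    Q = R + Sigma                                     # (42)
    var_s = 1.0 / (1.0 / Q + 1.0 / P)                 # (40)
    mean_s = var_s * -(z - (mu - s_hat)) / Q + s_hat  # (41)
    s_est = rng.normal(mean_s, np.sqrt(var_s))        # sample the new pose

    # Landmark EKF update (53)-(55):
    K = Sigma / (R + Sigma)
    mu = mu + K * (z - (mu - s_est))
    Sigma = (1.0 - K) * Sigma

err = abs(mu - true_lm)   # the landmark error stays small with one particle
```

Even with a single particle, the landmark estimate stays pinned near its true location, because the anchored prior and the measurement-aware proposal keep the pose and the map mutually consistent.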

7 Convergence of FastSLAM 2.0 For Linear-Gaussian SLAM

In this section, we establish a convergence result for FastSLAM 2.0. This result crucially exploits the proposal distribution in FastSLAM 2.0 and is therefore not directly applicable to FastSLAM 1.0. It applies to a subset of all SLAM problems, namely linear SLAM problems with Gaussian noise. LG-SLAM problems are defined to possess motion and measurement models of the following linear form:

 g(θ_{n_t}, s_t) = θ_{n_t} − s_t   (77)
 h(u_t, s_{t-1}) = u_t + s_{t-1}   (78)

The LG-SLAM framework can be thought of as a robot operating in a Cartesian space, equipped with a noise-free compass and sensors that measure distances to features along the coordinate axes. While LG-SLAM is clearly too restrictive to be of practical significance, it plays an important role in the literature. To our knowledge, the only known convergence proof for a SLAM algorithm is a recently published result for Kalman filters (KFs) applied to specific linear-Gaussian problems. As shown in [17, 54], the KF approach (which is equivalent to the EKF for linear-Gaussian SLAM problems) converges to a state in which all map features are fully correlated. If the location of one feature is known, the KF asymptotically recovers the locations of all other features.

The central convergence result in this paper is the following:

Theorem. Linear-Gaussian FastSLAM 2.0 converges in expectation to the correct map with M = 1 particle if all features are observed infinitely often, and if the location of one feature is known in advance. If no feature location is known in advance, the map will be correct in relative terms, up to a fixed offset that uniformly applies to all features.

The proof of this result can be found in Appendix B. Its significance lies in the fact that it shows that for specific SLAM problems, FastSLAM 2.0 may converge with a finite number of particles. In particular, the number of particles required for convergence in LG-SLAM is independent of the size of the map N. This result holds even if all features are arranged in a large cycle, a situation often thought of as the worst case for SLAM problems [26]. However, our analysis says nothing about the convergence speed of the algorithm, which in practice depends on the particle set size M. Below, we will investigate the speed of convergence through empirical means.

8 Efficient Implementation

At first glance, it may appear that each update in FastSLAM requires time O(MN), where M is the number of particles and N the number of features in the map. The linear complexity in M is unavoidable, given that we have to process M particles with each update. The linear complexity in N is the result of the resampling process: whenever a particle is drawn more than once in the resampling process, a naive implementation might duplicate the entire map attached to the particle. Such a duplication process is linear in the size of the map N. Furthermore, a naive implementation of data association may result in evaluating the measurement likelihood for each of the N features in the map, resulting again in linear complexity in N. We note that a poor implementation of the sampling process might easily add another factor of log N to the update complexity.

FastSLAM iterations can, however, be executed in O(M log N) time; in particular, FastSLAM updates can be implemented in time logarithmic in the size of the map N. First, consider the situation with known data association. Linear copying costs can be avoided by introducing a data structure for representing particles that allows for more selective updates. The basic idea is to organize the map as a balanced binary tree. Figure 9a shows such a tree for a single particle, in the case of N = 8 features.

Figure 9: (a) A tree representing N = 8 feature estimates within a single particle. (b) Generating a new particle from an old one, while modifying only a single Gaussian. The new particle receives only a partial tree, consisting of a path to the modified Gaussian. All other pointers are copied from the generating tree. This can be done in time logarithmic in N.

Notice that the Gaussian parameters μ_k and Σ_k are located at the leaves of the tree. Assuming that the tree is balanced, accessing a leaf requires time logarithmic in N. Suppose FastSLAM incorporates a new control u_t and a new measurement z_t. Each new particle in S_t will differ from the corresponding one in S_{t-1} in two ways: first, it will possess a different pose estimate obtained via (33), and second, the observed feature's Gaussian will have been updated, as specified in Equations (54) and (55). All other Gaussian feature estimates, however, will be equivalent to those of the generating particle. When copying the particle, thus, only a single path has to be modified in the tree representing all Gaussians. An example is shown in Figure 9b: here we assume n_t = 3, that is, only the Gaussian parameters μ_3 and Σ_3 are updated. Instead of generating an entirely new tree, only a single path is created, leading to the Gaussian n_t = 3. This path is an incomplete tree; it is completed by copying the missing pointers from the tree of the generating particle. Thus, branches that leave the path will point to the same (unmodified) subtrees as those of the generating tree. Clearly, generating this tree takes only time logarithmic in N. Moreover, accessing a Gaussian also takes time logarithmic in N, since the number of steps required to navigate to a leaf of the tree is equivalent to the length of the path, which is by definition logarithmic. Thus, both generating and accessing a partial tree can be done in time O(log N). Since in each updating step M new particles are created, an entire update requires time O(M log N). The insight of using trees for efficient mapping can be found in [45]; a similar tree representation can be found in [21].

Figure 10: The utility car used for collecting outdoor data is equipped with a SICK laser range and bearing sensor, a linear variable differential transformer sensor for the steering, and a back wheel velocity encoder. This image shows the vehicle in the Victoria Park environment.

Organizing particles in trees raises the question as to when to deallocate memory. Memory deallocation can equally be implemented in amortized logarithmic time. The idea is to assign to each node, internal or leaf, a variable that counts the number of pointers pointing to it. The counter of a newly created node is initialized to 1; it is incremented as pointers to the node are created in other particles. Decrements occur when pointers are removed (e.g., pointers of pose particles that fail to survive the resampling process). When a counter reaches zero, its children's counters are decremented and the memory of the corresponding node is deallocated. The process is then applied recursively to all children of the node whose counters may have reached zero. This recursive process requires O(M log N) time on average. Furthermore, it can be shown to be an optimal deallocation algorithm in that all unneeded memory is freed instantaneously.

To obtain logarithmic time complexity for FastSLAM with unknown data association, further assumptions are needed. In particular, the number of features in the robot's sensor range must be independent of N; otherwise, simple operations such as keeping track of the existence posteriors τ_n^{[m]} may require more than logarithmic time. Furthermore, a DAS sampler must be restricted to features in the robot's vicinity, to avoid calculating the likelihood (63) for all N features. Finally, the number of rejected features should be small (e.g., within a constant factor of all accepted ones). All these assumptions are plausible when applying FastSLAM to real-world SLAM problems. Under these assumptions, variants of kd-trees [6, 48] can guarantee logarithmic-time search for high-likelihood features and for features in the robot's measurement range. Incremental techniques for constructing balanced kd-trees are described in [39, 59]. For example, the bkd-tree proposed in [59] maintains a sequence of trees of growing complexity. By carefully shifting items across those trees, logarithmic-time recall can be guaranteed, with amortized logarithmic time for inserting new features into the map. In this way, all necessary operations in FastSLAM can be carried out in logarithmic time on average.
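The path-copying tree of Section 8 (Figure 9) can be sketched as a small persistent binary tree: an update returns a new root that shares every untouched subtree with the old particle, so both the copy and a lookup cost O(log N). This is an illustrative implementation of the idea, not the authors' code:

```python
class Node:
    """Internal node or leaf of a particle's map tree (Figure 9)."""
    __slots__ = ("left", "right", "leaf")
    def __init__(self, left=None, right=None, leaf=None):
        self.left, self.right, self.leaf = left, right, leaf

def build(features):
    """Build a balanced tree whose leaves hold (mu, Sigma) feature estimates."""
    if len(features) == 1:
        return Node(leaf=features[0])
    mid = len(features) // 2
    return Node(build(features[:mid]), build(features[mid:]))

def update(node, n, lo, hi, value):
    """Return a NEW root with leaf n replaced; all untouched subtrees are shared."""
    if node.leaf is not None:
        return Node(leaf=value)
    mid = (lo + hi) // 2
    if n < mid:
        return Node(update(node.left, n, lo, mid, value), node.right)
    return Node(node.left, update(node.right, n, mid, hi, value))

def lookup(node, n, lo, hi):
    """Walk one root-to-leaf path: O(log N) per access."""
    while node.leaf is None:
        mid = (lo + hi) // 2
        node, lo, hi = (node.left, lo, mid) if n < mid else (node.right, mid, hi)
    return node.leaf

N = 8
old = build([("mu%d" % k, "Sigma%d" % k) for k in range(N)])
new = update(old, 3, 0, N, ("mu3'", "Sigma3'"))  # new particle, one leaf changed
shared = new.right is old.right                  # untouched subtree is shared
```

Reference counting for deallocation, as described above, would add a counter to each Node that is incremented when a subtree becomes shared and decremented when a particle fails to survive resampling.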

EKF SLAM vs. FastSLAM A Comparison

EKF SLAM vs. FastSLAM A Comparison vs. A Comparison Michael Calonder, Compuer Vision Lab Swiss Federal Insiue of Technology, Lausanne EPFL) michael.calonder@epfl.ch The wo algorihms are described wih a planar robo applicaion in mind. Generalizaion

More information

Estimation of Poses with Particle Filters

Estimation of Poses with Particle Filters Esimaion of Poses wih Paricle Filers Dr.-Ing. Bernd Ludwig Chair for Arificial Inelligence Deparmen of Compuer Science Friedrich-Alexander-Universiä Erlangen-Nürnberg 12/05/2008 Dr.-Ing. Bernd Ludwig (FAU

More information

Probabilistic Robotics SLAM

Probabilistic Robotics SLAM Probabilisic Roboics SLAM The SLAM Problem SLAM is he process by which a robo builds a map of he environmen and, a he same ime, uses his map o compue is locaion Localizaion: inferring locaion given a map

More information

Probabilistic Robotics SLAM

Probabilistic Robotics SLAM Probabilisic Roboics SLAM The SLAM Problem SLAM is he process by which a robo builds a map of he environmen and, a he same ime, uses his map o compue is locaion Localizaion: inferring locaion given a map

More information

Probabilistic Robotics

Probabilistic Robotics Probabilisic Roboics Bayes Filer Implemenaions Gaussian filers Bayes Filer Reminder Predicion bel p u bel d Correcion bel η p z bel Gaussians : ~ π e p N p - Univariae / / : ~ μ μ μ e p Ν p d π Mulivariae

More information

L07. KALMAN FILTERING FOR NON-LINEAR SYSTEMS. NA568 Mobile Robotics: Methods & Algorithms

L07. KALMAN FILTERING FOR NON-LINEAR SYSTEMS. NA568 Mobile Robotics: Methods & Algorithms L07. KALMAN FILTERING FOR NON-LINEAR SYSTEMS NA568 Mobile Roboics: Mehods & Algorihms Today s Topic Quick review on (Linear) Kalman Filer Kalman Filering for Non-Linear Sysems Exended Kalman Filer (EKF)

More information

Two Popular Bayesian Estimators: Particle and Kalman Filters. McGill COMP 765 Sept 14 th, 2017

Two Popular Bayesian Estimators: Particle and Kalman Filters. McGill COMP 765 Sept 14 th, 2017 Two Popular Bayesian Esimaors: Paricle and Kalman Filers McGill COMP 765 Sep 14 h, 2017 1 1 1, dx x Bel x u x P x z P Recall: Bayes Filers,,,,,,, 1 1 1 1 u z u x P u z u x z P Bayes z = observaion u =

More information

Robot Motion Model EKF based Localization EKF SLAM Graph SLAM

Robot Motion Model EKF based Localization EKF SLAM Graph SLAM Robo Moion Model EKF based Localizaion EKF SLAM Graph SLAM General Robo Moion Model Robo sae v r Conrol a ime Sae updae model Noise model of robo conrol Noise model of conrol Robo moion model

More information

Sequential Importance Resampling (SIR) Particle Filter

Sequential Importance Resampling (SIR) Particle Filter Paricle Filers++ Pieer Abbeel UC Berkeley EECS Many slides adaped from Thrun, Burgard and Fox, Probabilisic Roboics 1. Algorihm paricle_filer( S -1, u, z ): 2. Sequenial Imporance Resampling (SIR) Paricle

More information

Zürich. ETH Master Course: L Autonomous Mobile Robots Localization II

Zürich. ETH Master Course: L Autonomous Mobile Robots Localization II Roland Siegwar Margaria Chli Paul Furgale Marco Huer Marin Rufli Davide Scaramuzza ETH Maser Course: 151-0854-00L Auonomous Mobile Robos Localizaion II ACT and SEE For all do, (predicion updae / ACT),

More information

Introduction to Mobile Robotics

Introduction to Mobile Robotics Inroducion o Mobile Roboics Bayes Filer Kalman Filer Wolfram Burgard Cyrill Sachniss Giorgio Grisei Maren Bennewiz Chrisian Plagemann Bayes Filer Reminder Predicion bel p u bel d Correcion bel η p z bel

More information

Notes on Kalman Filtering

Notes on Kalman Filtering Noes on Kalman Filering Brian Borchers and Rick Aser November 7, Inroducion Daa Assimilaion is he problem of merging model predicions wih acual measuremens of a sysem o produce an opimal esimae of he curren

More information

Introduction to Mobile Robotics SLAM: Simultaneous Localization and Mapping

Introduction to Mobile Robotics SLAM: Simultaneous Localization and Mapping Inroducion o Mobile Roboics SLAM: Simulaneous Localizaion and Mapping Wolfram Burgard, Maren Bennewiz, Diego Tipaldi, Luciano Spinello Wha is SLAM? Esimae he pose of a robo and he map of he environmen

More information

Chapter 2. First Order Scalar Equations

Chapter 2. First Order Scalar Equations Chaper. Firs Order Scalar Equaions We sar our sudy of differenial equaions in he same way he pioneers in his field did. We show paricular echniques o solve paricular ypes of firs order differenial equaions.

More information

Vehicle Arrival Models : Headway

Vehicle Arrival Models : Headway Chaper 12 Vehicle Arrival Models : Headway 12.1 Inroducion Modelling arrival of vehicle a secion of road is an imporan sep in raffic flow modelling. I has imporan applicaion in raffic flow simulaion where

More information

Physics 235 Chapter 2. Chapter 2 Newtonian Mechanics Single Particle

Physics 235 Chapter 2. Chapter 2 Newtonian Mechanics Single Particle Chaper 2 Newonian Mechanics Single Paricle In his Chaper we will review wha Newon s laws of mechanics ell us abou he moion of a single paricle. Newon s laws are only valid in suiable reference frames,

More information

State-Space Models. Initialization, Estimation and Smoothing of the Kalman Filter

State-Space Models. Initialization, Estimation and Smoothing of the Kalman Filter Sae-Space Models Iniializaion, Esimaion and Smoohing of he Kalman Filer Iniializaion of he Kalman Filer The Kalman filer shows how o updae pas predicors and he corresponding predicion error variances when

More information

T L. t=1. Proof of Lemma 1. Using the marginal cost accounting in Equation(4) and standard arguments. t )+Π RB. t )+K 1(Q RB

T L. t=1. Proof of Lemma 1. Using the marginal cost accounting in Equation(4) and standard arguments. t )+Π RB. t )+K 1(Q RB Elecronic Companion EC.1. Proofs of Technical Lemmas and Theorems LEMMA 1. Le C(RB) be he oal cos incurred by he RB policy. Then we have, T L E[C(RB)] 3 E[Z RB ]. (EC.1) Proof of Lemma 1. Using he marginal

More information

SEIF, EnKF, EKF SLAM. Pieter Abbeel UC Berkeley EECS

SEIF, EnKF, EKF SLAM. Pieter Abbeel UC Berkeley EECS SEIF, EnKF, EKF SLAM Pieer Abbeel UC Berkeley EECS Informaion Filer From an analyical poin of view == Kalman filer Difference: keep rack of he inverse covariance raher han he covariance marix [maer of

More information

2016 Possible Examination Questions. Robotics CSCE 574

2016 Possible Examination Questions. Robotics CSCE 574 206 Possible Examinaion Quesions Roboics CSCE 574 ) Wha are he differences beween Hydraulic drive and Shape Memory Alloy drive? Name one applicaion in which each one of hem is appropriae. 2) Wha are he

More information

CSE-571 Robotics. Sample-based Localization (sonar) Motivation. Bayes Filter Implementations. Particle filters. Density Approximation

CSE-571 Robotics. Sample-based Localization (sonar) Motivation. Bayes Filter Implementations. Particle filters. Density Approximation Moivaion CSE57 Roboics Bayes Filer Implemenaions Paricle filers So far, we discussed he Kalman filer: Gaussian, linearizaion problems Paricle filers are a way o efficienly represen nongaussian disribuions

More information

FastSLAM 2.0: An Improved Particle Filtering Algorithm for Simultaneous Localization and Mapping that Provably Converges

FastSLAM 2.0: An Improved Particle Filtering Algorithm for Simultaneous Localization and Mapping that Provably Converges Proceedings of IJCAI 2003 FasSLAM 2.0: An Improved Paricle Filering Algorihm for Simulaneous Localizaion and Mapping ha Provably Converges Michael Monemerlo and Sebasian Thrun School of Compuer Science

More information

Diebold, Chapter 7. Francis X. Diebold, Elements of Forecasting, 4th Edition (Mason, Ohio: Cengage Learning, 2006). Chapter 7. Characterizing Cycles

Diebold, Chapter 7. Francis X. Diebold, Elements of Forecasting, 4th Edition (Mason, Ohio: Cengage Learning, 2006). Chapter 7. Characterizing Cycles Diebold, Chaper 7 Francis X. Diebold, Elemens of Forecasing, 4h Ediion (Mason, Ohio: Cengage Learning, 006). Chaper 7. Characerizing Cycles Afer compleing his reading you should be able o: Define covariance

More information

Simultaneous Localization and Mapping with Unknown Data Association Using FastSLAM

Simultaneous Localization and Mapping with Unknown Data Association Using FastSLAM Simulaneous Localizaion and Mapping wih Unknown Daa Associaion Using FasSLAM Michael Monemerlo, Sebasian Thrun Absrac The Exended Kalman Filer (EKF has been he de faco approach o he Simulaneous Localizaion

More information

ACE 562 Fall Lecture 5: The Simple Linear Regression Model: Sampling Properties of the Least Squares Estimators. by Professor Scott H.

ACE 562 Fall Lecture 5: The Simple Linear Regression Model: Sampling Properties of the Least Squares Estimators. by Professor Scott H. ACE 56 Fall 005 Lecure 5: he Simple Linear Regression Model: Sampling Properies of he Leas Squares Esimaors by Professor Sco H. Irwin Required Reading: Griffihs, Hill and Judge. "Inference in he Simple

More information

23.2. Representing Periodic Functions by Fourier Series. Introduction. Prerequisites. Learning Outcomes

23.2. Representing Periodic Functions by Fourier Series. Introduction. Prerequisites. Learning Outcomes Represening Periodic Funcions by Fourier Series 3. Inroducion In his Secion we show how a periodic funcion can be expressed as a series of sines and cosines. We begin by obaining some sandard inegrals

More information

7630 Autonomous Robotics Probabilistic Localisation

7630 Autonomous Robotics Probabilistic Localisation 7630 Auonomous Roboics Probabilisic Localisaion Principles of Probabilisic Localisaion Paricle Filers for Localisaion Kalman Filer for Localisaion Based on maerial from R. Triebel, R. Käsner, R. Siegwar,

More information

PENALIZED LEAST SQUARES AND PENALIZED LIKELIHOOD

PENALIZED LEAST SQUARES AND PENALIZED LIKELIHOOD PENALIZED LEAST SQUARES AND PENALIZED LIKELIHOOD HAN XIAO 1. Penalized Leas Squares Lasso solves he following opimizaion problem, ˆβ lasso = arg max β R p+1 1 N y i β 0 N x ij β j β j (1.1) for some 0.

More information

STATE-SPACE MODELLING. A mass balance across the tank gives:

STATE-SPACE MODELLING. A mass balance across the tank gives: B. Lennox and N.F. Thornhill, 9, Sae Space Modelling, IChemE Process Managemen and Conrol Subjec Group Newsleer STE-SPACE MODELLING Inroducion: Over he pas decade or so here has been an ever increasing

More information

Lecture 2-1 Kinematics in One Dimension Displacement, Velocity and Acceleration Everything in the world is moving. Nothing stays still.

Lecture 2-1 Kinematics in One Dimension Displacement, Velocity and Acceleration Everything in the world is moving. Nothing stays still. Lecure - Kinemaics in One Dimension Displacemen, Velociy and Acceleraion Everyhing in he world is moving. Nohing says sill. Moion occurs a all scales of he universe, saring from he moion of elecrons in

More information

Announcements. Recap: Filtering. Recap: Reasoning Over Time. Example: State Representations for Robot Localization. Particle Filtering

Announcements. Recap: Filtering. Recap: Reasoning Over Time. Example: State Representations for Robot Localization. Particle Filtering Inroducion o Arificial Inelligence V22.0472-001 Fall 2009 Lecure 18: aricle & Kalman Filering Announcemens Final exam will be a 7pm on Wednesday December 14 h Dae of las class 1.5 hrs long I won ask anyhing

More information

Linear Response Theory: The connection between QFT and experiments

Linear Response Theory: The connection between QFT and experiments Phys540.nb 39 3 Linear Response Theory: The connecion beween QFT and experimens 3.1. Basic conceps and ideas Q: How do we measure he conduciviy of a meal? A: we firs inroduce a weak elecric field E, and

More information

Testing for a Single Factor Model in the Multivariate State Space Framework

Testing for a Single Factor Model in the Multivariate State Space Framework esing for a Single Facor Model in he Mulivariae Sae Space Framework Chen C.-Y. M. Chiba and M. Kobayashi Inernaional Graduae School of Social Sciences Yokohama Naional Universiy Japan Faculy of Economics

More information

10. State Space Methods

10. State Space Methods . Sae Space Mehods. Inroducion Sae space modelling was briefly inroduced in chaper. Here more coverage is provided of sae space mehods before some of heir uses in conrol sysem design are covered in he

More information

Using the Kalman filter Extended Kalman filter

Using the Kalman filter Extended Kalman filter Using he Kalman filer Eended Kalman filer Doz. G. Bleser Prof. Sricker Compuer Vision: Objec and People Tracking SA- Ouline Recap: Kalman filer algorihm Using Kalman filers Eended Kalman filer algorihm

More information

GMM - Generalized Method of Moments

GMM - Generalized Method of Moments GMM - Generalized Mehod of Momens Conens GMM esimaion, shor inroducion 2 GMM inuiion: Maching momens 2 3 General overview of GMM esimaion. 3 3. Weighing marix...........................................

More information

Physics 127b: Statistical Mechanics. Fokker-Planck Equation. Time Evolution

Physics 127b: Statistical Mechanics. Fokker-Planck Equation. Time Evolution Physics 7b: Saisical Mechanics Fokker-Planck Equaion The Langevin equaion approach o he evoluion of he velociy disribuion for he Brownian paricle migh leave you uncomforable. A more formal reamen of his

More information

Probabilistic Robotics The Sparse Extended Information Filter

Probabilistic Robotics The Sparse Extended Information Filter Probabilisic Roboics The Sparse Exended Informaion Filer MSc course Arificial Inelligence 2018 hps://saff.fnwi.uva.nl/a.visser/educaion/probabilisicroboics/ Arnoud Visser Inelligen Roboics Lab Informaics

More information

Georey E. Hinton. University oftoronto. Technical Report CRG-TR February 22, Abstract

Georey E. Hinton. University oftoronto.   Technical Report CRG-TR February 22, Abstract Parameer Esimaion for Linear Dynamical Sysems Zoubin Ghahramani Georey E. Hinon Deparmen of Compuer Science Universiy oftorono 6 King's College Road Torono, Canada M5S A4 Email: zoubin@cs.orono.edu Technical

More information

1 Review of Zero-Sum Games

1 Review of Zero-Sum Games COS 5: heoreical Machine Learning Lecurer: Rob Schapire Lecure #23 Scribe: Eugene Brevdo April 30, 2008 Review of Zero-Sum Games Las ime we inroduced a mahemaical model for wo player zero-sum games. Any

More information

Robust estimation based on the first- and third-moment restrictions of the power transformation model

Robust estimation based on the first- and third-moment restrictions of the power transformation model h Inernaional Congress on Modelling and Simulaion, Adelaide, Ausralia, 6 December 3 www.mssanz.org.au/modsim3 Robus esimaion based on he firs- and hird-momen resricions of he power ransformaion Nawaa,

More information

Christos Papadimitriou & Luca Trevisan November 22, 2016

Christos Papadimitriou & Luca Trevisan November 22, 2016 U.C. Bereley CS170: Algorihms Handou LN-11-22 Chrisos Papadimiriou & Luca Trevisan November 22, 2016 Sreaming algorihms In his lecure and he nex one we sudy memory-efficien algorihms ha process a sream

More information

An introduction to the theory of SDDP algorithm

An introduction to the theory of SDDP algorithm An inroducion o he heory of SDDP algorihm V. Leclère (ENPC) Augus 1, 2014 V. Leclère Inroducion o SDDP Augus 1, 2014 1 / 21 Inroducion Large scale sochasic problem are hard o solve. Two ways of aacking

More information

KINEMATICS IN ONE DIMENSION

KINEMATICS IN ONE DIMENSION KINEMATICS IN ONE DIMENSION PREVIEW Kinemaics is he sudy of how hings move how far (disance and displacemen), how fas (speed and velociy), and how fas ha how fas changes (acceleraion). We say ha an objec

More information

Tracking. Announcements

Tracking. Announcements Tracking Tuesday, Nov 24 Krisen Grauman UT Ausin Announcemens Pse 5 ou onigh, due 12/4 Shorer assignmen Auo exension il 12/8 I will no hold office hours omorrow 5 6 pm due o Thanksgiving 1 Las ime: Moion

More information

Modal identification of structures from roving input data by means of maximum likelihood estimation of the state space model

Modal identification of structures from roving input data by means of maximum likelihood estimation of the state space model Modal idenificaion of srucures from roving inpu daa by means of maximum likelihood esimaion of he sae space model J. Cara, J. Juan, E. Alarcón Absrac The usual way o perform a forced vibraion es is o fix

More information

Tracking. Many slides adapted from Kristen Grauman, Deva Ramanan

Tracking. Many slides adapted from Kristen Grauman, Deva Ramanan Tracking Man slides adaped from Krisen Grauman Deva Ramanan Coures G. Hager Coures G. Hager J. Kosecka cs3b Adapive Human-Moion Tracking Acquisiion Decimaion b facor 5 Moion deecor Grascale convers. Image

More information

Final Spring 2007

Final Spring 2007 .615 Final Spring 7 Overview The purpose of he final exam is o calculae he MHD β limi in a high-bea oroidal okamak agains he dangerous n = 1 exernal ballooning-kink mode. Effecively, his corresponds o

More information

FastSLAM: A Factored Solution to the Simultaneous Localization and Mapping Problem

FastSLAM: A Factored Solution to the Simultaneous Localization and Mapping Problem asslam: A acored Soluion o he Simulaneous Localizaion and Mapping Problem Michael Monemerlo and Sebasian hrun School of Compuer Science Carnegie Mellon Universiy Pisburgh, PA 15213 mmde@cs.cmu.edu, hrun@cs.cmu.edu

More information

m = 41 members n = 27 (nonfounders), f = 14 (founders) 8 markers from chromosome 19

m = 41 members n = 27 (nonfounders), f = 14 (founders) 8 markers from chromosome 19 Sequenial Imporance Sampling (SIS) AKA Paricle Filering, Sequenial Impuaion (Kong, Liu, Wong, 994) For many problems, sampling direcly from he arge disribuion is difficul or impossible. One reason possible

More information

0.1 MAXIMUM LIKELIHOOD ESTIMATION EXPLAINED

0.1 MAXIMUM LIKELIHOOD ESTIMATION EXPLAINED 0.1 MAXIMUM LIKELIHOOD ESTIMATIO EXPLAIED Maximum likelihood esimaion is a bes-fi saisical mehod for he esimaion of he values of he parameers of a sysem, based on a se of observaions of a random variable

More information

Object tracking: Using HMMs to estimate the geographical location of fish

Object tracking: Using HMMs to estimate the geographical location of fish Objec racking: Using HMMs o esimae he geographical locaion of fish 02433 - Hidden Markov Models Marin Wæver Pedersen, Henrik Madsen Course week 13 MWP, compiled June 8, 2011 Objecive: Locae fish from agging

More information

Simulation-Solving Dynamic Models ABE 5646 Week 2, Spring 2010

Simulation-Solving Dynamic Models ABE 5646 Week 2, Spring 2010 Simulaion-Solving Dynamic Models ABE 5646 Week 2, Spring 2010 Week Descripion Reading Maerial 2 Compuer Simulaion of Dynamic Models Finie Difference, coninuous saes, discree ime Simple Mehods Euler Trapezoid

More information

14 Autoregressive Moving Average Models

14 Autoregressive Moving Average Models 14 Auoregressive Moving Average Models In his chaper an imporan parameric family of saionary ime series is inroduced, he family of he auoregressive moving average, or ARMA, processes. For a large class

More information

4.6 One Dimensional Kinematics and Integration

4.6 One Dimensional Kinematics and Integration 4.6 One Dimensional Kinemaics and Inegraion When he acceleraion a( of an objec is a non-consan funcion of ime, we would like o deermine he ime dependence of he posiion funcion x( and he x -componen of

More information

20. Applications of the Genetic-Drift Model

20. Applications of the Genetic-Drift Model 0. Applicaions of he Geneic-Drif Model 1) Deermining he probabiliy of forming any paricular combinaion of genoypes in he nex generaion: Example: If he parenal allele frequencies are p 0 = 0.35 and q 0

More information

3.1.3 INTRODUCTION TO DYNAMIC OPTIMIZATION: DISCRETE TIME PROBLEMS. A. The Hamiltonian and First-Order Conditions in a Finite Time Horizon

3.1.3 INTRODUCTION TO DYNAMIC OPTIMIZATION: DISCRETE TIME PROBLEMS. A. The Hamiltonian and First-Order Conditions in a Finite Time Horizon 3..3 INRODUCION O DYNAMIC OPIMIZAION: DISCREE IME PROBLEMS A. he Hamilonian and Firs-Order Condiions in a Finie ime Horizon Define a new funcion, he Hamilonian funcion, H. H he change in he oal value of

More information

A Bayesian Approach to Spectral Analysis

A Bayesian Approach to Spectral Analysis Chirped Signals A Bayesian Approach o Specral Analysis Chirped signals are oscillaing signals wih ime variable frequencies, usually wih a linear variaion of frequency wih ime. E.g. f() = A cos(ω + α 2

More information

Random Walk with Anti-Correlated Steps

Random Walk with Anti-Correlated Steps Random Walk wih Ani-Correlaed Seps John Noga Dirk Wagner 2 Absrac We conjecure he expeced value of random walks wih ani-correlaed seps o be exacly. We suppor his conjecure wih 2 plausibiliy argumens and

More information

Article from. Predictive Analytics and Futurism. July 2016 Issue 13

Article from. Predictive Analytics and Futurism. July 2016 Issue 13 Aricle from Predicive Analyics and Fuurism July 6 Issue An Inroducion o Incremenal Learning By Qiang Wu and Dave Snell Machine learning provides useful ools for predicive analyics The ypical machine learning

More information

R t. C t P t. + u t. C t = αp t + βr t + v t. + β + w t

R t. C t P t. + u t. C t = αp t + βr t + v t. + β + w t Exercise 7 C P = α + β R P + u C = αp + βr + v (a) (b) C R = α P R + β + w (c) Assumpions abou he disurbances u, v, w : Classical assumions on he disurbance of one of he equaions, eg. on (b): E(v v s P,

More information

Online Appendix to Solution Methods for Models with Rare Disasters

Online Appendix to Solution Methods for Models with Rare Disasters Online Appendix o Soluion Mehods for Models wih Rare Disasers Jesús Fernández-Villaverde and Oren Levinal In his Online Appendix, we presen he Euler condiions of he model, we develop he pricing Calvo block,

More information

Matlab and Python programming: how to get started

Matlab and Python programming: how to get started Malab and Pyhon programming: how o ge sared Equipping readers he skills o wrie programs o explore complex sysems and discover ineresing paerns from big daa is one of he main goals of his book. In his chaper,

More information

Failure of the work-hamiltonian connection for free energy calculations. Abstract

Failure of the work-hamiltonian connection for free energy calculations. Abstract Failure of he work-hamilonian connecion for free energy calculaions Jose M. G. Vilar 1 and J. Miguel Rubi 1 Compuaional Biology Program, Memorial Sloan-Keering Cancer Cener, 175 York Avenue, New York,

More information

Some Basic Information about M-S-D Systems

Some Basic Information about M-S-D Systems Some Basic Informaion abou M-S-D Sysems 1 Inroducion We wan o give some summary of he facs concerning unforced (homogeneous) and forced (non-homogeneous) models for linear oscillaors governed by second-order,

More information

Lecture Notes 2. The Hilbert Space Approach to Time Series

Lecture Notes 2. The Hilbert Space Approach to Time Series Time Series Seven N. Durlauf Universiy of Wisconsin. Basic ideas Lecure Noes. The Hilber Space Approach o Time Series The Hilber space framework provides a very powerful language for discussing he relaionship

More information

d 1 = c 1 b 2 - b 1 c 2 d 2 = c 1 b 3 - b 1 c 3

d 1 = c 1 b 2 - b 1 c 2 d 2 = c 1 b 3 - b 1 c 3 and d = c b - b c c d = c b - b c c This process is coninued unil he nh row has been compleed. The complee array of coefficiens is riangular. Noe ha in developing he array an enire row may be divided or

More information

Lab #2: Kinematics in 1-Dimension

Lab #2: Kinematics in 1-Dimension Reading Assignmen: Chaper 2, Secions 2-1 hrough 2-8 Lab #2: Kinemaics in 1-Dimension Inroducion: The sudy of moion is broken ino wo main areas of sudy kinemaics and dynamics. Kinemaics is he descripion

More information

Bias in Conditional and Unconditional Fixed Effects Logit Estimation: a Correction * Tom Coupé

Bias in Conditional and Unconditional Fixed Effects Logit Estimation: a Correction * Tom Coupé Bias in Condiional and Uncondiional Fixed Effecs Logi Esimaion: a Correcion * Tom Coupé Economics Educaion and Research Consorium, Naional Universiy of Kyiv Mohyla Academy Address: Vul Voloska 10, 04070

More information

References are appeared in the last slide. Last update: (1393/08/19)

References are appeared in the last slide. Last update: (1393/08/19) SYSEM IDEIFICAIO Ali Karimpour Associae Professor Ferdowsi Universi of Mashhad References are appeared in he las slide. Las updae: 0..204 393/08/9 Lecure 5 lecure 5 Parameer Esimaion Mehods opics o be

More information

Air Traffic Forecast Empirical Research Based on the MCMC Method

Air Traffic Forecast Empirical Research Based on the MCMC Method Compuer and Informaion Science; Vol. 5, No. 5; 0 ISSN 93-8989 E-ISSN 93-8997 Published by Canadian Cener of Science and Educaion Air Traffic Forecas Empirical Research Based on he MCMC Mehod Jian-bo Wang,

More information

Econ107 Applied Econometrics Topic 7: Multicollinearity (Studenmund, Chapter 8)

Econ107 Applied Econometrics Topic 7: Multicollinearity (Studenmund, Chapter 8) I. Definiions and Problems A. Perfec Mulicollineariy Econ7 Applied Economerics Topic 7: Mulicollineariy (Sudenmund, Chaper 8) Definiion: Perfec mulicollineariy exiss in a following K-variable regression

More information

Matrix Versions of Some Refinements of the Arithmetic-Geometric Mean Inequality

Matrix Versions of Some Refinements of the Arithmetic-Geometric Mean Inequality Marix Versions of Some Refinemens of he Arihmeic-Geomeric Mean Inequaliy Bao Qi Feng and Andrew Tonge Absrac. We esablish marix versions of refinemens due o Alzer ], Carwrigh and Field 4], and Mercer 5]

More information

Tracking. Many slides adapted from Kristen Grauman, Deva Ramanan

Tracking. Many slides adapted from Kristen Grauman, Deva Ramanan Tracking Man slides adaped from Krisen Grauman Deva Ramanan Coures G. Hager Coures G. Hager J. Kosecka cs3b Adapive Human-Moion Tracking Acquisiion Decimaion b facor 5 Moion deecor Grascale convers. Image

More information

Simultaneous Localization and Mapping With Sparse Extended Information Filters

Simultaneous Localization and Mapping With Sparse Extended Information Filters Sebasian Thrun Yufeng Liu Carnegie Mellon Universiy Pisburgh, PA, USA Daphne Koller Andrew Y. Ng Sanford Universiy Sanford, CA, USA Zoubin Ghahramani Gasby Compuaional Neuroscience Uni Universiy College

More information

SOLUTIONS TO ECE 3084

SOLUTIONS TO ECE 3084 SOLUTIONS TO ECE 384 PROBLEM 2.. For each sysem below, specify wheher or no i is: (i) memoryless; (ii) causal; (iii) inverible; (iv) linear; (v) ime invarian; Explain your reasoning. If he propery is no

More information

Basilio Bona ROBOTICA 03CFIOR 1

Basilio Bona ROBOTICA 03CFIOR 1 Indusrial Robos Kinemaics 1 Kinemaics and kinemaic funcions Kinemaics deals wih he sudy of four funcions (called kinemaic funcions or KFs) ha mahemaically ransform join variables ino caresian variables

More information

CSE-473. A Gentle Introduction to Particle Filters

CSE-473. A Gentle Introduction to Particle Filters CSE-473 A Genle Inroducion o Paricle Filers Bayes Filers for Robo Localizaion Dieer Fo 2 Bayes Filers: Framework Given: Sream of observaions z and acion daa u: d Sensor model Pz. = { u, z2, u 1, z 1 Dynamics

More information

Chapter 3 Boundary Value Problem

Chapter 3 Boundary Value Problem Chaper 3 Boundary Value Problem A boundary value problem (BVP) is a problem, ypically an ODE or a PDE, which has values assigned on he physical boundary of he domain in which he problem is specified. Le

More information

On Multicomponent System Reliability with Microshocks - Microdamages Type of Components Interaction

On Multicomponent System Reliability with Microshocks - Microdamages Type of Components Interaction On Mulicomponen Sysem Reliabiliy wih Microshocks - Microdamages Type of Componens Ineracion Jerzy K. Filus, and Lidia Z. Filus Absrac Consider a wo componen parallel sysem. The defined new sochasic dependences

More information

On Measuring Pro-Poor Growth. 1. On Various Ways of Measuring Pro-Poor Growth: A Short Review of the Literature

On Measuring Pro-Poor Growth. 1. On Various Ways of Measuring Pro-Poor Growth: A Short Review of the Literature On Measuring Pro-Poor Growh 1. On Various Ways of Measuring Pro-Poor Growh: A Shor eview of he Lieraure During he pas en years or so here have been various suggesions concerning he way one should check

More information

Math 2142 Exam 1 Review Problems. x 2 + f (0) 3! for the 3rd Taylor polynomial at x = 0. To calculate the various quantities:

Math 2142 Exam 1 Review Problems. x 2 + f (0) 3! for the 3rd Taylor polynomial at x = 0. To calculate the various quantities: Mah 4 Eam Review Problems Problem. Calculae he 3rd Taylor polynomial for arcsin a =. Soluion. Le f() = arcsin. For his problem, we use he formula f() + f () + f ()! + f () 3! for he 3rd Taylor polynomial

More information

Navneet Saini, Mayank Goyal, Vishal Bansal (2013); Term Project AML310; Indian Institute of Technology Delhi

Navneet Saini, Mayank Goyal, Vishal Bansal (2013); Term Project AML310; Indian Institute of Technology Delhi Creep in Viscoelasic Subsances Numerical mehods o calculae he coefficiens of he Prony equaion using creep es daa and Herediary Inegrals Mehod Navnee Saini, Mayank Goyal, Vishal Bansal (23); Term Projec

More information

CHAPTER 10 VALIDATION OF TEST WITH ARTIFICAL NEURAL NETWORK

CHAPTER 10 VALIDATION OF TEST WITH ARTIFICAL NEURAL NETWORK 175 CHAPTER 10 VALIDATION OF TEST WITH ARTIFICAL NEURAL NETWORK 10.1 INTRODUCTION Amongs he research work performed, he bes resuls of experimenal work are validaed wih Arificial Neural Nework. From he

More information

Inventory Analysis and Management. Multi-Period Stochastic Models: Optimality of (s, S) Policy for K-Convex Objective Functions

Inventory Analysis and Management. Multi-Period Stochastic Models: Optimality of (s, S) Policy for K-Convex Objective Functions Muli-Period Sochasic Models: Opimali of (s, S) Polic for -Convex Objecive Funcions Consider a seing similar o he N-sage newsvendor problem excep ha now here is a fixed re-ordering cos (> 0) for each (re-)order.

More information

Non-uniform circular motion *

Non-uniform circular motion * OpenSax-CNX module: m14020 1 Non-uniform circular moion * Sunil Kumar Singh This work is produced by OpenSax-CNX and licensed under he Creaive Commons Aribuion License 2.0 Wha do we mean by non-uniform

More information

Speaker Adaptation Techniques For Continuous Speech Using Medium and Small Adaptation Data Sets. Constantinos Boulis

Speaker Adaptation Techniques For Continuous Speech Using Medium and Small Adaptation Data Sets. Constantinos Boulis Speaker Adapaion Techniques For Coninuous Speech Using Medium and Small Adapaion Daa Ses Consaninos Boulis Ouline of he Presenaion Inroducion o he speaker adapaion problem Maximum Likelihood Sochasic Transformaions

More information

Overview. COMP14112: Artificial Intelligence Fundamentals. Lecture 0 Very Brief Overview. Structure of this course

Overview. COMP14112: Artificial Intelligence Fundamentals. Lecture 0 Very Brief Overview. Structure of this course OMP: Arificial Inelligence Fundamenals Lecure 0 Very Brief Overview Lecurer: Email: Xiao-Jun Zeng x.zeng@mancheser.ac.uk Overview This course will focus mainly on probabilisic mehods in AI We shall presen

More information

Two Coupled Oscillators / Normal Modes

Two Coupled Oscillators / Normal Modes Lecure 3 Phys 3750 Two Coupled Oscillaors / Normal Modes Overview and Moivaion: Today we ake a small, bu significan, sep owards wave moion. We will no ye observe waves, bu his sep is imporan in is own

More information

Numerical Dispersion

Numerical Dispersion eview of Linear Numerical Sabiliy Numerical Dispersion n he previous lecure, we considered he linear numerical sabiliy of boh advecion and diffusion erms when approimaed wih several spaial and emporal

More information

In this chapter the model of free motion under gravity is extended to objects projected at an angle. When you have completed it, you should

In this chapter the model of free motion under gravity is extended to objects projected at an angle. When you have completed it, you should Cambridge Universiy Press 978--36-60033-7 Cambridge Inernaional AS and A Level Mahemaics: Mechanics Coursebook Excerp More Informaion Chaper The moion of projeciles In his chaper he model of free moion

More information

Ensamble methods: Bagging and Boosting

Ensamble methods: Bagging and Boosting Lecure 21 Ensamble mehods: Bagging and Boosing Milos Hauskrech milos@cs.pi.edu 5329 Senno Square Ensemble mehods Mixure of expers Muliple base models (classifiers, regressors), each covers a differen par

More information

Echocardiography Project and Finite Fourier Series

Echocardiography Project and Finite Fourier Series Echocardiography Projec and Finie Fourier Series 1 U M An echocardiagram is a plo of how a porion of he hear moves as he funcion of ime over he one or more hearbea cycles If he hearbea repeas iself every

More information

From Particles to Rigid Bodies

From Particles to Rigid Bodies Rigid Body Dynamics From Paricles o Rigid Bodies Paricles No roaions Linear velociy v only Rigid bodies Body roaions Linear velociy v Angular velociy ω Rigid Bodies Rigid bodies have boh a posiion and

More information

FastSLAM: A Factored Solution to the Simultaneous Localization and Mapping Problem

FastSLAM: A Factored Solution to the Simultaneous Localization and Mapping Problem rom: AAAI-02 Proceedings. Copyrigh 2002, AAAI (www.aaai.org). All righs reserved. asslam: A acored Soluion o he Simulaneous Localizaion and Mapping Problem Michael Monemerlo and Sebasian hrun School of

More information

Anno accademico 2006/2007. Davide Migliore

Anno accademico 2006/2007. Davide Migliore Roboica Anno accademico 2006/2007 Davide Migliore migliore@ele.polimi.i Today Eercise session: An Off-side roblem Robo Vision Task Measuring NBA layers erformance robabilisic Roboics Inroducion The Bayesian

More information

Nature Neuroscience: doi: /nn Supplementary Figure 1. Spike-count autocorrelations in time.

Nature Neuroscience: doi: /nn Supplementary Figure 1. Spike-count autocorrelations in time. Supplemenary Figure 1 Spike-coun auocorrelaions in ime. Normalized auocorrelaion marices are shown for each area in a daase. The marix shows he mean correlaion of he spike coun in each ime bin wih he spike

More information

A New Perturbative Approach in Nonlinear Singularity Analysis

A New Perturbative Approach in Nonlinear Singularity Analysis Journal of Mahemaics and Saisics 7 (: 49-54, ISSN 549-644 Science Publicaions A New Perurbaive Approach in Nonlinear Singulariy Analysis Ta-Leung Yee Deparmen of Mahemaics and Informaion Technology, The

More information

A First Course on Kinetics and Reaction Engineering. Class 19 on Unit 18

A First Course on Kinetics and Reaction Engineering. Class 19 on Unit 18 A Firs ourse on Kineics and Reacion Engineering lass 19 on Uni 18 Par I - hemical Reacions Par II - hemical Reacion Kineics Where We re Going Par III - hemical Reacion Engineering A. Ideal Reacors B. Perfecly

More information

Time series model fitting via Kalman smoothing and EM estimation in TimeModels.jl

Time series model fitting via Kalman smoothing and EM estimation in TimeModels.jl Time series model fiing via Kalman smoohing and EM esimaion in TimeModels.jl Gord Sephen Las updaed: January 206 Conens Inroducion 2. Moivaion and Acknowledgemens....................... 2.2 Noaion......................................

More information