Prediction of Driving Behavior through Probabilistic Inference
Held in Torremolinos, Malaga (SPAIN), 8-10 September 2003

Toru Kumagai*, Yasuo Sakaguchi**, Masayuki Okuwa*** and Motoyuki Akamatsu*
*National Institute of Advanced Industrial Science and Technology, Tsukuba, Japan
**Research Institute of Human Engineering for Quality Life
***Toyota Central Research & Development Labs., Inc.

Abstract: Driving assistance systems are essential technologies to avoid traffic accidents, reduce traffic jams, and solve environmental problems. Not only observable behavioral data, but also unobservable inferred values should be considered to realize advanced driving assistance systems that are adaptable to individual drivers and situations. For this purpose, Bayesian networks, which are the most consistent inference approach, have been applied for the estimation of unobservable physical values and internal states introduced for convenience's sake. Nevertheless, only a few reports have addressed the prediction of future states of driving behavior. This paper proposes predicting driving behavior in the near future through a simple dynamic Bayesian network, namely a hidden Markov model or a switching linear dynamic system. The proposed predictors were examined with real data. We focused on prediction of the future stop probability at an intersection because stopping is one of the most important maneuvers for safety: it avoids collision with other traffic elements (i.e., other vehicles and pedestrians) at an intersection. Both the HMM and the switching linear dynamic system worked well as stop probability predictors. The HMM represented the temporal structure of human driving behavior.

Keywords: dynamic Bayesian network, hidden Markov model, switching linear dynamic system, driving behavior prediction, driving assistance system, multi-step ahead prediction

1. Introduction

Driving assistance systems are essential technologies for avoiding traffic accidents, reducing traffic jams, and solving environmental problems.
These assistance systems, whose typical feature is a warning system, produce a decision based on physical parameters such as headway distance and vehicle speed. However, individual driving behavior depends on myriad variables including individual driving characteristics, environmental conditions, driving intention, and so on. Additionally, some physical parameters might not be measured accurately. Not only observable behavioral data, but also unobservable inferred values should be considered to realize advanced driving assistance systems that can adapt to individual drivers and situations. For this purpose, Bayesian networks, which are the most consistent inferential approaches, have been applied for the estimation of unobservable physical values and internal states introduced for convenience's sake. For example, one study [1] inferred a probabilistic distribution of brake onset time to cross a line from various evidence, such as weather condition, methodical driving style scores, accelerator pedal release timing, and so on. Dynamic Bayesian networks, which include the well-known hidden Markov models, have also attracted many researchers. The model used in another study [2] provided a decision-making model for an autonomous vehicle in a simple simulation environment through a dynamic probabilistic network. Another study [3] used a hidden Markov model for modeling and recognizing driving maneuvers at a tactical level. Dynamic Bayesian networks have also been applied for general behavior modeling: one study [4] applied a switching linear dynamic system for modeling and recognizing human locomotion. Another [5] applied a switching Kalman filter for modeling and recognizing simulated driving behavior. Nevertheless, only a few investigations have addressed prediction of future states of driving behavior. Most dynamic Bayesian network applications have recognized temporal information or inferred current states, but have not predicted future states. Although one [1] estimated probabilistic distributions of future events, it was based on the static relation between predefined variables: it used no temporal information.
This study is intended to predict driving behavior in the near future through a simple dynamic Bayesian network, which is a hidden Markov model or a switching linear dynamic system. The proposed predictors were examined with real data. We focused on the prediction of the stop probability at an intersection during a driver's-side turn because that is a very important maneuver for safety: it avoids collision with other traffic elements (i.e., other vehicles and pedestrians) at an intersection. The remainder of this paper is organized as follows. Section 2 introduces dynamic Bayesian networks and prediction algorithms. Section 3 describes the measurement of actual vehicle data in the real road environment. Section 4 describes the learning procedure of dynamic Bayesian networks. Section 5 explains stop-probability prediction. Finally, Section 6 concludes with a discussion of the method used in this paper.

2. Dynamic Bayesian networks

2.1 Model structure

In this study, we denote dynamic Bayesian networks as

    \delta_j(t+1) = \sum_i a_{i,j} \, \delta_i(t), \qquad y(t) = f_{s(t)}(y(t-1)),    (1)

where t is discrete time, s(t) is the discrete state at time t, \delta_i(t) is the probability of state i at time t, i.e. \Pr(s(t) = i), a_{i,j} is the state transition probability from state i to state j, y(t) is the observable value vector at time t, and f_i(\cdot) is the function that decides the observation values at state i. When we assume that f_i(\cdot) \sim N(\mu_i, \Sigma_i), (1) is a Gaussian hidden Markov model. When we assume that f_i(y(t-1)) \sim N(\mu_i + W_i \, y(t-1), \Sigma_i), (1) is a switching linear dynamic system.

2.2 Prediction algorithm

Given observations y(t) \{t = 1 \ldots T\} and the inferred \delta_i(T), the prediction is performed in the following straightforward ways:

(1) Hidden Markov model:

    \delta_j(T+n) = \sum_i a_{i,j} \, \delta_i(T+n-1), \quad n = 1, \ldots
    y(T+n) \sim \sum_i \delta_i(T+n) \, N(\mu_i, \Sigma_i);    (2)

(2) Switching linear dynamic system:

    \delta_j(T+n) = \sum_i a_{i,j} \, \delta_i(T+n-1), \quad n = 1, \ldots
    y(T+n) \sim \sum_i \delta_i(T+n) \, N(\mu_i + W_i \, y(T+n-1), \Sigma_i).    (3)

In this study, we approximate (3) by (4) because (3) is computationally expensive.
    \delta_j(T+n) = \sum_i a_{i,j} \, \delta_i(T+n-1), \quad n = 1, \ldots
    y(T+n) \sim \sum_i \delta_i(T+n) \, N(\mu_i + W_i \, \bar{y}(T+n-1), \Sigma_i)
    \bar{y}(T+n) \sim N(\mu_{J(T+n)} + W_{J(T+n)} \, \bar{y}(T+n-1), \Sigma_{J(T+n)})
    J(T+n) = \arg\max_i \sum_j a_{j,i} \, \delta_j(T+n-1), \qquad \bar{y}(T) = y(T)    (4)

Note that \bar{y}(T+n) \{n = 1 \ldots\} is always a normal distribution.

3. Data preparation

We evaluated the proposed predictor using actual data in a real road environment. We developed a vehicle equipped with sensing devices to measure driving behavior [6]. The sensing devices included those for the driver's operational behavior, such as steering wheel operation, and those for the vehicle condition, such as vehicle speed. This study used the vehicle speed and the pedal strokes of the acceleration and brake pedals. Pedal sensors attached to the pedals detected the pedal strokes. The speed signal was obtained from the front wheel speed sensor. The sampling rate was 30 Hz for the sensor signals; we resampled the data at 15 Hz after measurement. We measured the driver's-side turn behavior (i.e., right-turn behavior in Japan) 33 times at an intersection in a suburb of Tsukuba, Japan (Fig. 1). One testee drove the car. We removed portions with speeds of 20 km/h or higher from the records. The car stopped once or twice in 16 of the 33 cases because the roadway beyond was blocked by traffic or the traffic signal (see the top row of Figs. 4 and 7). In the other cases, the car passed through the intersection without stopping. We used 16 of the 33 records as learning data; the remaining 17 records served as test data.

4. Learning and inference

The Baum-Welch algorithm and the Viterbi algorithm are the most widely used learning and inference algorithms for HMMs. Learning and inference algorithms for switching linear dynamic systems are given as extensions of those for HMMs [9]. In this study, we used the Bayes Net Toolbox for Matlab [10] for learning and inference. We gave the speed of the vehicle and the pedal stroke to a dynamic Bayesian network as observable data.
The pedal stroke was given as the stroke of the brake pedal subtracted from the stroke of the accelerator pedal (hereafter called the pedal stroke). The sizes of the HMM and the switching linear dynamic system were determined by the number of states, Q. Increasing Q improved the stop-prediction accuracy; however, performance was not very sensitive to Q once Q was somewhat larger. In this study, we chose Q = 15 for the HMM and Q = 11 for the switching linear dynamic system. Figure 2 shows the topology of the trained HMM. Each f_i(\cdot) is plotted on the plane of the speed and the pedal stroke. An ellipse shows \Sigma_i. Arrows show the major state transitions and their directions. The process of decelerating, stopping, and accelerating was clearly captured in the model. Figure 3 shows the topology of the trained switching linear dynamic system. Each f_i(\cdot) is plotted on the plane of w_11 and w_12. Here, w_11 is the weight between the speed at time t and time (t-1); w_12 is the weight between the speed at time t and the pedal stroke at time (t-1). Arrows show the major state transitions and their directions. It is difficult to interpret the topology of the switching linear dynamic system because the nodes have many variables and the output values depend on the input values. However, it was at least readily plausible that w_12 is greater than 0 and w_11 is nearly equal to 1.
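Before turning to the stop-probability application, the multi-step ahead prediction of Section 2.2 can be illustrated with a short sketch. The code below implements the HMM case of Eq. (2): the state distribution is propagated through the transition matrix, and the expected observation is the mixture mean. All names (A, mu, delta_T, n_steps) and the toy two-state model are illustrative assumptions for this sketch, not values from the paper.

```python
import numpy as np

def predict_hmm(A, mu, delta_T, n_steps):
    """Multi-step ahead HMM prediction as in Eq. (2).

    A       : (Q, Q) transition matrix, A[i, j] = Pr(s(t+1) = j | s(t) = i)
    mu      : (Q, D) per-state observation means
    delta_T : (Q,) state distribution inferred at the last observed time T
    Returns the predicted state distributions and expected observations
    for n_steps future time slices.
    """
    delta = np.asarray(delta_T, dtype=float)
    states, means = [], []
    for _ in range(n_steps):
        delta = delta @ A        # delta_j(T+n) = sum_i a_{i,j} delta_i(T+n-1)
        states.append(delta.copy())
        means.append(delta @ mu)  # mean of the mixture sum_i delta_i N(mu_i, Sigma_i)
    return np.array(states), np.array(means)

# Toy usage: two states ("moving" -> "stopped"), 1-D observation = speed.
A = np.array([[0.7, 0.3],
              [0.0, 1.0]])        # "stopped" is absorbing in this toy model
mu = np.array([[10.0], [0.0]])    # mean speed per state
delta_T = np.array([1.0, 0.0])    # currently surely "moving"
states, means = predict_hmm(A, mu, delta_T, n_steps=5)
# The probability of the "stopped" state grows with the horizon n,
# and the expected speed decays accordingly.
```

This mirrors the paper's observation that prediction here is plain sequential inference over unobserved time slices: no future evidence enters, only the transition structure.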
5. Stop probability prediction

We applied the above HMM to estimate the future stop probability, using the following function:

    Pstop_T(T_p) = \Pr(\mathrm{speed}(T_p) < \mathrm{Const}_{\mathrm{speed}} \mid y(t) \{t = 1 \ldots T\}) \cdot \Pr(\mathrm{pedal\_stroke}(T_p) < \mathrm{Const}_{\mathrm{pedal}} \mid y(t) \{t = 1 \ldots T\})    (5)

Here, Pstop_T(T_p) is the stop probability at time T_p predicted at a prediction point T. We chose Const_speed = 0.5 and Const_pedal = 5.

Figure 4 shows an example of the stop probability prediction at a given prediction point. We denote the actual stop time as T_stop. The vehicle stopped at around time 7 s: T_stop ≈ 7 s. The predicted stop probability at the actual stop time, Pstop_T(T_stop), approached 1 as the prediction point approached the actual stop time (see also Fig. 6). At around prediction time 21 s, the predicted stop probability of the near future rose without an actual stop because the speed was sufficiently low and the deceleration rate was high. At time 22 s, the brake pedal was released and the accelerator pedal was depressed; thereby, the stop probability decreased to zero. The predicted stop did not actually occur when it was canceled in midcourse: detecting the change in the pedal stroke, the predictor signaled the canceling of the stop maneuver.

Figure 5 shows the relationship between the predicted stop probability, Pstop_T(T_p), and the actual stop rate. In an ideal predictor, a predicted stop probability equals the actual stop rate. Figure 5 shows that the stop probability was nearly equal to the actual stop rate, even though the predicted values were somewhat larger than the actual rates. Figure 6 shows the average value of Pstop_T(T_stop) at every time to the stop, (T_stop - T). This figure again shows that the stop probability approached 1 as the prediction point T approached the actual stop time T_stop, which corresponds to intuition. Along with the HMM, the switching linear dynamic system also worked well for predicting the stop probability. Figures 7-9 show the prediction results.
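When the predicted observation at T_p is (approximately) Gaussian, the thresholding in Eq. (5) reduces to two Gaussian CDF evaluations. The sketch below assumes, for simplicity, that the predicted speed and pedal-stroke marginals are independent; the function names and the numbers in the usage example are illustrative, while the thresholds Const_speed = 0.5 and Const_pedal = 5 follow the paper.

```python
import math

def normal_cdf(x, mean, std):
    """Pr(X < x) for X ~ N(mean, std^2)."""
    return 0.5 * (1.0 + math.erf((x - mean) / (std * math.sqrt(2.0))))

def stop_probability(speed_mean, speed_std, pedal_mean, pedal_std,
                     const_speed=0.5, const_pedal=5.0):
    """Eq. (5) under a Gaussian prediction with independent marginals
    (the independence assumption is made here, not stated in the paper)."""
    p_speed = normal_cdf(const_speed, speed_mean, speed_std)  # Pr(speed < 0.5)
    p_pedal = normal_cdf(const_pedal, pedal_mean, pedal_std)  # Pr(pedal_stroke < 5)
    return p_speed * p_pedal

# A near-stop prediction: low expected speed, brake clearly applied
# (negative pedal stroke = brake stroke exceeds accelerator stroke).
p = stop_probability(speed_mean=0.2, speed_std=0.5,
                     pedal_mean=-10.0, pedal_std=4.0)
# p is dominated by the speed term, since the pedal condition is almost sure
```

Releasing the brake and depressing the accelerator shifts pedal_mean upward, which drives the product toward zero: exactly the cancellation behavior described for Fig. 4.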
In addition to the stop probability prediction, Fig. 7 shows the mean value of the predicted vehicle speed.

6. Discussion

The proposed predictor produced the future stop probability given current and past observable data. Both the HMM and the switching linear dynamic system worked well. In almost all cases, the proposed system predicted the possibility of a future stop several seconds before its occurrence, as shown in Figs. 4-9; the vehicle did not stop when the predictor did not predict it, as seen in Figs. 5 and 8. This result is very significant when considering the application of the proposed system to driving assistance systems. The predicted stop probability took account of changes in the driver's intention. The probability was smaller than 1 even during a typical stop maneuver (see Figs. 6 and 9) because the driver might change intention and cancel the maneuver before the predicted stop. Therefore, the predicted stop probability was nearly equal to the actual stop rate, as seen in Figs. 5 and 8. Various factors cause changes in the driver's intention; they were modeled in the state transition probabilities. In this study, prediction was done by sequential inference through dynamic Bayesian networks. In contrast, many conventional studies have used dynamic Bayesian networks as classifiers. A textbook example is a strategy that prepares as many HMMs as there are recognition objects and then recognizes the objects by comparing the models' likelihoods. This strategy is applicable when only a part of a recognition object
is observable. Given current and past data, HMMs could recognize ongoing behavior. However, for a behavior recognition problem, it might be difficult to prepare a categorized learning data set for supervised learning of such dynamic Bayesian networks. Classifying human behavior into discrete categories is complicated because behavior depends on individuals and situations. Moreover, it is impossible to determine exactly when an action starts and ends. These issues make behavior difficult to define. This study avoided this problem by replacing the definition of the whole sequence of behavior with the behavior's result, as defined in (5). The easiest way to construct a stop-probability predictor is to prepare a static table that describes the relationship between observable data and the frequency of future stops. However, several reasons recommend the dynamic approach. For example, the static approach requires more training data because of long-range predictions. In a preliminary study of the stop prediction problem of this paper, a simple probabilistic table did not work well in some cases; i.e., it could not determine the probability because of a lack of learning data. In general, it is difficult to forecast future states precisely through dynamic Bayesian networks because inference addresses time slices for which no observations have been given (e.g., [7], [8]). However, the predictor of the present study performed well thanks to the specific temporal structure of human driving behavior seen in Fig. 2: the driver behaved according to certain habits while changing intention. This structure is also an important finding of this study. The HMM and the switching linear dynamic system described behavior in quite different manners. The HMM could be a good profiling tool for a driver's behavior because it visualizes behavior, as seen in Fig. 2.
On the other hand, the switching linear dynamic system could describe behavior with a smaller number of states than the HMM because a single linear dynamic system describes the relation of the observable data between time t and (t-1).

References

[1] Y. Sakaguchi, M. Okuwa, K. Takiguchi, and M. Akamatsu, "Measuring and modeling of driver for detecting unusual behavior for driving assistance," to appear in Proceedings of the 18th International Conference on Enhanced Safety of Vehicles (2003).
[2] J. Forbes, T. Huang, K. Kanazawa, and S. Russell, "The BATmobile: Towards a Bayesian Automated Taxi," International Joint Conference on AI (1995).
[3] N. Oliver and A. Pentland, "Graphical Models for Driver Behavior Recognition in a Smart Car," IEEE Intl. Conference on Intelligent Vehicles (2000).
[4] V. Pavlovic, J. M. Rehg, T.-J. Cham, and K. P. Murphy, "A Dynamic Bayesian Network Approach to Figure Tracking Using Learned Dynamic Models," International Conference on Computer Vision (1999).
[5] A. Pentland and A. Liu, "Modeling and Prediction of Human Behavior," Neural Computation, 11 (1999).
[6] M. Akamatsu, "Measuring Driving Behavior: detecting unusual behavior for driving assistance," SICE Annual Conference in Osaka (2002).
[7] U. Kjærulff, "A computational scheme for reasoning in dynamic probabilistic networks," Proceedings of the Eighth Conference on Uncertainty in Artificial Intelligence, Morgan Kaufmann, San Mateo, California (1992).
[8] P. Dagum, A. Galper, E. Horvitz, and A. Seiver, "Uncertain Reasoning and Forecasting," International Journal of Forecasting, 11 (1995).
[9] K. P. Murphy, "Dynamic Bayesian Networks: Representation, Inference and Learning," PhD thesis, U.C. Berkeley, Dept. Comp. Sci. (2002).
[10] K. P. Murphy, "The Bayes Net Toolbox for Matlab," Computing Science and Statistics: Proceedings of Interface, 33 (2001).
Fig. 1 An intersection in a Tsukuba suburb
Fig. 2 Topology of the HMM
Fig. 3 Topology of the switching linear dynamic system
Fig. 4 Stop probability through the HMM
Fig. 5 Predicted stop probability through the HMM and the actual stop rate
Fig. 6 Averaged predicted stop probability through the HMM
Fig. 7 Stop probability through the switching linear dynamic system
Fig. 8 Predicted stop probability through the switching linear dynamic system and the actual stop rate
Fig. 9 Averaged predicted stop probability through the switching linear dynamic system
Uncertanty as the Overlap of Alternate Condtonal Dstrbutons Olena Babak and Clayton V. Deutsch Centre for Computatonal Geostatstcs Department of Cvl & Envronmental Engneerng Unversty of Alberta An mportant
More informationApplication research on rough set -neural network in the fault diagnosis system of ball mill
Avalable onlne www.ocpr.com Journal of Chemcal and Pharmaceutcal Research, 2014, 6(4):834-838 Research Artcle ISSN : 0975-7384 CODEN(USA) : JCPRC5 Applcaton research on rough set -neural network n the
More information4.3 Poisson Regression
of teratvely reweghted least squares regressons (the IRLS algorthm). We do wthout gvng further detals, but nstead focus on the practcal applcaton. > glm(survval~log(weght)+age, famly="bnomal", data=baby)
More informationPREDICTIVE CONTROL BY DISTRIBUTED PARAMETER SYSTEMS BLOCKSET FOR MATLAB & SIMULINK
PREDICTIVE CONTROL BY DISTRIBUTED PARAMETER SYSTEMS BLOCKSET FOR MATLAB & SIMULINK G. Hulkó, C. Belavý, P. Buček, P. Noga Insttute of automaton, measurement and appled nformatcs, Faculty of Mechancal Engneerng,
More informationModule 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur
Module 3 LOSSY IMAGE COMPRESSION SYSTEMS Verson ECE IIT, Kharagpur Lesson 6 Theory of Quantzaton Verson ECE IIT, Kharagpur Instructonal Objectves At the end of ths lesson, the students should be able to:
More informationCIS587 - Artificial Intellgence. Bayesian Networks CIS587 - AI. KB for medical diagnosis. Example.
CIS587 - Artfcal Intellgence Bayesan Networks KB for medcal dagnoss. Example. We want to buld a KB system for the dagnoss of pneumona. Problem descrpton: Dsease: pneumona Patent symptoms (fndngs, lab tests):
More informationOnline Classification: Perceptron and Winnow
E0 370 Statstcal Learnng Theory Lecture 18 Nov 8, 011 Onlne Classfcaton: Perceptron and Wnnow Lecturer: Shvan Agarwal Scrbe: Shvan Agarwal 1 Introducton In ths lecture we wll start to study the onlne learnng
More information(Online First)A Lattice Boltzmann Scheme for Diffusion Equation in Spherical Coordinate
Internatonal Journal of Mathematcs and Systems Scence (018) Volume 1 do:10.494/jmss.v1.815 (Onlne Frst)A Lattce Boltzmann Scheme for Dffuson Equaton n Sphercal Coordnate Debabrata Datta 1 *, T K Pal 1
More informationChapter 15 - Multiple Regression
Chapter - Multple Regresson Chapter - Multple Regresson Multple Regresson Model The equaton that descrbes how the dependent varable y s related to the ndependent varables x, x,... x p and an error term
More informationDesign and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm
Desgn and Optmzaton of Fuzzy Controller for Inverse Pendulum System Usng Genetc Algorthm H. Mehraban A. Ashoor Unversty of Tehran Unversty of Tehran h.mehraban@ece.ut.ac.r a.ashoor@ece.ut.ac.r Abstract:
More informationA General Method for Assessing the Uncertainty in Classified Remotely Sensed Data at Pixel Scale
Proceedngs of the 8th Internatonal Symposum on Spatal Accuracy Assessment n Natural Resources and Envronmental Scences Shangha, P. R. Chna, June 25-27, 2008, pp. 86-94 A General ethod for Assessng the
More informationCopyright 2017 by Taylor Enterprises, Inc., All Rights Reserved. Adjusted Control Limits for P Charts. Dr. Wayne A. Taylor
Taylor Enterprses, Inc. Control Lmts for P Charts Copyrght 2017 by Taylor Enterprses, Inc., All Rghts Reserved. Control Lmts for P Charts Dr. Wayne A. Taylor Abstract: P charts are used for count data
More informationPredictive Analytics : QM901.1x Prof U Dinesh Kumar, IIMB. All Rights Reserved, Indian Institute of Management Bangalore
Sesson Outlne Introducton to classfcaton problems and dscrete choce models. Introducton to Logstcs Regresson. Logstc functon and Logt functon. Maxmum Lkelhood Estmator (MLE) for estmaton of LR parameters.
More informationStatistics for Business and Economics
Statstcs for Busness and Economcs Chapter 11 Smple Regresson Copyrght 010 Pearson Educaton, Inc. Publshng as Prentce Hall Ch. 11-1 11.1 Overvew of Lnear Models n An equaton can be ft to show the best lnear
More informationGlobal Sensitivity. Tuesday 20 th February, 2018
Global Senstvty Tuesday 2 th February, 28 ) Local Senstvty Most senstvty analyses [] are based on local estmates of senstvty, typcally by expandng the response n a Taylor seres about some specfc values
More informationMotion Perception Under Uncertainty. Hongjing Lu Department of Psychology University of Hong Kong
Moton Percepton Under Uncertanty Hongjng Lu Department of Psychology Unversty of Hong Kong Outlne Uncertanty n moton stmulus Correspondence problem Qualtatve fttng usng deal observer models Based on sgnal
More information15-381: Artificial Intelligence. Regression and cross validation
15-381: Artfcal Intellgence Regresson and cross valdaton Where e are Inputs Densty Estmator Probablty Inputs Classfer Predct category Inputs Regressor Predct real no. Today Lnear regresson Gven an nput
More informationECE559VV Project Report
ECE559VV Project Report (Supplementary Notes Loc Xuan Bu I. MAX SUM-RATE SCHEDULING: THE UPLINK CASE We have seen (n the presentaton that, for downlnk (broadcast channels, the strategy maxmzng the sum-rate
More informationRegularized Discriminant Analysis for Face Recognition
1 Regularzed Dscrmnant Analyss for Face Recognton Itz Pma, Mayer Aladem Department of Electrcal and Computer Engneerng, Ben-Guron Unversty of the Negev P.O.Box 653, Beer-Sheva, 845, Israel. Abstract Ths
More informationBayesian predictive Configural Frequency Analysis
Psychologcal Test and Assessment Modelng, Volume 54, 2012 (3), 285-292 Bayesan predctve Confgural Frequency Analyss Eduardo Gutérrez-Peña 1 Abstract Confgural Frequency Analyss s a method for cell-wse
More informationLecture 12: Classification
Lecture : Classfcaton g Dscrmnant functons g The optmal Bayes classfer g Quadratc classfers g Eucldean and Mahalanobs metrcs g K Nearest Neghbor Classfers Intellgent Sensor Systems Rcardo Guterrez-Osuna
More informationComparison of Regression Lines
STATGRAPHICS Rev. 9/13/2013 Comparson of Regresson Lnes Summary... 1 Data Input... 3 Analyss Summary... 4 Plot of Ftted Model... 6 Condtonal Sums of Squares... 6 Analyss Optons... 7 Forecasts... 8 Confdence
More informationSpeech and Language Processing
Speech and Language rocessng Lecture 3 ayesan network and ayesan nference Informaton and ommuncatons Engneerng ourse Takahro Shnozak 08//5 Lecture lan (Shnozak s part) I gves the frst 6 lectures about
More informationThe Gaussian classifier. Nuno Vasconcelos ECE Department, UCSD
he Gaussan classfer Nuno Vasconcelos ECE Department, UCSD Bayesan decson theory recall that we have state of the world X observatons g decson functon L[g,y] loss of predctng y wth g Bayes decson rule s
More informationSystem identifications by SIRMs models with linear transformation of input variables
ORIGINAL RESEARCH System dentfcatons by SIRMs models wth lnear transformaton of nput varables Hrofum Myama, Nortaka Shge, Hrom Myama Graduate School of Scence and Engneerng, Kagoshma Unversty, Japan Receved:
More informationOther NN Models. Reinforcement learning (RL) Probabilistic neural networks
Other NN Models Renforcement learnng (RL) Probablstc neural networks Support vector machne (SVM) Renforcement learnng g( (RL) Basc deas: Supervsed dlearnng: (delta rule, BP) Samples (x, f(x)) to learn
More informationStatistical Foundations of Pattern Recognition
Statstcal Foundatons of Pattern Recognton Learnng Objectves Bayes Theorem Decson-mang Confdence factors Dscrmnants The connecton to neural nets Statstcal Foundatons of Pattern Recognton NDE measurement
More informationOutline. Bayesian Networks: Maximum Likelihood Estimation and Tree Structure Learning. Our Model and Data. Outline
Outlne Bayesan Networks: Maxmum Lkelhood Estmaton and Tree Structure Learnng Huzhen Yu janey.yu@cs.helsnk.f Dept. Computer Scence, Unv. of Helsnk Probablstc Models, Sprng, 200 Notces: I corrected a number
More informationDepartment of Computer Science Artificial Intelligence Research Laboratory. Iowa State University MACHINE LEARNING
MACHINE LEANING Vasant Honavar Bonformatcs and Computatonal Bology rogram Center for Computatonal Intellgence, Learnng, & Dscovery Iowa State Unversty honavar@cs.astate.edu www.cs.astate.edu/~honavar/
More informationLearning from Data 1 Naive Bayes
Learnng from Data 1 Nave Bayes Davd Barber dbarber@anc.ed.ac.uk course page : http://anc.ed.ac.uk/ dbarber/lfd1/lfd1.html c Davd Barber 2001, 2002 1 Learnng from Data 1 : c Davd Barber 2001,2002 2 1 Why
More informationThe Expectation-Maximization Algorithm
The Expectaton-Maxmaton Algorthm Charles Elan elan@cs.ucsd.edu November 16, 2007 Ths chapter explans the EM algorthm at multple levels of generalty. Secton 1 gves the standard hgh-level verson of the algorthm.
More informationMLE and Bayesian Estimation. Jie Tang Department of Computer Science & Technology Tsinghua University 2012
MLE and Bayesan Estmaton Je Tang Department of Computer Scence & Technology Tsnghua Unversty 01 1 Lnear Regresson? As the frst step, we need to decde how we re gong to represent the functon f. One example:
More informationFeature Selection: Part 1
CSE 546: Machne Learnng Lecture 5 Feature Selecton: Part 1 Instructor: Sham Kakade 1 Regresson n the hgh dmensonal settng How do we learn when the number of features d s greater than the sample sze n?
More informationChapter 11: Simple Linear Regression and Correlation
Chapter 11: Smple Lnear Regresson and Correlaton 11-1 Emprcal Models 11-2 Smple Lnear Regresson 11-3 Propertes of the Least Squares Estmators 11-4 Hypothess Test n Smple Lnear Regresson 11-4.1 Use of t-tests
More informationGaussian Mixture Models
Lab Gaussan Mxture Models Lab Objectve: Understand the formulaton of Gaussan Mxture Models (GMMs) and how to estmate GMM parameters. You ve already seen GMMs as the observaton dstrbuton n certan contnuous
More informationCHAPTER-5 INFORMATION MEASURE OF FUZZY MATRIX AND FUZZY BINARY RELATION
CAPTER- INFORMATION MEASURE OF FUZZY MATRI AN FUZZY BINARY RELATION Introducton The basc concept of the fuzz matr theor s ver smple and can be appled to socal and natural stuatons A branch of fuzz matr
More informationThe Chaotic Robot Prediction by Neuro Fuzzy Algorithm (2) = θ (3) = ω. Asin. A v. Mana Tarjoman, Shaghayegh Zarei
The Chaotc Robot Predcton by Neuro Fuzzy Algorthm Mana Tarjoman, Shaghayegh Zare Abstract In ths paper an applcaton of the adaptve neurofuzzy nference system has been ntroduced to predct the behavor of
More informationResource Allocation and Decision Analysis (ECON 8010) Spring 2014 Foundations of Regression Analysis
Resource Allocaton and Decson Analss (ECON 800) Sprng 04 Foundatons of Regresson Analss Readng: Regresson Analss (ECON 800 Coursepak, Page 3) Defntons and Concepts: Regresson Analss statstcal technques
More informationDifference Equations
Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1
More informationLinear Regression Analysis: Terminology and Notation
ECON 35* -- Secton : Basc Concepts of Regresson Analyss (Page ) Lnear Regresson Analyss: Termnology and Notaton Consder the generc verson of the smple (two-varable) lnear regresson model. It s represented
More information