Time Series Forecasting Using Artificial Neural Networks under Dempster Shafer Evidence Theory and Trimmed-winsorized Means
International Journal of Information and Computation Technology. ISSN. Volume 3, Number 5 (2013). International Research Publications House, irphouse.com/ijict.htm

Time Series Forecasting Using Artificial Neural Networks under Dempster Shafer Evidence Theory and Trimmed-Winsorized Means

Hari Teja A.1 and S. Vasundra2
1 CSE Department, JNTUACEA, AP, INDIA.
2 CSE Department, JNTUACEA, AP, INDIA.

Abstract

Artificial neural networks are well suited to many tasks in pattern recognition and machine learning. In contrast to traditional approaches to time series analysis, an artificial neural network needs little prior information about the time series and can be applied to a wide variety of problems. For this application, we conducted experiments to obtain appropriate parameters for a forecasting system. The artificial neural networks found in this way provided better forecasting performance under a well-known feature selection approach. Accurate time series forecasting matters because the past continues to influence the future, and it lets us plan our daily activities. At present, a large literature is developing on the use of artificial neural networks (ANNs) in forecasting applications. In this context, the proposed ANN-based time series forecasting uses Dempster Shafer evidence theory and trimmed-Winsorized means for feature selection. A comparative study between these two methods, on a set of reference time series, is presented.

Keywords: Time Series Forecasting; Dempster Shafer; Trimmed-Winsorized; Artificial Neural Networks; Machine Learning.

1. Introduction

Normally, the parameters of the classical approach [7] are based on the frequency spectrum and the autocorrelation of the time series. The use of artificial neural networks for time series analysis relies only on the data that were observed. An artificial neural network is powerful enough to represent any kind of time series, since multilayer feed
384 Hari Teja A. & S. Vasundra

forward networks with a sufficient number of hidden units and one or more hidden layers can approximate any function to any desired accuracy [5, 6]. The forward and backward passes of a fully connected feedforward network can be implemented with inner and outer products of matrices and vectors in a few lines of APL code. For this application, we chose a fully connected, layered, feedforward artificial neural network with a single hidden layer and the back-propagation learning algorithm [1, 2]. The foundations of this learning method for neural nets were laid by [2].

Artificial neural networks consist of many simple processing units (called processing elements or neurons) grouped in layers. Each layer is identified by an index l = 0, ..., m. The processing elements are interconnected as follows: communication is permitted only between processing elements of adjacent layers [3]; neurons within a layer cannot communicate. Each neuron i has a particular activation level a_i, as shown in Fig. 1. Data is processed by the network through the exchange of activation levels between neurons.

Figure 1: Interchanging of activation values among neurons.

The output value of the i-th neuron in layer l is denoted by o_i^(l) and is calculated as

    o_i^(l) = g(a_i^(l)),

where g is a monotonically increasing function. For instance, g may be chosen as the squashing function g(y) = 1 / (1 + e^(-y)).

The activation level a_i^(l) of neuron i in layer l is given by

    a_i^(l) = f(u_i^(l)),
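The squashing function just described can be sketched in a few lines of Python (the paper itself uses APL; the function name here is my own):

```python
import math

def squash(y):
    """Logistic squashing function g(y) = 1 / (1 + e^-y)."""
    return 1.0 / (1.0 + math.exp(-y))

# g is monotonically increasing and maps any real input into (0, 1)
print(squash(0.0))  # 0.5
```

Because the output always lies in (0, 1), the time series values must be scaled into the same interval before training, which is exactly what the windowing step later in the paper does.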
where f is the activation function (in our case the characteristic function is used). The net input u_i^(l) of neuron i in layer l is computed as

    u_i^(l) = Σ_{j=1}^{n^(l-1)} w_ij^(l) o_j^(l-1) − θ_i^(l),

where w_ij^(l) is the weight of the connection from neuron j in layer l−1 to neuron i in layer l, o_j^(l-1) is the output of neuron j in layer l−1, and θ_i^(l) is a bias value that is subtracted from the sum of the weighted activations.

2. Implementation

The time series modeling and forecasting system was implemented on an auxiliary workstation with a Core 2 Duo processor and 4 GB of RAM. The system consists of two main components:

- A toolkit of APL functions that drive the neural network and log the parameters and results of the simulation runs to APL component files.
- A graphical user interface that allows the user to navigate through the simulations and to compare the actual time series with the series generated by the neural network.

Figure 2: User interface of the forecasting system.

Fig. 2 shows a screen dump of the interface: the dashed line displays the real time series and the solid line represents the network's output. The forecast data is separated from the historic data (the network's training set) by the vertical bar in the
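The forward pass defined by the formulas above, with the bias θ subtracted from the weighted sum, can be sketched as follows (a minimal NumPy illustration, not the authors' APL implementation; all names are my own):

```python
import numpy as np

def sigmoid(y):
    return 1.0 / (1.0 + np.exp(-y))

def forward(x, W_hidden, theta_hidden, W_out, theta_out):
    """One forward pass through a fully connected single-hidden-layer net:
    u_i = sum_j w_ij * o_j - theta_i, followed by o_i = g(u_i)."""
    hidden = sigmoid(W_hidden @ x - theta_hidden)
    return sigmoid(W_out @ hidden - theta_out)

rng = np.random.default_rng(0)
x = rng.random(4)  # input window of 4 past (scaled) values
out = forward(x, rng.random((3, 4)), rng.random(3),
              rng.random((1, 3)), rng.random(1))
print(out)  # a single forecast value in (0, 1)
```

As the paper notes, both the forward and backward passes reduce to matrix-vector products, which is why an array language like APL (or NumPy) expresses them so compactly.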
right part of the chart. The menu in the top left part of the window enables the user to select a view of the network's forecasting capacity at different states throughout the entire learning phase. By looking at the log files of the simulator runs, current and previous results can be assessed and compared.

2.1 The Training Sets

As a test bed for our forecasting program we used two well-known time series from [7]: the monthly totals of international flight passengers (in thousands of passengers) from 1949 to 1960, and the daily closing prices of a share stock.

Table 1: Properties of the time series: the number of observations n, the mean μ, and the standard deviation σ of the Share Price and Grid power unit price series.

The following section explains how a neural network learns a time series.

2.2 Back-Propagation Algorithm

The neural network sees the time series X_1, ..., X_m in the form of many mappings of an input vector to an output value. This technique was presented in [5].

Figure 3: Learning a Time Series.
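The mapping of a time series into (input window, target) pairs can be sketched as follows (a hypothetical helper for illustration, not the authors' code):

```python
def make_windows(series, s):
    """Slide a window of length s over the series; each window is an
    input vector and the value that follows it is the training target."""
    pairs = []
    for t in range(len(series) - s):
        pairs.append((series[t:t + s], series[t + s]))
    return pairs

series = [112, 118, 132, 129, 121, 135]  # e.g. monthly passenger totals
pairs = make_windows(series, 3)
print(pairs[0])  # ([112, 118, 132], 129)
```

Each pair is one training pattern: the window becomes the activations of the input layer and the target is compared against the single output unit.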
A number of adjoining data points of the time series (the input window X_t to X_{t-s}), as shown in Fig. 3, are mapped to the interval [0, 1] and used as activation levels for the units of the input layer. The error used by the back-propagation learning algorithm is then computed by comparing the value of the output unit with the value of the time series at time t+1. Training a neural network with the back-propagation algorithm generally requires that the patterns of the whole training set (called one epoch) be presented many times. To learn the time series data, the patterns were presented in random order; as reported in [5], picking a random position for each pattern's input window yields better network performance and helps avoid local minima.

3. Comparison between Divergent Feature Selection Models

To compare the results obtained under the feature selection methods, Dempster Shafer evidence theory and trimmed-Winsorized means, the procedure uses the models described by Box and Jenkins in Part V of their classic [7]. The approach is known as an autoregressive integrated moving average model of order (p, d, q). It is described by Eqs. (1)-(2):

    a(z) ∇^d X_t = b(z) U_t    (1)

where X_t stands for the n time-ordered values of a time series, with t = 1, ..., n observations, and U_t is a sequence of random values called a white noise process. The backward difference operator ∇ is defined as

    ∇X_t = X_t − X_{t−1} = (1 − z) X_t    (2)

We fitted these feature selection models to each time series and let them predict the next 20 observations. The last 20 observations were dropped from the time series and used to calculate the prediction error of the models.
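The backward difference operator ∇, applied d times as in the ARIMA model above, can be sketched as (a minimal illustration with my own function name):

```python
def backward_difference(x, d=1):
    """Apply nabla^d: nabla x_t = x_t - x_{t-1}, repeated d times."""
    for _ in range(d):
        x = [x[t] - x[t - 1] for t in range(1, len(x))]
    return x

print(backward_difference([1, 4, 9, 16]))       # [3, 5, 7]
print(backward_difference([1, 4, 9, 16], d=2))  # [2, 2]
```

Each application shortens the series by one value and removes one polynomial order of trend, which is the role d plays in the (p, d, q) model.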
The following feature selection models were fitted. For the power unit price time series, after a logarithmic transformation,

    (1 − z)(1 − z^12) X_t = (1 − θ_1 z)(1 − θ_2 z^12) U_t

and for the Share time series,

    (1 − z) X_t = (1 − θ z) U_t

The forecast errors of the plain artificial neural network (ANN) and of the artificial neural network using the logarithmic transformation and differencing (ANN log) are compared under the chosen feature selection models. The artificial neural network using the logarithmically transformed and differenced time series outperformed on both time series, whereas the "simple" artificial neural network predicted more precisely only on the Share time series [4]. This behavior can be explained as follows: the larger the range of values in the time series, the lower the precision on the untransformed data set. The logarithmic and differencing transformations helped to remove the trend and mapped the time series data into a smaller range.
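The logarithmic transformation followed by the regular (1 − z) and seasonal (1 − z^12) differencing used for the power unit price model can be sketched as (a hypothetical helper for illustration):

```python
import math

def log_seasonal_diff(x, season=12):
    """log transform, then apply (1 - z) and (1 - z^season) differencing."""
    logged = [math.log(v) for v in x]
    d1 = [logged[t] - logged[t - 1] for t in range(1, len(logged))]
    return [d1[t] - d1[t - season] for t in range(season, len(d1))]

series = [100 + 10 * math.sin(t) + t for t in range(30)]
transformed = log_seasonal_diff(series)
print(len(transformed))  # 30 - 1 - 12 = 17
```

The transformed series is both shorter and much narrower in range, which is why the ANN trained on it reaches higher precision than one trained on the raw values.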
Table 2: Forecasting errors for the ANN under Dempster Shafer evidence theory and trimmed-Winsorized means, for the power unit price and Share price time series.

Figure 4: Fault proneness of the ANN under Dempster Shafer evidence theory and trimmed-Winsorized means.

4. Conclusion and Future Work

We have presented a forecasting program for univariate time series that uses artificial neural networks. These networks proved to be viable alternatives to classical approaches. The system can be used together with other approaches for time series analysis or as a standalone tool. Future work may include comparison with additional TSA (time series analysis) approaches, development of hybrid procedures that combine the strengths of standard methods with artificial neural networks, and the application of our program to multivariate time series.

References

[1] Weizhong Yan, "Toward Automatic Time-Series Forecasting Using Neural Networks," IEEE Transactions on Neural Networks and Learning Systems, vol. 23, no. 7, July 2012.
[2] G. P. Zhang and D. M. Kline, "Quarterly time series forecasting with neural networks," IEEE Trans. Neural Netw., vol. 18, no. 6, Jun.
[3] S. Crone and P. Graffeille, "An evaluation framework for publications on artificial neural networks in sales forecasting," in Proc. Int. Conf. Artificial Intelligence, Las Vegas, NV, Jun. 2004.
[4] S. D. Balkin and J. K. Ord, "Automatic neural network modeling for univariate time series," Int. J. Forecast., vol. 16, no. 4, 2000.
[5] Kurt Hornik, Maxwell Stinchcombe, and Halbert White, "Multilayer Feedforward Networks are Universal Approximators," Neural Networks, vol. 2.
[6] Halbert White, "Economic prediction using neural networks: the case of IBM daily stock returns," in Proceedings of the IEEE International Conference on Neural Networks, vol. II.
[7] George E. P. Box and Gwilym M. Jenkins, Time Series Analysis: Forecasting and Control. Holden-Day, 500 Sansome Street, San Francisco, California, 1976.
Lecture 6: Introducton to Lnear Regresson An Manchakul amancha@jhsph.edu 24 Aprl 27 Lnear regresson: man dea Lnear regresson can be used to study an outcome as a lnear functon of a predctor Example: 6
More informationHome Assignment 4. Figure 1: A sample input sequence for NER tagging
Advanced Methods n NLP Due Date: May 22, 2018 Home Assgnment 4 Lecturer: Jonathan Berant In ths home assgnment we wll mplement models for NER taggng, get famlar wth TensorFlow and learn how to use TensorBoard
More informationFundamentals of Neural Networks
Fundamentals of Neural Networks Xaodong Cu IBM T. J. Watson Research Center Yorktown Heghts, NY 10598 Fall, 2018 Outlne Feedforward neural networks Forward propagaton Neural networks as unversal approxmators
More informationDeep Belief Network using Reinforcement Learning and its Applications to Time Series Forecasting
Deep Belef Network usng Renforcement Learnng and ts Applcatons to Tme Seres Forecastng Takaom HIRATA, Takash KUREMOTO, Masanao OBAYASHI, Shngo MABU Graduate School of Scence and Engneerng Yamaguch Unversty
More informationTransient Stability Assessment of Power System Based on Support Vector Machine
ransent Stablty Assessment of Power System Based on Support Vector Machne Shengyong Ye Yongkang Zheng Qngquan Qan School of Electrcal Engneerng, Southwest Jaotong Unversty, Chengdu 610031, P. R. Chna Abstract
More informationSuppose that there s a measured wndow of data fff k () ; :::; ff k g of a sze w, measured dscretely wth varable dscretzaton step. It s convenent to pl
RECURSIVE SPLINE INTERPOLATION METHOD FOR REAL TIME ENGINE CONTROL APPLICATIONS A. Stotsky Volvo Car Corporaton Engne Desgn and Development Dept. 97542, HA1N, SE- 405 31 Gothenburg Sweden. Emal: astotsky@volvocars.com
More informationA Weighted Utility Framework for Mining Association Rules
A Weghted Utlty Framework for Mnng Assocaton Rules M Sulaman Khan 1, 2, Maybn Muyeba 1, Frans Coenen 2 1 School of Computng, Lverpool Hope Unversty, UK 2 Department of Computer Scence, Unversty of Lverpool,
More informationA Self-Adaptive Prediction Model for Dynamic Pricing and Inventory Control
A Self-Adaptve Predcton Model for Dynamc Prcng and Inventory Control Hejun Lang* Shangha Unversty of Fnance and Economcs, School of Informaton Management and Engneerng, Shangha 00, Chna *E-mal: langhejun@fudan.edu.cn
More information2016 Wiley. Study Session 2: Ethical and Professional Standards Application
6 Wley Study Sesson : Ethcal and Professonal Standards Applcaton LESSON : CORRECTION ANALYSIS Readng 9: Correlaton and Regresson LOS 9a: Calculate and nterpret a sample covarance and a sample correlaton
More informationEfficient Weather Forecasting using Artificial Neural Network as Function Approximator
Effcent Weather Forecastng usng Artfcal Neural Network as Functon Approxmator I. El-Fegh, Z. Zuba and S. Abozgaya Abstract Forecastng s the referred to as the process of estmaton n unknown stuatons. Weather
More informationNeural Networks. Neural Network Motivation. Why Neural Networks? Comments on Blue Gene. More Comments on Blue Gene
Motvaton for non-lnear Classfers Neural Networs CPS 27 Ron Parr Lnear methods are wea Mae strong assumptons Can only express relatvely smple functons of nputs Comng up wth good features can be hard Why
More informationA neural network with localized receptive fields for visual pattern classification
Unversty of Wollongong Research Onlne Faculty of Informatcs - Papers (Archve) Faculty of Engneerng and Informaton Scences 2005 A neural network wth localzed receptve felds for vsual pattern classfcaton
More informationIntroduction to Regression
Introducton to Regresson Dr Tom Ilvento Department of Food and Resource Economcs Overvew The last part of the course wll focus on Regresson Analyss Ths s one of the more powerful statstcal technques Provdes
More informationA New Evolutionary Computation Based Approach for Learning Bayesian Network
Avalable onlne at www.scencedrect.com Proceda Engneerng 15 (2011) 4026 4030 Advanced n Control Engneerng and Informaton Scence A New Evolutonary Computaton Based Approach for Learnng Bayesan Network Yungang
More informationA Bayes Algorithm for the Multitask Pattern Recognition Problem Direct Approach
A Bayes Algorthm for the Multtask Pattern Recognton Problem Drect Approach Edward Puchala Wroclaw Unversty of Technology, Char of Systems and Computer etworks, Wybrzeze Wyspanskego 7, 50-370 Wroclaw, Poland
More informationCopyright 2017 by Taylor Enterprises, Inc., All Rights Reserved. Adjusted Control Limits for P Charts. Dr. Wayne A. Taylor
Taylor Enterprses, Inc. Control Lmts for P Charts Copyrght 2017 by Taylor Enterprses, Inc., All Rghts Reserved. Control Lmts for P Charts Dr. Wayne A. Taylor Abstract: P charts are used for count data
More informationDO NOT OPEN THE QUESTION PAPER UNTIL INSTRUCTED TO DO SO BY THE CHIEF INVIGILATOR. Introductory Econometrics 1 hour 30 minutes
25/6 Canddates Only January Examnatons 26 Student Number: Desk Number:...... DO NOT OPEN THE QUESTION PAPER UNTIL INSTRUCTED TO DO SO BY THE CHIEF INVIGILATOR Department Module Code Module Ttle Exam Duraton
More information2 Laminar Structure of Cortex. 4 Area Structure of Cortex
Networks!! Lamnar Structure of Cortex. Bology: The cortex. Exctaton: Undrectonal (transformatons) Local vs. dstrbuted representatons Bdrectonal (pattern completon, amplfcaton). Inhbton: Controllng bdrectonal
More informationApplication research on rough set -neural network in the fault diagnosis system of ball mill
Avalable onlne www.ocpr.com Journal of Chemcal and Pharmaceutcal Research, 2014, 6(4):834-838 Research Artcle ISSN : 0975-7384 CODEN(USA) : JCPRC5 Applcaton research on rough set -neural network n the
More information