SHORT-TERM PREDICTION OF AIR POLLUTION USING MULTI-LAYER PERCEPTRON & GAMMA NEURAL NETWORKS
Control 2004, University of Bath, UK, September 2004. ID-6

SHORT-TERM PREDICTION OF AIR POLLUTION USING MULTI-LAYER PERCEPTRON & GAMMA NEURAL NETWORKS

M. Aliyari Shoorehdeli (PhD student, Dept. of Elect. Eng., K. N. Toosi University of Technology, Tehran, Iran; Aliyari@eetd.kntu.ac.ir), M. Teshnehlab (Atmospheric Sciences and Meteorological Research Center & Dept. of Elect. Eng., K. N. Toosi University of Technology, Tehran, Iran; asmerc@rmet.net), A. Khaki Sedigh (Dept. of Elect. Eng., K. N. Toosi University of Technology, Tehran, Iran; Sedigh@eetd.kntu.ac.ir)

Keywords: Time Series, Neural Network, Multi-Layer Perceptron, Gamma Memory, Memory Depth Parameter.

Abstract

This paper considers the problem of air-pollution data prediction using multi-layer perceptron and gamma-memory neural networks. Air-pollution data are available in the form of time series, and these real data are used to train the networks and to predict future air-pollution conditions. Because of the fast dynamics and complex behavior of the process governing air pollution, modeling and predicting this process is difficult. Results are also provided to compare the two proposed predictors.

1 Introduction

Air pollution is a very complex process affected by many different factors, so predicting such fast-dynamic data is very difficult. In principle, with the help of measured data and by solving the related equations, the atmospheric processes can be modeled for each region. It should be noted, however, that atmospheric factors such as temperature, pressure, humidity, rain, wind, etc. [8] cause the equations to become unbalanced and render the maps based on the spread of pollutants useless. Even when the atmospheric factors are taken into account, other factors, such as the growth in automobile manufacturing, the architecture of the cities and many more, seriously degrade the model. The prediction methods currently used by the meteorological organization are based on the analysis of predetermined maps prepared by different centers using data provided by land stations and satellites.
These data are then used by experts to predict the air pollution. This method has serious shortcomings due to its human-centered nature. We have tried to predict air pollution using two different neural-network structures. In this paper we used real data for Arak city, collected every half hour during October 2003 (Table 1).

Gas                          Unit
Carbon Monoxide (CO)         µg/m³
Particulate Matter (PM-10)   ppm (particles per million)

Table 1: Pollution parameters analyzed and their units.

2 Memory Multi-Layer Perceptron (MLP) Neural Networks

Multi-layer neural networks (NNs) are an important class of NNs. Typically, they consist of a set of sensory neurons that constitute an input layer, one or more hidden layers, and an output layer of computation neurons. The input signals propagate through the network in a feed-forward direction. These networks are commonly referred to as multi-layer perceptrons (MLPs). Neural networks are used for predicting time series, especially when the stationarity conditions required by classical techniques do not hold and when the dynamics of the time series are fast [8, 6, 7, 5]. An MLP with one hidden layer and enough hidden neurons can approximate a function with suitable accuracy [5]. In general, the optimal number of hidden neurons is not known; a desirable structure is usually reached based on experience.

Perhaps the simplest kind of memory for a neural network is the use of past data as inputs, as in Figure 1. This memory terminal works as a delay line at the input. This kind of terminal, which is used often and is one of the simplest and most useful forms of short-term memory, is called a tapped delay line (TDL) memory.

Figure 1: Ordinary tapped delay line memory of order p.

Figures 1 and 2 illustrate the structure of the delay-line terminal and how it connects to the MLP network. In other words, the input x(t), along with the p inputs before it, that is, x(t-1) to x(t-p), is stored in a delay-line memory of
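As a concrete illustration of the tapped-delay-line memory just described, the following Python sketch (our own illustration, not the authors' code; function and variable names are hypothetical) builds TDL input vectors and one-step-ahead targets from a series:

```python
import numpy as np

def tapped_delay_line(series, p):
    """Build input/target pairs for one-step-ahead prediction.

    Each input row is [x(t), x(t-1), ..., x(t-p)] and the
    corresponding target is x(t+1).
    """
    X, y = [], []
    for t in range(p, len(series) - 1):
        X.append(series[t - p:t + 1][::-1])  # x(t) down to x(t-p)
        y.append(series[t + 1])
    return np.array(X), np.array(y)

series = np.sin(0.3 * np.arange(50))  # toy stand-in for the half-hourly data
X, y = tapped_delay_line(series, p=3)
print(X.shape, y.shape)  # (46, 4) (46,)
```

Each row of X then feeds the MLP input layer, and the network is trained so that its output matches the corresponding entry of y.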
order p; the network output y(t) is trained so that it equals x̂(t+1). In this way the prediction is one step ahead [3].

Figure 2: Focused time-lagged feed-forward network (TLFN).

Here the training is based on minimization of the error by gradient descent (back-propagation, BP). To further improve the training of the network, all inputs and outputs have been normalized. First, instead of using a plain delay-line memory for the network inputs, the differences of two successive values were used, i.e. the first difference x(t) - x(t-1) was used as network input, so the whole input is:

X = [x(t), x(t) - x(t-1), x(t-1) - x(t-2), x(t-2) - x(t-3)]^T    (1)

Here we have considered four inputs, and the hidden layer has five neurons. Next, we trained the network so that its output reaches x(t+1) - x(t), i.e. the first network (MLP1). The prediction results are illustrated in Figures 3 and 4.

Figure 3: One-step-ahead prediction of CO by MLP1.

Figure 4: One-step-ahead prediction of PM-10 (Particulate Matter) by MLP1.

Then we considered the second network (MLP2), with the second difference x(t+1) - 2x(t) + x(t-1) as the desired output, trained with the same BP method. The results are illustrated in Figures 5 and 6. (It should be noted that in all network layers except the input we used sigmoid transfer functions.)

Figure 5: One-step-ahead prediction of CO by MLP2.

Figure 6: One-step-ahead prediction of PM-10 (Particulate Matter) by MLP2.

3 Gamma Network and its Forward Calculations

It is advantageous to weight the memory and its depth used in neural networks. So we must move from a simple TDL memory to a case where past data can be weighted, by defining a parameter that, first, determines the memory depth and, second, can be trained at each step together with the other network parameters. The memory structures in neural networks are meant to represent human memory. For example, a person may not remember last week's temperature, but he or she remembers whether it was colder or warmer than today.
Human short-term memory has various features; for instance, its depth is changeable. Events with greater influence stay longer in memory, while unimportant events are soon forgotten. This feature can easily be employed by storing only a small part of the data. Figure 7 illustrates this kind of memory, which
is called the gamma memory. The depth of this memory is adjustable through the parameter µ. Another point of concern is the value of µ: if 0 < µ < 1, the filter is stable, with poles inside the unit circle on the right half of the z-plane; and if 1 < µ < 2, the filter is stable, with poles inside the unit circle on the left half of the z-plane. Gamma synapses combine attractive properties of FIR synapses with some of the general power of an IIR filter [10].

Figure 7: Gamma memory unit (Kth order).

Various neural architectures have been proposed for modeling time series. Both the IIR (infinite impulse response)/recurrent and the FIR (finite impulse response)/feed-forward time-delay approaches have numerous drawbacks, which either limit the modeling capacity or suffer from instabilities during training. Until the early nineties it was unfeasible to combine the stability of FIR-type nets with the benefit of few parameters over large time scopes offered by IIR-type nets. A solution to this problem was found in the early nineties by Principe and de Vries: the so-called gamma networks. Another advantage of the gamma network is its ability to model a reference system while measuring its inputs and outputs on-line [4, 1].

Figure 8: A gamma memory synapse.

The neural networks used in this paper have three layers, with a nonlinear sigmoid function in the hidden and output layers. As shown in Figures 1, 7 and 8, the transfer function µz⁻¹/(1 - (1-µ)z⁻¹), equivalently µ/(z - (1-µ)), is used in place of the plain delay z⁻¹. The following calculation is carried out based on Figure 8:

x_1(z) = [µz⁻¹ / (1 - (1-µ)z⁻¹)] x_0(z),  i.e.  x_1(t) = (1-µ) x_1(t-1) + µ x_0(t-1)    (2)

From equation (2) it is clear that µ, known as the memory-depth factor, weights the new input x_0(t-1) against the stored value x_1(t-1): the more important the recent input is, the larger µ grows.

3.1 Forward Calculation

Based on Figure 8 we can write the output as

y = w_0 x_0 + w_1 x_1 + ... + w_K x_K = Σ_{l=0}^{K} w_l x_l    (3)

where

x_l(t) = (1-µ) x_l(t-1) + µ x_{l-1}(t-1),  l ≥ 1    (4)

and, when l = 0, x_0(t) = u(t). Based on Figures 7 and 8 we also have

X = [x(t), x(t-1), ..., x(t-m+1)]^T    (5)
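The gamma memory recursion of equation (4) can be sketched as follows (an illustrative implementation under our own naming, not the authors' code). Setting µ = 1 removes the feedback and reduces the unit to an ordinary tapped delay line:

```python
import numpy as np

def gamma_memory(u, K, mu):
    """Run a Kth-order gamma memory over the input signal u.

    x[0](t) = u(t); for l >= 1 (eq. 4):
    x[l](t) = (1 - mu) * x[l](t-1) + mu * x[l-1](t-1)
    Returns an array of shape (len(u), K + 1) with all taps over time.
    """
    x = np.zeros(K + 1)
    taps = []
    for t in range(len(u)):
        x_prev = x.copy()
        x[0] = u[t]
        for l in range(1, K + 1):
            x[l] = (1 - mu) * x_prev[l] + mu * x_prev[l - 1]
        taps.append(x.copy())
    return np.array(taps)

u = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # unit impulse
taps = gamma_memory(u, K=2, mu=1.0)       # mu = 1 -> pure delay line
print(taps[:, 1])  # tap 1 is u delayed by one step: [0. 1. 0. 0. 0.]
```

For 0 < µ < 1 the impulse response of each tap is smeared over time instead, which is what gives the gamma memory its adjustable depth with a fixed number of taps.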
This weighting is due to the multiplication by µ.

For the hidden layer we have

[net] = [W] · [X]    (6)

[F(net)] = F([W] · [X])    (7)

where [W] is the matrix of hidden-layer weights, [X] is the input vector of equation (5) and F(·) applies the sigmoid function element-wise. For the output layer we have

o(t) = [V]^T · [F(net)]    (8)

where [V] is the vector of output-layer weights. These calculations are written for a single-output network and generalize easily to multiple outputs.

Note: with one gamma memory of order K in one synapse, only one µ is used to determine its memory depth. Each of the gamma taps also has a weight that plays the same role as the weights in the memoryless MLP networks. In fact, each synapse does not have just one weight; it has K+1 controllable weights. Moreover, each synapse has one µ for controlling the depth of its memory.
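A minimal sketch of the forward pass of equations (6) to (8), assuming a sigmoid hidden layer and a single linear output (the dimensions and names below are our own illustration, not taken from the paper):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def forward(W, V, X):
    """Eqs. (6)-(8): net = W @ X, hidden = F(net), o = V^T hidden."""
    net = W @ X            # eq. (6): hidden-layer pre-activations
    hidden = sigmoid(net)  # eq. (7): element-wise sigmoid
    return V @ hidden      # eq. (8): single output as a weighted sum

rng = np.random.default_rng(1)
W = rng.normal(size=(5, 4))   # 4 inputs, 5 hidden neurons, as in the paper's MLP
V = rng.normal(size=5)
X = rng.normal(size=4)        # stands in for the memory taps of eq. (5)
o = forward(W, V, X)
print(float(o))
```

In the gamma network the entries of X are the tap outputs x_l(t) of equation (4) rather than raw delayed samples; the matrix algebra is otherwise the same.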
3.2 Training the Gamma Network

Minimization of the sum of squared errors is used as the supervised training rule:

E = (1/2) Σ_k e_k² = (1/2) Σ_k (d_k(n) - o_k(n))²    (9)

where d and o are the desired and actual outputs, respectively. In this method the input layer has four neurons, the hidden layer seven, and the output layer one. The training rate of the weights (w) equals 0.5, and a separate, smaller rate is used for training the memory depth µ. The calculation of the weight updates starts from the outer layer. First, considering Figure 8, we can see:

∂y/∂w_l = x_l    (10)

Also, concerning equation (3), we see:

∂y/∂µ = Σ_l w_l ∂x_l/∂µ = Σ_l w_l α_l    (11)

where

α_l(t) = (1-µ) α_l(t-1) + µ α_{l-1}(t-1) + x_{l-1}(t-1) - x_l(t-1),  l ≥ 1    (12)

In equation (12), α_l = 0 when t = 0 or l = 0. Equation (12) is derived from equation (4), since

α_l(t) = ∂x_l(t)/∂µ = (1-µ) ∂x_l(t-1)/∂µ - x_l(t-1) + µ ∂x_{l-1}(t-1)/∂µ + x_{l-1}(t-1)    (13)

which gives equation (12). Finally, as the last calculation,

∂y/∂u = ∂y/∂x_0 = w_0    (14)

which follows directly from equation (3), since x_0(t) = u(t). Considering equations (10), (11) and (14), all the derivatives necessary for the back-propagation method (chain rule) are available.

One point worth mentioning here is the relation of the TDL to the gamma memory. The only difference between a gamma and a tapped-delay-line (TDL) network is that each section of the gamma memory unit has a feedback connection. If we set µ = 1, the gamma filter reduces to a TDL. The back-propagation calculations for the TDL are easily obtained from the equations written for the gamma network; the only difference is that the TDL network does not train µ.

Figures 9 and 10 illustrate the prediction by a three-layer gamma network. Memory units exist both in the hidden and in the output layers. The output-layer neurons are linear because of the range of our target data. Of course, the output layer could also be nonlinear, but then the target values would have to be normalized.

Figure 9: One-step-ahead prediction of CO by the gamma network.

Figure 10: One-step-ahead prediction of PM-10 (Particulate Matter) by the gamma network.
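The memory-depth gradient of equations (11) and (12) can be checked numerically. The sketch below (our own illustration, not the authors' code) runs the sensitivity recursion α_l = ∂x_l/∂µ alongside the gamma memory of equation (4) and compares it with a central-difference estimate:

```python
import numpy as np

def gamma_taps_and_alphas(u, K, mu):
    """Run the gamma memory (eq. 4) together with the sensitivity
    recursion alpha_l = d x_l / d mu (eq. 12)."""
    x = np.zeros(K + 1)
    alpha = np.zeros(K + 1)     # alpha_0 = 0 since x_0 = u does not depend on mu
    for t in range(len(u)):
        x_prev, a_prev = x.copy(), alpha.copy()
        x[0] = u[t]
        for l in range(1, K + 1):
            x[l] = (1 - mu) * x_prev[l] + mu * x_prev[l - 1]
            alpha[l] = ((1 - mu) * a_prev[l] + mu * a_prev[l - 1]
                        + x_prev[l - 1] - x_prev[l])        # eq. (12)
    return x, alpha

u = np.sin(0.5 * np.arange(30))
mu, eps = 0.6, 1e-6
x, alpha = gamma_taps_and_alphas(u, K=3, mu=mu)
x_hi, _ = gamma_taps_and_alphas(u, K=3, mu=mu + eps)
x_lo, _ = gamma_taps_and_alphas(u, K=3, mu=mu - eps)
numeric = (x_hi - x_lo) / (2 * eps)           # central-difference check
print(np.allclose(alpha, numeric, atol=1e-5))  # True
```

With α_l available, ∂y/∂µ of equation (11) is just the weighted sum Σ_l w_l α_l, which is what the BP update for µ uses.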
4 Comparative Studies

To evaluate and compare the two modeling and prediction methods that have been examined, four criteria are considered [9, 4]:

Root Mean Squared Error (RMSE)
Normalized Mean Squared Error (NMSE)
Mean Absolute Error (MAE)
Mean Bias Error (MBE)

The criteria are defined as:

RMSE = sqrt( (1/N) Σ_i (x(i) - x̂(i))² )    (15)

where x is the measured and x̂ the predicted value;

NMSE = Σ_i (x(i) - x̂(i))² / Σ_i (x(i) - x̄)²    (16)

where x̄ is the mean of the measured data; and

MAE = (1/N) Σ_i |x(i) - x̂(i)|    (17)
MBE = (1/N) Σ_i (x(i) - x̂(i))    (18)

Tables 2 and 3 illustrate the comparison results of the methods, reporting the RMSE, NMSE, MAE and MBE of the three predictors (MLP1, MLP2 and the gamma network).

Table 2: Comparison of the three methods for prediction of CO.

Table 3: Comparison of the three methods for prediction of PM-10.

5 Conclusion

This paper demonstrated the ability of gamma neural networks to model and predict fast-dynamic data, such as air pollution, in comparison with MLPs without memory. According to Tables 2 and 3, the proposed gamma neural networks with memory weights demonstrated better performance in the modeling and prediction of this complex system. Also, in the small-scale regions (where the data values are small) the MLP predictor works better.

References

[1] S. Celebi and J. C. Principe, "Analysis of Spectral Feature Extraction Using the Gamma Filter", University of Florida, (1994).
[2] T. J. Cholewo, J. M. Zurada and A. Choc, "Exact Gradient Calculation in Gamma Neural Networks", in Proceedings of the 1997 International Symposium on Nonlinear Theory and its Applications, Honolulu, Hawaii, USA, Nov. (1997).
[3] A. B. Chelani, D. G. Gajghate and M. Z. Hasan, "Prediction of Ambient PM10 and Toxic Metals Using Artificial Neural Networks", Technical Paper, Journal of the Air & Waste Management Association, Vol. 52, pp. 805-810, (July 2002).
[4] B. de Vries and J. C. Principe, "The Gamma Model: A New Neural Model for Temporal Processing", Neural Networks, Vol. 5, No. 4, pp. 565-576, (1992).
[5] G. Dorffner, "Neural Networks for Time Series Processing", Neural Network World, (1996).
[6] S. Haykin, Neural Networks: A Comprehensive Foundation, McMaster University, Hamilton, Ontario, Canada; Prentice Hall International, (1999).
[7] M. B. Menhaj, Computational Intelligence, Polytechnic University Publisher, Vol. 1.
[8] P. Mlakar and M. Boznar, "Perceptron Neural Network-Based Model Predicts Air Pollution", IEEE World Congress, (1997).
[9] J. C. Principe, J. Kuo and S. Celebi, "An Analysis of the Gamma Memory in Dynamic Neural Networks", Brief Paper, IEEE Transactions on Neural Networks, Vol. 5, No. 2, (March 1994).
[10] J. C. Principe, B. de Vries and P. G. de Oliveira, "The gamma filter: a new class of adaptive IIR filters with restricted feedback", IEEE Transactions on Signal Processing, Vol. 41, No. 2, (1993).
More informationOperating conditions of a mine fan under conditions of variable resistance
Paper No. 11 ISMS 216 Operatng condtons of a mne fan under condtons of varable resstance Zhang Ynghua a, Chen L a, b, Huang Zhan a, *, Gao Yukun a a State Key Laboratory of Hgh-Effcent Mnng and Safety
More informationDUE: WEDS FEB 21ST 2018
HOMEWORK # 1: FINITE DIFFERENCES IN ONE DIMENSION DUE: WEDS FEB 21ST 2018 1. Theory Beam bendng s a classcal engneerng analyss. The tradtonal soluton technque makes smplfyng assumptons such as a constant
More informationmodeling of equilibrium and dynamic multi-component adsorption in a two-layered fixed bed for purification of hydrogen from methane reforming products
modelng of equlbrum and dynamc mult-component adsorpton n a two-layered fxed bed for purfcaton of hydrogen from methane reformng products Mohammad A. Ebrahm, Mahmood R. G. Arsalan, Shohreh Fatem * Laboratory
More informationLecture 13 APPROXIMATION OF SECOMD ORDER DERIVATIVES
COMPUTATIONAL FLUID DYNAMICS: FDM: Appromaton of Second Order Dervatves Lecture APPROXIMATION OF SECOMD ORDER DERIVATIVES. APPROXIMATION OF SECOND ORDER DERIVATIVES Second order dervatves appear n dffusve
More informationSystem Identification for Quad-rotor Parameters Using Neural Network
EVERGREE Jont Journal of ovel Carbon Resource Scences & Green Asa Strategy, Volume 3, Issue, pp. 6-, March 6 System Identfcaton for Quad-rotor Parameters Usng eural etwor Tare. Def,*, Shgeo Yoshda Graduate
More informationHigh resolution entropy stable scheme for shallow water equations
Internatonal Symposum on Computers & Informatcs (ISCI 05) Hgh resoluton entropy stable scheme for shallow water equatons Xaohan Cheng,a, Yufeng Ne,b, Department of Appled Mathematcs, Northwestern Polytechncal
More informationCONTRAST ENHANCEMENT FOR MIMIMUM MEAN BRIGHTNESS ERROR FROM HISTOGRAM PARTITIONING INTRODUCTION
CONTRAST ENHANCEMENT FOR MIMIMUM MEAN BRIGHTNESS ERROR FROM HISTOGRAM PARTITIONING N. Phanthuna 1,2, F. Cheevasuvt 2 and S. Chtwong 2 1 Department of Electrcal Engneerng, Faculty of Engneerng Rajamangala
More information3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X
Statstcs 1: Probablty Theory II 37 3 EPECTATION OF SEVERAL RANDOM VARIABLES As n Probablty Theory I, the nterest n most stuatons les not on the actual dstrbuton of a random vector, but rather on a number
More informationDifference Equations
Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1
More informationAn identification algorithm of model kinetic parameters of the interfacial layer growth in fiber composites
IOP Conference Seres: Materals Scence and Engneerng PAPER OPE ACCESS An dentfcaton algorthm of model knetc parameters of the nterfacal layer growth n fber compostes o cte ths artcle: V Zubov et al 216
More informationNegative Binomial Regression
STATGRAPHICS Rev. 9/16/2013 Negatve Bnomal Regresson Summary... 1 Data Input... 3 Statstcal Model... 3 Analyss Summary... 4 Analyss Optons... 7 Plot of Ftted Model... 8 Observed Versus Predcted... 10 Predctons...
More informationWhy feed-forward networks are in a bad shape
Why feed-forward networks are n a bad shape Patrck van der Smagt, Gerd Hrznger Insttute of Robotcs and System Dynamcs German Aerospace Center (DLR Oberpfaffenhofen) 82230 Wesslng, GERMANY emal smagt@dlr.de
More informationLecture 6: Introduction to Linear Regression
Lecture 6: Introducton to Lnear Regresson An Manchakul amancha@jhsph.edu 24 Aprl 27 Lnear regresson: man dea Lnear regresson can be used to study an outcome as a lnear functon of a predctor Example: 6
More informationApplication of B-Spline to Numerical Solution of a System of Singularly Perturbed Problems
Mathematca Aeterna, Vol. 1, 011, no. 06, 405 415 Applcaton of B-Splne to Numercal Soluton of a System of Sngularly Perturbed Problems Yogesh Gupta Department of Mathematcs Unted College of Engneerng &
More informationExperience with Automatic Generation Control (AGC) Dynamic Simulation in PSS E
Semens Industry, Inc. Power Technology Issue 113 Experence wth Automatc Generaton Control (AGC) Dynamc Smulaton n PSS E Lu Wang, Ph.D. Staff Software Engneer lu_wang@semens.com Dngguo Chen, Ph.D. Staff
More informationThe Order Relation and Trace Inequalities for. Hermitian Operators
Internatonal Mathematcal Forum, Vol 3, 08, no, 507-57 HIKARI Ltd, wwwm-hkarcom https://doorg/0988/mf088055 The Order Relaton and Trace Inequaltes for Hermtan Operators Y Huang School of Informaton Scence
More informationPattern Classification
Pattern Classfcaton All materals n these sldes ere taken from Pattern Classfcaton (nd ed) by R. O. Duda, P. E. Hart and D. G. Stork, John Wley & Sons, 000 th the permsson of the authors and the publsher
More informationPsychology 282 Lecture #24 Outline Regression Diagnostics: Outliers
Psychology 282 Lecture #24 Outlne Regresson Dagnostcs: Outlers In an earler lecture we studed the statstcal assumptons underlyng the regresson model, ncludng the followng ponts: Formal statement of assumptons.
More informationLectures - Week 4 Matrix norms, Conditioning, Vector Spaces, Linear Independence, Spanning sets and Basis, Null space and Range of a Matrix
Lectures - Week 4 Matrx norms, Condtonng, Vector Spaces, Lnear Independence, Spannng sets and Bass, Null space and Range of a Matrx Matrx Norms Now we turn to assocatng a number to each matrx. We could
More informationAppendix B: Resampling Algorithms
407 Appendx B: Resamplng Algorthms A common problem of all partcle flters s the degeneracy of weghts, whch conssts of the unbounded ncrease of the varance of the mportance weghts ω [ ] of the partcles
More informationUsing T.O.M to Estimate Parameter of distributions that have not Single Exponential Family
IOSR Journal of Mathematcs IOSR-JM) ISSN: 2278-5728. Volume 3, Issue 3 Sep-Oct. 202), PP 44-48 www.osrjournals.org Usng T.O.M to Estmate Parameter of dstrbutons that have not Sngle Exponental Famly Jubran
More informationMultilayer neural networks
Lecture Multlayer neural networks Mlos Hauskrecht mlos@cs.ptt.edu 5329 Sennott Square Mdterm exam Mdterm Monday, March 2, 205 In-class (75 mnutes) closed book materal covered by February 25, 205 Multlayer
More informationOutline. Communication. Bellman Ford Algorithm. Bellman Ford Example. Bellman Ford Shortest Path [1]
DYNAMIC SHORTEST PATH SEARCH AND SYNCHRONIZED TASK SWITCHING Jay Wagenpfel, Adran Trachte 2 Outlne Shortest Communcaton Path Searchng Bellmann Ford algorthm Algorthm for dynamc case Modfcatons to our algorthm
More informationA New Refinement of Jacobi Method for Solution of Linear System Equations AX=b
Int J Contemp Math Scences, Vol 3, 28, no 17, 819-827 A New Refnement of Jacob Method for Soluton of Lnear System Equatons AX=b F Naem Dafchah Department of Mathematcs, Faculty of Scences Unversty of Gulan,
More informationLinear Feature Engineering 11
Lnear Feature Engneerng 11 2 Least-Squares 2.1 Smple least-squares Consder the followng dataset. We have a bunch of nputs x and correspondng outputs y. The partcular values n ths dataset are x y 0.23 0.19
More informationMinimizing Output Error in Multi-Layer Perceptrons. Jonathan P. Bernick. Department of Computer Science. Coastal Carolina University
Mnmzng Output Error n Mult-Layer Perceptrons Jonathan P. Bernck Department of Computer Scence Coastal Carolna Unversty I. Abstract It s well-establshed that a mult-layer perceptron (MLP) wth a sngle hdden
More informationADVANCED MACHINE LEARNING ADVANCED MACHINE LEARNING
1 ADVANCED ACHINE LEARNING ADVANCED ACHINE LEARNING Non-lnear regresson technques 2 ADVANCED ACHINE LEARNING Regresson: Prncple N ap N-dm. nput x to a contnuous output y. Learn a functon of the type: N
More informationCollege of Computer & Information Science Fall 2009 Northeastern University 20 October 2009
College of Computer & Informaton Scence Fall 2009 Northeastern Unversty 20 October 2009 CS7880: Algorthmc Power Tools Scrbe: Jan Wen and Laura Poplawsk Lecture Outlne: Prmal-dual schema Network Desgn:
More informationWeek 5: Neural Networks
Week 5: Neural Networks Instructor: Sergey Levne Neural Networks Summary In the prevous lecture, we saw how we can construct neural networks by extendng logstc regresson. Neural networks consst of multple
More information