Enhancing Performance of MLP/RBF Neural Classifiers via a Multivariate Data Distribution Scheme

Halis Altun, Gökhan Gelen
Nigde University, Electrical and Electronics Engineering Department, Nigde, Turkey
haltun@nigde.edu.tr, ggelen@nigde.edu.tr

Abstract

In this study the performance of two neural classifiers, the Multi Layer Perceptron (MLP) and the Radial Basis Function (RBF) network, is compared on a multivariate classification problem. MLP and RBF are two of the most widely used neural network architectures for classification in the literature and have been employed successfully in a variety of applications. A nonlinear scaling scheme for multivariate data, applied prior to the training process, is proposed in order to improve the performance of both neural classifiers. The proposed scheme modifies gaussian multivariate data to produce uniformly distributed multivariate data. It is shown that the proposed scaling scheme increases the performance of the neural classifiers.

1. Introduction

In recent years a great deal of attention has been paid to neural classifiers in the field of pattern recognition. This success essentially comes from the fact that neural networks can be implemented as nonlinear discriminant functions and universal approximators. MLP and RBF are two of the most widely used neural network architectures for classification and regression problems [1-9]. The performance of MLP/RBF neural classifiers has been studied in the literature. In [6-7] a hybrid MLP/RBF neural classifier is proposed which produces far better classification and regression results when compared with advanced MLP or RBF architectures. A normalized RBF net, obtained by dividing each radial function by the sum of all radial functions, is applied to a nonparametric classification problem in [5]. In the study of Reyneri and Sgarbi [4], a neuro-fuzzy unification algorithm which mixes MLP and RBF has been tested as a pattern classifier. It has been shown that the proposed algorithm reaches a performance comparable to or better than other traditional neural algorithms and can be trained much faster.
Raudys has studied the generalization of neural classifiers in terms of the relation between the performance and the complexity of the classifier [8]. It has been shown that neural learning is highly dependent on the presentation of the training data [10]. This dependency will be exploited in order to achieve higher performance in a classification problem. The aim of this study is to apply a nonlinear scaling to gaussian-distributed multivariate input data so as to produce more uniformly distributed training data. It will be shown that appropriate distribution characteristics result in improved classification performance.

2. Neural Network Classifiers

Multilayer perceptron (MLP) and Radial Basis Function (RBF) networks have become the most widely used network architectures in signal processing and pattern recognition. Both types of neural network structure perform well on pattern classification problems. They are robust classifiers with the ability to generalize from imprecise input data. The general difference between RBF and MLP is that the RBF performs a localist type of learning, responsive only to a limited section of the input space, whereas the MLP is a more distributed approach. The output of an MLP is produced by linear combinations of the outputs of the hidden layer nodes, in which every neuron maps a weighted average of the inputs through a sigmoid function. In a one-hidden-layer RBF network, the hidden nodes map distances between input vectors and center vectors to outputs through a nonlinear kernel or radial function.

2.1 Multi Layer Perceptron (MLP) Neural Network

MLP networks consist of an input layer, one or more hidden layers and an output layer. Each layer has a number of processing units, and each unit is fully interconnected with weighted connections to the units in the subsequent layer. The MLP transforms n inputs to l outputs through some nonlinear functions. The output of the network is determined by the activation of the units in the output layer as follows:

$$x_o = f\Big(\sum_h x_h w_{ho}\Big) \qquad (1)$$

where $f(\cdot)$ is the activation function, $x_h$ is the activation of the $h$-th hidden layer node and $w_{ho}$ is the interconnection between the $h$-th hidden layer node and the $o$-th output layer node. The most commonly used activation function is the sigmoid, given as

$$x_o = \frac{1}{1 + \exp\big(-\sum_h x_h w_{ho}\big)} \qquad (2)$$

The activation level of the nodes in the hidden layer is determined in a similar fashion. Based on the differences between the calculated output and the target value, an error is defined as

$$E = \frac{1}{2}\sum_{s=1}^{N}\sum_{o=1}^{L}\big(t_o^{(s)} - x_o^{(s)}\big)^2 \qquad (3)$$

where N is the number of patterns in the data set and L is the number of output nodes. The aim is to reduce the error by adjusting the interconnections between the layers. The weights are adjusted using the gradient-descent Backpropagation (BP) algorithm. The algorithm requires training data consisting of a set of corresponding input and target pattern values t. During the training process, the MLP starts with a random set of initial weights, and training continues until the sets of $w_{ih}$ and $w_{ho}$ are optimized so that a predefined error threshold is met between $x_o$ and $t_o$. According to the BP algorithm, each interconnection between the nodes is adjusted by the weight update values

$$\Delta w_{ho} = -\eta \frac{\partial E}{\partial w_{ho}} = \eta \delta_o x_h \qquad (4)$$

$$\Delta w_{ih} = -\eta \frac{\partial E}{\partial w_{ih}} = \eta \delta_h x_i \qquad (5)$$

where E is the error cost function given in (3), $\delta_o = x_o'(t_o - x_o)$ and $\delta_h = x_h' \sum_o \delta_o w_{ho}$, with $x_o' = x_o(1 - x_o)$ and $x_h' = x_h(1 - x_h)$ when a sigmoid activation function is used.

2.2 Radial Basis Function (RBF) Neural Network

The structure of an RBF neural network is similar to that of the MLP. It consists of layers of neurons. The main distinction is that the RBF has a hidden layer which contains nodes called RBF units. Each of the RBF units in the hidden layer has two parameters, the center and the width of a gaussian basis function, which determine the output of the unit. The gaussian basis function has the following form:

$$\varphi_j(x) = \exp\left(-\frac{\lVert x - \mu_j \rVert^2}{2\sigma_j^2}\right) \qquad (6)$$

where $x$ and $\mu_j$ are the input and the center of the unit, respectively, and $\sigma_j$ is the spread of the gaussian basis function. The output of the $i$-th neuron in the output layer of the RBF is determined by the linear combination of the outputs of the units in the hidden layer:

$$y_i(x) = \sum_{j=1}^{M} w_{ij}\,\varphi_j(x) + b_i \qquad (7)$$

where $w_{ij}$ are the weights between the RBF units and the output node and $b_i$ is the bias term. The weights are optimized using the least mean square (LMS) algorithm once the centers of the RBF units are determined. The centers can be chosen randomly or using clustering algorithms.

3. Data Sets

Two classification problems are considered in the experiments. In the first experiment, 2-dimensional and 3-dimensional multivariate random values are produced using the MATLAB Statistics Toolbox. The first set of problems is easily separable, as seen in Figure 1 for the 2-D classification problem. The data have been chosen so that each independent variable x1, x2 (and x3 for 3-D) has a gaussian distribution, as seen in Figures 2a-b. In the second experiment a more difficult classification problem is constructed for both the 2-D and 3-D cases by choosing the mean values of the independent variables close to each other. Figure 3 shows the (x1-x2) plane for the 2-D data, and Figure 4a the distribution of the data in (x1-x2-x3) space for the 3-D classification problem. As seen from Figure 3, the data form highly inseparable classes. A similar observation can be made for the data distributions on the (x1, x2) and (x2, x3) planes, which are given in Figures 4b-c. For the 2-D classification problem, two sets of data with identical statistical characteristics are produced randomly. Each set has 200 samples, and each class (CLASS I and CLASS II) in the data sets is represented equally with 100 samples. The first set is labeled SET T (T stands for training) and is used to train the neural classifiers. The second set is kept for testing purposes and is labeled SET TS (TS stands for testing).
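As an illustrative aside, the forward computations and the weight update of Section 2, eqs. (1)-(7), can be sketched in Python. This is not the authors' code (the paper uses MATLAB): the network sizes, learning rate and weight initialization below are arbitrary choices for demonstration only.

```python
import numpy as np


def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))


# ---- MLP forward pass, eqs. (1)-(2): one hidden layer, sigmoid units ----
def mlp_forward(x, W_ih, W_ho):
    x_h = sigmoid(W_ih @ x)   # hidden activations
    x_o = sigmoid(W_ho @ x_h)  # output activations, eq. (2)
    return x_h, x_o


# ---- One gradient-descent (BP) update, eqs. (4)-(5) ----
def mlp_update(x, t, W_ih, W_ho, eta=0.5):
    x_h, x_o = mlp_forward(x, W_ih, W_ho)
    delta_o = x_o * (1 - x_o) * (t - x_o)           # output deltas
    delta_h = x_h * (1 - x_h) * (W_ho.T @ delta_o)  # hidden deltas
    W_ho += eta * np.outer(delta_o, x_h)            # eq. (4)
    W_ih += eta * np.outer(delta_h, x)              # eq. (5)
    return W_ih, W_ho


# ---- RBF forward pass, eqs. (6)-(7): gaussian units, linear output ----
def rbf_forward(x, centers, sigma, W, b):
    # eq. (6): squared distance of x to every center, through a gaussian
    phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2.0 * sigma ** 2))
    return W @ phi + b  # eq. (7): linear combination plus bias
```

Repeated calls to `mlp_update` on a single input/target pair drive the squared error of eq. (3) down, which is all the BP derivation above asserts.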

Figure 1. The easily separable classification problem on the (x1-x2) plane (Class I, Class II).

Figure 2a. Distribution (histogram) of x1 for the easily separable classification problem.

Figure 2b. Distribution (histogram) of x2 for the easily separable classification problem.

Figure 3. A highly inseparable classification problem on the (x1-x2) plane (Class I, Class II).

Figure 4a. Distribution of the data in (x1, x2, x3) space.

Figure 4b. Distribution of the data on the (x1, x2) plane.

Figure 4c. Distribution of the data on the (x2, x3) plane.

For the 3-D classification problems as well, data are produced as explained above. All data sets are subjected to the scaling procedure explained in Appendix II. The scaling is performed so that it modifies the distribution characteristics of the independent variables: x1 and x2 for the 2-D problems, and x1, x2 and x3 for the 3-D problem. The resulting distribution has a uniform characteristic, as illustrated in Figure A1 in the Appendix section.
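The data generation protocol of Section 3 can be sketched as follows. The class means and the spread below are illustrative placeholders, not the paper's actual values (those are given only in the garbled Table A1); only the protocol itself, two gaussian classes, 200 samples per set with 100 per class, and separate training (SET T) and test (SET TS) sets, follows the text.

```python
import numpy as np

def make_set(mean1, mean2, n_per_class=100, sigma=1.0, seed=0):
    """Draw one data set: two gaussian classes, equally represented."""
    rng = np.random.default_rng(seed)
    x1 = rng.normal(mean1, sigma, size=(n_per_class, len(mean1)))  # CLASS I
    x2 = rng.normal(mean2, sigma, size=(n_per_class, len(mean2)))  # CLASS II
    X = np.vstack([x1, x2])
    y = np.array([0] * n_per_class + [1] * n_per_class)
    return X, y

# Easily separable case: class means far apart (hypothetical values).
# Different seeds give two sets with identical statistical characteristics.
X_t, y_t = make_set(mean1=[0.0, 0.0], mean2=[4.0, 4.0], seed=1)   # SET T
X_ts, y_ts = make_set(mean1=[0.0, 0.0], mean2=[4.0, 4.0], seed=2)  # SET TS
```

Moving `mean2` close to `mean1` reproduces the "highly inseparable" second problem, and a third coordinate in the mean vectors gives the 3-D case.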

4. Experiments

Suitable MLP and RBF structures are constructed to solve the 2-D and 3-D classification problems. In the first experiment, the gaussian-distributed data are used to train the MLP and RBF neural classifiers. In the second experiment, the neural classifiers are trained using the modified data. For comparison purposes, all parameters (i.e. initial weights, learning rate, network structure, etc.) are kept constant for the MLP neural classifier. In the same way, the RBF parameters are chosen identical in order to evaluate the results fairly. The MLP and RBF neural classifiers are created as explained in Appendix I.

Table 1 shows the performance of the neural classifiers in the first experiment. As the classes are easily separable, the MLP and RBF reach high scores for the training and test data. The MLP correctly classifies 94 data out of 100 for Class I.

Table 1. The simulation results of the neural classifiers for the first phase of the experiments (easily separable classification problem).

Only 12 data are misclassified by the MLP for Class II. The total score of the MLP for the 2-D classification problem reaches 94% for the training data and 91% for the test data. For the 3-D problem the scores are 86% and 85%, respectively. The performance of the RBF is found to be slightly inferior to that of the MLP for the 2-D classification problem, at 93% and 89% for training and test data, respectively. However, a great degradation is found in the solution of the 3-D classification problem: the recorded scores are 97% for the training and 62% for the test data.

An experiment with the redistributed data is also carried out to delineate the effect of the data distribution on the performance of the neural classifiers. For a fair comparison, all parameters and structures of the neural classifiers are kept intact. Table 2 summarizes the findings. As seen, a great increase in the performance of the MLP is obtained for the 2-D problem: the MLP classifies all test patterns correctly. For the 3-D problem, the performance remains nearly the same, at 84%. On the other hand, the performance of the RBF improves slightly, from 89% to 90%, for the test patterns of the 2-D classification problem. As for the 3-D classification problem, a great increase is experienced, from 62% to 74%.

In the second phase of the experiments, the MLP and RBF neural classifiers are employed with the same structures and initial conditions to solve the more complex classification problems. The results are given in Table 3 and Table 4 for the original and modified data sets, respectively. As expected, the performance of both classifiers is greatly reduced due to the complexity of the classification problem at hand.

Table 2. The simulation results of the neural classifiers for the first phase of the experiments with modified input data (easily separable classification problem).

Table 3. The simulation results of the neural classifiers for the second phase of the experiments (complex classification problem).

When the modified data set is used to train the neural classifiers, an enhancement in the classification performance of the MLP and RBF is witnessed. An increase of 25 percentage points, from 47% to 72%, is encountered in

the performance of the MLP neural classifier for the 2-D problem. The performance of the MLP is also improved, from 49% to 53%, in solving the 3-D problem. As for the 2-D classification problem, the RBF shows a higher accuracy, with an increase from 59% to 63%, while a slight decrease takes place in the performance of the RBF for the 3-D problem, from 67% to 65%.

Table 4. The simulation results of the neural classifiers for the second phase of the experiments with modified input data (complex classification problem).

5. Conclusion

A nonlinear scaling has been proposed to improve the performance of MLP and RBF neural classifiers. 2-D and 3-D classification problems have been solved, and the results have shown the validity of the proposed nonlinear scaling scheme. Better classification performance is obtained in most of the cases. As an extreme case, an increase from 47% to 72% is obtained in the performance of the MLP neural classifier for the 2-D problem. The proposed scaling takes only the mean value of the data into account to produce uniformly distributed data. This is believed to result in a negligible inconsistency in the improvement of the performance. Hence, the nonlinear scaling scheme may be further enhanced to augment the uniformity of the distribution.

6. Appendix I: Network Structures

The neural networks are constructed and trained in the MATLAB 6.5 environment using the Neural Network Toolbox. The MLP neural networks are created using the newff built-in function and have 2 nodes (for the 2-D classification problem) or 3 nodes (for the 3-D classification problem) in the input layer, 10 nodes in the hidden layer and 2 nodes, corresponding to the classes, in the output layer. Logsig (the sigmoid function) is used as the activation function for all neurons. Training is carried out by the backpropagation algorithm with momentum, using the built-in function traingdm.

The RBF neural classifier is created using the MATLAB built-in function newrb, with 2 or 3 nodes in the input layer, 15 RBF units in the hidden layer and 2 nodes in the output layer.

Appendix II: Modified Distribution of the Data Set: Nonlinear Scaling

The distribution of the independent variables x1, x2 and x3 is selected to be gaussian, as seen from the figures. A nonlinear scaling which modifies the gaussian distribution into a uniform one is performed to seek an enhanced performance of the neural classifiers. To this end, the following function, which maps $X_i$ to $X_0$, is proposed, with m being the mean of the data, given in Table A1 for each set of data produced:

$$X_0 = \frac{2}{1 + e^{-2(X_i - m)}} - 1 \qquad \mathrm{(A.1)}$$

where $X_i$ is the data being modified, $X_0$ the modified data, and m the mean value of the data. As an example, Figure A1 illustrates the modified data for the 2-D classification problem given in Figure 1.

Table A1. The mean value m of the independent variables x1, x2 and x3 (both for training and test data) for both the easily separable (ES) and more complex (MC) problems.
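A minimal sketch of the scaling of Appendix II. The closed form of eq. (A.1) is reconstructed here from the garbled source as $X_0 = 2/(1+e^{-2(X_i-m)}) - 1$, which is algebraically $\tanh(X_i - m)$, so the exact constants are an assumption; the mapping into (-1, 1) around the data mean is what the appendix describes.

```python
import numpy as np

def nonlinear_scale(x, m=None):
    """Eq. (A.1) as reconstructed: map data x, with mean m, into (-1, 1).

    Algebraically 2 / (1 + exp(-2*(x - m))) - 1 == tanh(x - m), so values
    concentrated near the mean are spread out toward the interval ends,
    flattening a gaussian toward a more uniform shape.
    """
    x = np.asarray(x, dtype=float)
    if m is None:
        m = x.mean()  # the mean value of the data (cf. Table A1)
    return 2.0 / (1.0 + np.exp(-2.0 * (x - m))) - 1.0
```

Applying `nonlinear_scale` to each independent variable (x1, x2 and, for 3-D, x3) of SET T and SET TS reproduces the "modified data" used in the second set of experiments.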

Figure A1. The scaled version of the 2-D data shown in Figure 1.

References

[1] W. Loh, L. Tim, "A Comparison of Prediction Accuracy, Complexity, and Training Time of Thirty Three Old and New Classification Algorithms," Machine Learning, 40(3), pp.
[2] A. Ribert, A. Ennaji, Y. Lecourtier, "Generalisation Capabilities of a Distributed Neural Classifier," ESANN, European Symposium on Artificial Neural Networks, Bruges (Belgium), 1999, pp.
[3] Kenneth J. McGarry, Stefan Wermter, and John MacIntyre, "Knowledge Extraction from Radial Basis Function Networks and Multi Layer Perceptrons," International Journal of Computational Intelligence and Applications, 1(3), 2001, pp.
[4] L.M. Reyneri, M. Sgarbi, "Performance of Weighted Radial Basis Function Classifiers," in Proc. of ESANN'97, European Symposium on Artificial Neural Networks, Bruges (B), April 1997, pp.
[5] B. Kegl, A. Krzyzak and H. Niemann, "Radial basis function networks in nonparametric classification and function learning," ICPR'98: International Conference on Pattern Recognition, Brisbane, Australia, August 1998, pp.
[6] S. Cohen and N. Intrator, "Automatic model selection in a hybrid perceptron/radial network," Information Fusion: Special issue on multiple experts, 3(4), 2002, pp.
[7] Shimon Cohen, Nathan Intrator, "A Study of Ensemble of Hybrid Networks with Strong Regularization," Multiple Classifier Systems, 2003, pp.
[8] Sarunas Raudys, "Generalization of classifiers," The 2nd International Conference on Neural Networks and Artificial Intelligence, ICNNAI'01, 2001, Belarus.
[9] L. Ramirez, W. Pedrycz, N. Pizzi, "Severe storm cell classification using support vector machines and radial basis function approaches," Proc. IEEE Canadian Conf. Elec. Comp. Eng., Toronto, Canada, May 13-16, 2001, pp.
[10] H. Altun and K. M. Curtis, "Exploiting the Statistical Characteristics of the Speech Signals for Improved Neural Learning in MLP Neural Networks," IEEE Workshop on Neural Networks for Signal Processing, NNSP'98, Cambridge, UK, August 1998, pp.


More information

The Kullback-Leibler Kernel as a Framework for Discriminant and Localized Representations for Visual Recognition

The Kullback-Leibler Kernel as a Framework for Discriminant and Localized Representations for Visual Recognition The Kullback-Leibler Kernel as a Framewrk fr Discriminant and Lcalized Representatins fr Visual Recgnitin Nun Vascncels Purdy H Pedr Mren ECE Department University f Califrnia, San Dieg HP Labs Cambridge

More information

Drought damaged area

Drought damaged area ESTIMATE OF THE AMOUNT OF GRAVEL CO~TENT IN THE SOIL BY A I R B O'RN EMS S D A T A Y. GOMI, H. YAMAMOTO, AND S. SATO ASIA AIR SURVEY CO., l d. KANAGAWA,JAPAN S.ISHIGURO HOKKAIDO TOKACHI UBPREFECTRAl OffICE

More information

The Research on Flux Linkage Characteristic Based on BP and RBF Neural Network for Switched Reluctance Motor

The Research on Flux Linkage Characteristic Based on BP and RBF Neural Network for Switched Reluctance Motor Prgress In Electrmagnetics Research M, Vl. 35, 5 6, 24 The Research n Flux Linkage Characteristic Based n BP and RBF Neural Netwrk fr Switched Reluctance Mtr Yan Cai, *, Siyuan Sun, Chenhui Wang, and Cha

More information

Lecture 2: Supervised vs. unsupervised learning, bias-variance tradeoff

Lecture 2: Supervised vs. unsupervised learning, bias-variance tradeoff Lecture 2: Supervised vs. unsupervised learning, bias-variance tradeff Reading: Chapter 2 STATS 202: Data mining and analysis September 27, 2017 1 / 20 Supervised vs. unsupervised learning In unsupervised

More information

MODULAR DECOMPOSITION OF THE NOR-TSUM MULTIPLE-VALUED PLA

MODULAR DECOMPOSITION OF THE NOR-TSUM MULTIPLE-VALUED PLA MODUAR DECOMPOSITION OF THE NOR-TSUM MUTIPE-AUED PA T. KAGANOA, N. IPNITSKAYA, G. HOOWINSKI k Belarusian State University f Infrmatics and Radielectrnics, abratry f Image Prcessing and Pattern Recgnitin.

More information

Midwest Big Data Summer School: Machine Learning I: Introduction. Kris De Brabanter

Midwest Big Data Summer School: Machine Learning I: Introduction. Kris De Brabanter Midwest Big Data Summer Schl: Machine Learning I: Intrductin Kris De Brabanter kbrabant@iastate.edu Iwa State University Department f Statistics Department f Cmputer Science June 24, 2016 1/24 Outline

More information

COMP 551 Applied Machine Learning Lecture 4: Linear classification

COMP 551 Applied Machine Learning Lecture 4: Linear classification COMP 551 Applied Machine Learning Lecture 4: Linear classificatin Instructr: Jelle Pineau (jpineau@cs.mcgill.ca) Class web page: www.cs.mcgill.ca/~jpineau/cmp551 Unless therwise nted, all material psted

More information

Support Vector Machines and Flexible Discriminants

Support Vector Machines and Flexible Discriminants 12 Supprt Vectr Machines and Flexible Discriminants This is page 417 Printer: Opaque this 12.1 Intrductin In this chapter we describe generalizatins f linear decisin bundaries fr classificatin. Optimal

More information

CHAPTER 3 INEQUALITIES. Copyright -The Institute of Chartered Accountants of India

CHAPTER 3 INEQUALITIES. Copyright -The Institute of Chartered Accountants of India CHAPTER 3 INEQUALITIES Cpyright -The Institute f Chartered Accuntants f India INEQUALITIES LEARNING OBJECTIVES One f the widely used decisin making prblems, nwadays, is t decide n the ptimal mix f scarce

More information

a(k) received through m channels of length N and coefficients v(k) is an additive independent white Gaussian noise with

a(k) received through m channels of length N and coefficients v(k) is an additive independent white Gaussian noise with urst Mde Nn-Causal Decisin-Feedback Equalizer based n Sft Decisins Elisabeth de Carvalh and Dirk T.M. Slck Institut EURECOM, 2229 rute des Crêtes,.P. 93, 694 Sphia ntiplis Cedex, FRNCE Tel: +33 493263

More information

A Novel Stochastic-Based Algorithm for Terrain Splitting Optimization Problem

A Novel Stochastic-Based Algorithm for Terrain Splitting Optimization Problem A Nvel Stchastic-Based Algrithm fr Terrain Splitting Optimizatin Prblem Le Hang Sn Nguyen inh Ha Abstract This paper deals with the prblem f displaying large igital Elevatin Mdel data in 3 GIS urrent appraches

More information

APPLICATION OF THE BRATSETH SCHEME FOR HIGH LATITUDE INTERMITTENT DATA ASSIMILATION USING THE PSU/NCAR MM5 MESOSCALE MODEL

APPLICATION OF THE BRATSETH SCHEME FOR HIGH LATITUDE INTERMITTENT DATA ASSIMILATION USING THE PSU/NCAR MM5 MESOSCALE MODEL JP2.11 APPLICATION OF THE BRATSETH SCHEME FOR HIGH LATITUDE INTERMITTENT DATA ASSIMILATION USING THE PSU/NCAR MM5 MESOSCALE MODEL Xingang Fan * and Jeffrey S. Tilley University f Alaska Fairbanks, Fairbanks,

More information

Pipetting 101 Developed by BSU CityLab

Pipetting 101 Developed by BSU CityLab Discver the Micrbes Within: The Wlbachia Prject Pipetting 101 Develped by BSU CityLab Clr Cmparisns Pipetting Exercise #1 STUDENT OBJECTIVES Students will be able t: Chse the crrect size micrpipette fr

More information

Determining the Accuracy of Modal Parameter Estimation Methods

Determining the Accuracy of Modal Parameter Estimation Methods Determining the Accuracy f Mdal Parameter Estimatin Methds by Michael Lee Ph.D., P.E. & Mar Richardsn Ph.D. Structural Measurement Systems Milpitas, CA Abstract The mst cmmn type f mdal testing system

More information

Differentiation Applications 1: Related Rates

Differentiation Applications 1: Related Rates Differentiatin Applicatins 1: Related Rates 151 Differentiatin Applicatins 1: Related Rates Mdel 1: Sliding Ladder 10 ladder y 10 ladder 10 ladder A 10 ft ladder is leaning against a wall when the bttm

More information

Chapter 7. Neural Networks

Chapter 7. Neural Networks Chapter 7. Neural Netwrks Wei Pan Divisin f Bistatistics, Schl f Public Health, University f Minnesta, Minneaplis, MN 55455 Email: weip@bistat.umn.edu PubH 7475/8475 c Wei Pan Intrductin Chapter 11. nly

More information

CN700 Additive Models and Trees Chapter 9: Hastie et al. (2001)

CN700 Additive Models and Trees Chapter 9: Hastie et al. (2001) CN700 Additive Mdels and Trees Chapter 9: Hastie et al. (2001) Madhusudana Shashanka Department f Cgnitive and Neural Systems Bstn University CN700 - Additive Mdels and Trees March 02, 2004 p.1/34 Overview

More information

Pure adaptive search for finite global optimization*

Pure adaptive search for finite global optimization* Mathematical Prgramming 69 (1995) 443-448 Pure adaptive search fr finite glbal ptimizatin* Z.B. Zabinskya.*, G.R. Wd b, M.A. Steel c, W.P. Baritmpa c a Industrial Engineering Prgram, FU-20. University

More information

Learning to Control an Unstable System with Forward Modeling

Learning to Control an Unstable System with Forward Modeling 324 Jrdan and Jacbs Learning t Cntrl an Unstable System with Frward Mdeling Michael I. Jrdan Brain and Cgnitive Sciences MIT Cambridge, MA 02139 Rbert A. Jacbs Cmputer and Infrmatin Sciences University

More information

Coalition Formation and Data Envelopment Analysis

Coalition Formation and Data Envelopment Analysis Jurnal f CENTRU Cathedra Vlume 4, Issue 2, 20 26-223 JCC Jurnal f CENTRU Cathedra Calitin Frmatin and Data Envelpment Analysis Rlf Färe Oregn State University, Crvallis, OR, USA Shawna Grsspf Oregn State

More information

Training Algorithms for Recurrent Neural Networks

Training Algorithms for Recurrent Neural Networks raining Algrithms fr Recurrent Neural Netwrks SUWARIN PAAAVORAKUN UYN NGOC PIEN Cmputer Science Infrmatin anagement Prgram Asian Institute f echnlgy P.O. Bx 4, Klng Luang, Pathumthani 12120 AILAND http://www.ait.ac.th

More information

NWC SAF ENTERING A NEW PHASE

NWC SAF ENTERING A NEW PHASE NWC SAF ENTERING A NEW PHASE Pilar Fernández Agencia Estatal de Meterlgía (AEMET) Lenard Priet Castr, 8; 28040 Madrid, Spain Phne: +34 915 819 654, Fax: +34 915 819 767 E-mail: mafernandeza@aemet.es Abstract

More information

MACE For Conformation Traits

MACE For Conformation Traits MACE Fr Cnfrmatin raits L. Klei and. J. Lawlr Hlstein Assciatin USA, Inc., Brattlebr, Vermnt, USA Intrductin Multiple acrss cuntry evaluatins (MACE) fr prductin traits are nw rutinely cmputed and used

More information

Performance Bounds for Detect and Avoid Signal Sensing

Performance Bounds for Detect and Avoid Signal Sensing Perfrmance unds fr Detect and Avid Signal Sensing Sam Reisenfeld Real-ime Infrmatin etwrks, University f echnlgy, Sydney, radway, SW 007, Australia samr@uts.edu.au Abstract Detect and Avid (DAA) is a Cgnitive

More information

Distributions, spatial statistics and a Bayesian perspective

Distributions, spatial statistics and a Bayesian perspective Distributins, spatial statistics and a Bayesian perspective Dug Nychka Natinal Center fr Atmspheric Research Distributins and densities Cnditinal distributins and Bayes Thm Bivariate nrmal Spatial statistics

More information

Thermodynamics Partial Outline of Topics

Thermodynamics Partial Outline of Topics Thermdynamics Partial Outline f Tpics I. The secnd law f thermdynamics addresses the issue f spntaneity and invlves a functin called entrpy (S): If a prcess is spntaneus, then Suniverse > 0 (2 nd Law!)

More information

Eric Klein and Ning Sa

Eric Klein and Ning Sa Week 12. Statistical Appraches t Netwrks: p1 and p* Wasserman and Faust Chapter 15: Statistical Analysis f Single Relatinal Netwrks There are fur tasks in psitinal analysis: 1) Define Equivalence 2) Measure

More information

Current/voltage-mode third order quadrature oscillator employing two multiple outputs CCIIs and grounded capacitors

Current/voltage-mode third order quadrature oscillator employing two multiple outputs CCIIs and grounded capacitors Indian Jurnal f Pure & Applied Physics Vl. 49 July 20 pp. 494-498 Current/vltage-mde third rder quadrature scillatr emplying tw multiple utputs CCIIs and grunded capacitrs Jiun-Wei Hrng Department f Electrnic

More information

Biplots in Practice MICHAEL GREENACRE. Professor of Statistics at the Pompeu Fabra University. Chapter 13 Offprint

Biplots in Practice MICHAEL GREENACRE. Professor of Statistics at the Pompeu Fabra University. Chapter 13 Offprint Biplts in Practice MICHAEL GREENACRE Prfessr f Statistics at the Pmpeu Fabra University Chapter 13 Offprint CASE STUDY BIOMEDICINE Cmparing Cancer Types Accrding t Gene Epressin Arrays First published:

More information

A Regression Solution to the Problem of Criterion Score Comparability

A Regression Solution to the Problem of Criterion Score Comparability A Regressin Slutin t the Prblem f Criterin Scre Cmparability William M. Pugh Naval Health Research Center When the criterin measure in a study is the accumulatin f respnses r behavirs fr an individual

More information

A mathematical model for complete stress-strain curve prediction of permeable concrete

A mathematical model for complete stress-strain curve prediction of permeable concrete A mathematical mdel fr cmplete stress-strain curve predictin f permeable cncrete M. K. Hussin Y. Zhuge F. Bullen W. P. Lkuge Faculty f Engineering and Surveying, University f Suthern Queensland, Twmba,

More information

A Matrix Representation of Panel Data

A Matrix Representation of Panel Data web Extensin 6 Appendix 6.A A Matrix Representatin f Panel Data Panel data mdels cme in tw brad varieties, distinct intercept DGPs and errr cmpnent DGPs. his appendix presents matrix algebra representatins

More information

Technical Bulletin. Generation Interconnection Procedures. Revisions to Cluster 4, Phase 1 Study Methodology

Technical Bulletin. Generation Interconnection Procedures. Revisions to Cluster 4, Phase 1 Study Methodology Technical Bulletin Generatin Intercnnectin Prcedures Revisins t Cluster 4, Phase 1 Study Methdlgy Release Date: Octber 20, 2011 (Finalizatin f the Draft Technical Bulletin released n September 19, 2011)

More information

Dead-beat controller design

Dead-beat controller design J. Hetthéssy, A. Barta, R. Bars: Dead beat cntrller design Nvember, 4 Dead-beat cntrller design In sampled data cntrl systems the cntrller is realised by an intelligent device, typically by a PLC (Prgrammable

More information

ANALYTICAL MODEL FOR PREDICTING STRESS-STRAIN BEHAVIOUR OF BACTERIAL CONCRETE

ANALYTICAL MODEL FOR PREDICTING STRESS-STRAIN BEHAVIOUR OF BACTERIAL CONCRETE Internatinal Jurnal f Civil Engineering and Technlgy (IJCIET) Vlume 9, Issue 11, Nvember 018, pp. 383 393, Article ID: IJCIET_09_11_38 Available nline at http://www.iaeme.cm/ijciet/issues.asp?jtype=ijciet&vtype=9&itype=11

More information

Inflow Control on Expressway Considering Traffic Equilibria

Inflow Control on Expressway Considering Traffic Equilibria Memirs f the Schl f Engineering, Okayama University Vl. 20, N.2, February 1986 Inflw Cntrl n Expressway Cnsidering Traffic Equilibria Hirshi INOUYE* (Received February 14, 1986) SYNOPSIS When expressway

More information

Prediction of Municipal Solid Waste Generation by Use of Artificial Neural Network: A Case Study of Mashhad

Prediction of Municipal Solid Waste Generation by Use of Artificial Neural Network: A Case Study of Mashhad Int. J. Envirn. Res., 2(1): 13-22, Winter 2008 ISSN: 1735-6865 Predictin f Municipal Slid Waste Generatin by Use f Artificial Neural Netwrk: A Case Study f Mashhad Jalili Ghazi Zade, M. 1 and Nri, R. 2*

More information

^YawataR&D Laboratory, Nippon Steel Corporation, Tobata, Kitakyushu, Japan

^YawataR&D Laboratory, Nippon Steel Corporation, Tobata, Kitakyushu, Japan Detectin f fatigue crack initiatin frm a ntch under a randm lad C. Makabe," S. Nishida^C. Urashima,' H. Kaneshir* "Department f Mechanical Systems Engineering, University f the Ryukyus, Nishihara, kinawa,

More information

A PARALLEL APPROACH FOR BACKPROPAGATION LEARNING OF NEURAL NETWORKS CRESPO, M., PICCOLI F.,PRINTISTA M. 1, GALLARD R.

A PARALLEL APPROACH FOR BACKPROPAGATION LEARNING OF NEURAL NETWORKS CRESPO, M., PICCOLI F.,PRINTISTA M. 1, GALLARD R. A PARALLEL APPROACH FOR BACKPROPAGATION LEARNING OF NEURAL NETWORKS CRESPO, M., PICCOLI F.,PRINTISTA M. 1, GALLARD R. Grup de Interés en Sistemas de Cmputación 3 Departament de Infrmática Universidad Nacinal

More information

3D FE Modeling Simulation of Cold Rotary Forging with Double Symmetry Rolls X. H. Han 1, a, L. Hua 1, b, Y. M. Zhao 1, c

3D FE Modeling Simulation of Cold Rotary Forging with Double Symmetry Rolls X. H. Han 1, a, L. Hua 1, b, Y. M. Zhao 1, c Materials Science Frum Online: 2009-08-31 ISSN: 1662-9752, Vls. 628-629, pp 623-628 di:10.4028/www.scientific.net/msf.628-629.623 2009 Trans Tech Publicatins, Switzerland 3D FE Mdeling Simulatin f Cld

More information

ENG2410 Digital Design Sequential Circuits: Part B

ENG2410 Digital Design Sequential Circuits: Part B ENG24 Digital Design Sequential Circuits: Part B Fall 27 S. Areibi Schl f Engineering University f Guelph Analysis f Sequential Circuits Earlier we learned hw t analyze cmbinatinal circuits We will extend

More information

Kinetic Model Completeness

Kinetic Model Completeness 5.68J/10.652J Spring 2003 Lecture Ntes Tuesday April 15, 2003 Kinetic Mdel Cmpleteness We say a chemical kinetic mdel is cmplete fr a particular reactin cnditin when it cntains all the species and reactins

More information

CAUSAL INFERENCE. Technical Track Session I. Phillippe Leite. The World Bank

CAUSAL INFERENCE. Technical Track Session I. Phillippe Leite. The World Bank CAUSAL INFERENCE Technical Track Sessin I Phillippe Leite The Wrld Bank These slides were develped by Christel Vermeersch and mdified by Phillippe Leite fr the purpse f this wrkshp Plicy questins are causal

More information

Checking the resolved resonance region in EXFOR database

Checking the resolved resonance region in EXFOR database Checking the reslved resnance regin in EXFOR database Gttfried Bertn Sciété de Calcul Mathématique (SCM) Oscar Cabells OECD/NEA Data Bank JEFF Meetings - Sessin JEFF Experiments Nvember 0-4, 017 Bulgne-Billancurt,

More information

Marginal Conceptual Predictive Statistic for Mixed Model Selection

Marginal Conceptual Predictive Statistic for Mixed Model Selection Open Jurnal f Statistics, 06, 6, 39-53 Published Online April 06 in Sci http://wwwscirprg/jurnal/js http://dxdirg/0436/js0660 Marginal Cnceptual Predictive Statistic fr Mixed Mdel Selectin Cheng Wenren,

More information

ON-LINE PROCEDURE FOR TERMINATING AN ACCELERATED DEGRADATION TEST

ON-LINE PROCEDURE FOR TERMINATING AN ACCELERATED DEGRADATION TEST Statistica Sinica 8(1998), 207-220 ON-LINE PROCEDURE FOR TERMINATING AN ACCELERATED DEGRADATION TEST Hng-Fwu Yu and Sheng-Tsaing Tseng Natinal Taiwan University f Science and Technlgy and Natinal Tsing-Hua

More information

SAMPLING DYNAMICAL SYSTEMS

SAMPLING DYNAMICAL SYSTEMS SAMPLING DYNAMICAL SYSTEMS Melvin J. Hinich Applied Research Labratries The University f Texas at Austin Austin, TX 78713-8029, USA (512) 835-3278 (Vice) 835-3259 (Fax) hinich@mail.la.utexas.edu ABSTRACT

More information

arxiv: v1 [physics.comp-ph] 21 Feb 2018

arxiv: v1 [physics.comp-ph] 21 Feb 2018 arxiv:182.7486v1 [physics.cmp-ph] 21 Feb 218 rspa.ryalscietypublishing.rg Research Article submitted t jurnal Subject Areas: Mechanical Engineering Keywrds: Data-driven frecasting, Lng-Shrt Term Memry,

More information

PSU GISPOPSCI June 2011 Ordinary Least Squares & Spatial Linear Regression in GeoDa

PSU GISPOPSCI June 2011 Ordinary Least Squares & Spatial Linear Regression in GeoDa There are tw parts t this lab. The first is intended t demnstrate hw t request and interpret the spatial diagnstics f a standard OLS regressin mdel using GeDa. The diagnstics prvide infrmatin abut the

More information

CHAPTER 24: INFERENCE IN REGRESSION. Chapter 24: Make inferences about the population from which the sample data came.

CHAPTER 24: INFERENCE IN REGRESSION. Chapter 24: Make inferences about the population from which the sample data came. MATH 1342 Ch. 24 April 25 and 27, 2013 Page 1 f 5 CHAPTER 24: INFERENCE IN REGRESSION Chapters 4 and 5: Relatinships between tw quantitative variables. Be able t Make a graph (scatterplt) Summarize the

More information

Fall 2013 Physics 172 Recitation 3 Momentum and Springs

Fall 2013 Physics 172 Recitation 3 Momentum and Springs Fall 03 Physics 7 Recitatin 3 Mmentum and Springs Purpse: The purpse f this recitatin is t give yu experience wrking with mmentum and the mmentum update frmula. Readings: Chapter.3-.5 Learning Objectives:.3.

More information

Performance Comparison of Nonlinear Filters for Indoor WLAN Positioning

Performance Comparison of Nonlinear Filters for Indoor WLAN Positioning Perfrmance Cmparisn f Nnlinear Filters fr Indr WLAN Psitining Hui Wang Andrei Szab Jachim Bamberger Learning Systems Infrmatin and Cmmunicatins Crprate Technlgy Siemens AG Munich Germany Email: {hui.wang.ext.andrei.szab.jachim.bamberger}@siemens.cm

More information