An Accurate Heave Signal Prediction Using Artificial Neural Network

International Journal of Multidisciplinary and Current Research, Research Article, Available at: http://ijmcr.com

Mohammed El-Diasty 1,2
1 Hydrographic Surveying Department, Faculty of Maritime Studies, King Abdulaziz University
2 Engineering Department of Public Works, Faculty of Engineering, Mansoura University

Accepted 05 Oct 2014, Available online 15 Oct 2014, Vol. 2 (Sept/Oct 2014 issue)

Abstract

Accurate heave modeling is required for several applications, including hydrographic surveying. This paper proposes an adaptive heave signal modeling approach based on neural networks. A recurrent neural network and a three-layer feedforward neural network trained with the Levenberg-Marquardt learning algorithm are used for this purpose. Computational results with five different datasets of real-time heave are provided to validate the effectiveness of the artificial neural network-based model. It is shown that the new neural network-based model gives a reliable heave model with excellent performance. A comparison is also made between the developed artificial neural network models and the autoregressive models; the new artificial neural network-based model gives the best performance results, i.e., the lowest mean square error (MSE).

Keywords: Heave Prediction, Neural Network, Autoregressive, Recurrent Network

1. Introduction

Accurate heave modeling is required for several applications, including hydrographic surveying. The physical models and Kalman filter estimation of the heave process can be found in [1]. Frequency response methods have been used in the past to identify heave models: the frequency content of the heave record is used as the basis to formulate a state-space representation of the heave model [1]. Many studies have addressed this problem, with the Kalman filter being the most common approach [1]. The design of the Kalman filter rests on the assumption of complete structural knowledge of the model that describes the heave dynamics and on the Gaussian assumption for the state-space noise statistics [2]. It is well known that the quality of KF-based estimation degrades when the actual noise statistics are not Gaussian [3]. To overcome these limitations, another approach based on the autoregressive moving average model was followed [4].

This paper proposes an adaptive heave signal modeling approach that uses neural network-based modeling. A recurrent neural network (RNN) and a three-layer feedforward neural network (FFNN) trained with the Levenberg-Marquardt algorithm are used for this purpose. Computational results with five different datasets of real-time heave are provided to validate the effectiveness of the ANN-based model. It is shown that the new artificial neural network-based model gives a reliable heave model with excellent performance. A comparison is also made between the developed artificial neural network models and the autoregressive models; the new artificial neural network-based model gives the best performance results, i.e., the lowest mean square error (MSE).

2. Autoregressive Models (AR)

An autoregressive process is a linear model; a brief review of the autoregressive process can be found in [4] and [5]. An autoregressive (AR) process is a time series consisting of a linear combination of the immediate past values of the time series. Figure 1 shows an example of a fully connected AR structure, which is referred to as [(p_AR)-(1_out)]. An AR(p) process of order p can be summarized by the following linear equation (see Figure 1):

\hat{y}(n) = \sum_{i=1}^{p} a(i)\, y(n-i)    (1)

where \hat{y}(n) is the estimated output, y(n-i) is the immediate past observed heave value, and a(i) is the autoregressive (AR) model parameter.
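For readers who want to experiment with the AR formulation of equation (1), the following Python/NumPy sketch fits the coefficients a(i) by least squares and produces one-step-ahead predictions. It is a minimal illustration, not the author's original Matlab code; the synthetic heave-like signal and the order p = 6 are assumptions chosen only for demonstration.

```python
import numpy as np

def fit_ar_least_squares(y, p):
    """Estimate AR(p) coefficients a(1..p) of eq. (1) by least squares."""
    # Regression matrix: row for sample n holds [y(n-1), ..., y(n-p)].
    X = np.column_stack([y[p - i:len(y) - i] for i in range(1, p + 1)])
    target = y[p:]
    a, *_ = np.linalg.lstsq(X, target, rcond=None)
    return a

def predict_ar_one_step(y, a):
    """One-step-ahead prediction: y_hat(n) = sum_i a(i) * y(n-i)."""
    p = len(a)
    X = np.column_stack([y[p - i:len(y) - i] for i in range(1, p + 1)])
    return X @ a

# Synthetic heave-like record (assumed, for illustration only).
t = np.arange(0, 200, 0.1)
heave = 0.5 * np.sin(2 * np.pi * 0.1 * t) + 0.01 * np.random.randn(t.size)

a_hat = fit_ar_least_squares(heave, p=6)
y_hat = predict_ar_one_step(heave, a_hat)
print("AR(6) one-step MSE:", np.mean((heave[6:] - y_hat) ** 2))
```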
The AR(p) model parameters are estimated using the least squares method [5]. To account for the non-stationary part (i.e., the residuals) of the time series, the linear autoregressive moving average (ARMA) model is used. An ARMA process is a time series consisting of a random error component and a linear combination of the immediate past heave values and the immediate past residuals. Figure 2 shows an example of a fully connected ARMA structure, which is referred to as [(p_AR + q_MA)-(1_out)].

The ARMA(p, q) process can be summarized by the following linear equation (see Figure 2):

\hat{y}(n) = \sum_{i=1}^{p} a(i)\, y(n-i) + \sum_{j=1}^{q} c(j)\, e(n-j) + \varepsilon(n)    (2)

where \hat{y}(n) is the estimated output, y(n-i) is the immediate past value of the records, a(i) is the autoregressive (AR) model parameter, e(n-j) = \hat{y}(n-j) - y(n-j) is the immediate past residual, c(j) is the moving average (MA) model parameter, and \varepsilon(n) is the random error component. The ARMA(p, q) model parameters are estimated using the least squares method [6].

Fig. 1: Autoregressive (AR) structure [(p_AR)-(1_out)].

Fig. 2: Autoregressive Moving Average (ARMA) structure [(p_AR + q_MA)-(1_out)].

3. Artificial Neural Networks (ANN) Models

3.1 Basics of Artificial Neural Networks

In this section we briefly describe the fundamentals of artificial neural networks (for more details see [5]). An artificial neural network can be interpreted as a highly nonlinear autoregressive model constructed by combining nonlinear activation functions through a multilayer structure. Neural networks have been successfully applied in many different fields as a prediction model (see [6]; [7]). Artificial neural networks can be developed in many different forms, depending on the network structure and the learning algorithm. Figure 3 represents a block diagram of a simple model of a neuron showing the weights of the various links.

Fig. 3: Simple neural network structure.

The above structure contains an input layer and an output layer and can be represented in the compact form:

\hat{y} = f\left( \sum_{i=1}^{p} w_i x_i + b \right)    (3)

where x_i represents one variable in the input layer, f is the activation function, \hat{y} represents the output variable, b is the bias variable, and w_i is one of the neural network parameters. Artificial neural networks are a flexible nonlinear form in which the activation function f can be chosen arbitrarily. Figure 4 shows two common activation functions, the sigmoid and the hyperbolic tangent.

Fig. 4: Activation functions: (a) sigmoid and (b) hyperbolic tangent.

In this paper we investigated the use of feedforward neural networks and recurrent neural networks in the heave prediction model. It should be noted that for the prediction model, the input layer variable y(n-i) represents the immediate past value of the time series records.
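As a hedged illustration of the single-neuron mapping of equation (3) and the activation functions of Figure 4, the sketch below evaluates one neuron with both the sigmoid and the hyperbolic tangent. The weights, bias, and input values are arbitrary assumptions used only to show the computation.

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid activation (Figure 4a)."""
    return 1.0 / (1.0 + np.exp(-z))

def neuron_output(x, w, b, activation=np.tanh):
    """Single-neuron mapping of eq. (3): y_hat = f(sum_i w_i * x_i + b)."""
    return activation(np.dot(w, x) + b)

# Arbitrary example values (assumed for illustration).
x = np.array([0.12, -0.05, 0.08])   # p = 3 past heave values as inputs
w = np.array([0.9, -0.4, 0.2])      # connection weights
b = 0.05                            # bias

print("tanh neuron   :", neuron_output(x, w, b, np.tanh))
print("sigmoid neuron:", neuron_output(x, w, b, sigmoid))
```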

3.2 Feedforward and Recurrent Neural Network Heave Prediction Models

Feedforward networks are characterized by one input layer, one output layer, and one or more hidden layers containing a number of network-activated variables [5]. Figure 5 shows an example of a fully connected three-layer feedforward network in AR structure, which is referred to as [(p_AR)-(s_hidden)-(1_out)]. The input variables are simultaneously activated by a nonlinear activation function f and provide s hidden variables in a hidden layer. The resulting hidden variables are then activated by another activation function and produce the network output.

Fig. 5: Three-layer feedforward neural network with the structure [(p_AR)-(s_hidden)-(1_out)].

The above structure can be represented in the compact form:

\hat{y}(n) = f_y\left( \sum_{h=1}^{s} w_h \, f_h\left( \sum_{i=1}^{p} w_{hi}\, y(n-i) \right) \right)    (4)

Alternatively, a recurrent neural network is a highly nonlinear dynamic structure, similar in structure to the ARMA model [9]. From a dynamic point of view, it is important to include the lagged dependent variables as feedback loops. The lagged variables, i.e., the residual errors between the desired outputs and the estimated outputs, are considered as additional input variables. Hence, a recurrent neural network is proposed in this paper. Recurrent neural networks are similar to feedforward networks, except that the former have at least one feedback loop. Figure 6 shows an example of a fully connected three-layer recurrent network in ARMA structure, which is referred to as [(p_AR + q_MA)-(s_hidden)-(1_out)]. This structure can be represented in the compact form:

\hat{y}(n) = f_y\left( \sum_{h=1}^{s} w_h \, f_h\left( \sum_{i=1}^{p} w_{hi}\, y(n-i) + \sum_{j=1}^{q} w_{hj}\, e(n-j) \right) \right)    (5)

Fig. 6: Recurrent neural network structure [(p_AR + q_MA)-(s_hidden)-(1_out)].

Training the feedforward and recurrent neural networks is accomplished through iterative adjustment of the free parameters, i.e., the weights and biases of the network, until optimal values are obtained. Various learning algorithms exist, and they are fundamental to the design of neural networks. The Levenberg-Marquardt learning algorithm is the most widely used for feedforward and recurrent neural networks, and it is the algorithm used in this research (see [9] for more details).
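To make the compact forms (4) and (5) concrete, the sketch below implements the forward pass of a three-layer feedforward network in AR structure and of its recurrent counterpart, which additionally feeds back the past residuals e(n-j). The weights are random placeholders, biases are omitted for brevity, the layer sizes are assumptions, and no Levenberg-Marquardt training is shown; the sketch illustrates only the structure, not the trained models of this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def ffnn_ar_forward(y_past, W_in, w_out, f_hidden=np.tanh, f_out=lambda z: z):
    """Eq. (4): output from s hidden neurons driven by p past heave values."""
    hidden = f_hidden(W_in @ y_past)     # s hidden variables
    return f_out(w_out @ hidden)         # single network output

def rnn_arma_forward(y_past, e_past, W_y, W_e, w_out,
                     f_hidden=np.tanh, f_out=lambda z: z):
    """Eq. (5): hidden layer driven by past heave values and past residuals."""
    hidden = f_hidden(W_y @ y_past + W_e @ e_past)
    return f_out(w_out @ hidden)

# Assumed sizes matching the best structure reported later in the paper:
# 6 AR inputs, 6 MA (residual) inputs, 12 hidden neurons, 1 output.
p, q, s = 6, 6, 12
W_y = rng.normal(size=(s, p))
W_e = rng.normal(size=(s, q))
w_out = rng.normal(size=s)

y_past = 0.01 * rng.normal(size=p)    # placeholder past heave values (metres)
e_past = 0.001 * rng.normal(size=q)   # placeholder past residuals

print("FFNN output:", ffnn_ar_forward(y_past, W_y, w_out))
print("RNN  output:", rnn_arma_forward(y_past, e_past, W_y, W_e, w_out))
```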
4. Results and discussion

Five different data series of real-time heave, namely MBheave-1, MBheave-2, MBheave-3, MBheave-4, and MBheave-5, were used to validate the neural network model. The structure of the neural network was built using Matlab [8], and several tests were conducted to optimize the structure for each of the five data series. As mentioned before, two neural network-based models were investigated. First, the feedforward neural network structure was used: the immediate past values of the heave records are the inputs to the network, while the future values of the heave records are the desired output. Second, the recurrent neural network structure was used: the immediate past values of the heave records and the immediate past values of the residuals are the inputs, while the future values of the heave records are the desired output.

Each data series was divided into three datasets: training, testing, and validation. The first 10% of the heave records were assigned to the testing dataset, the last 10% were assigned to the validation dataset, and the training dataset was selected to represent the middle portion of the data series, which varied from one MBheave series to another. Training was stopped based on the generalization performance of the neural network on the testing dataset.
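The evaluation procedure just described (middle-portion training, first 10% for testing, last 10% for sequential validation) can be emulated with the split-and-score sketch below. The split percentages follow the text; the synthetic record and the placeholder persistence predictor are assumptions used only to show the mechanics, not the paper's trained networks.

```python
import numpy as np

def split_heave_series(y):
    """Split a heave record as described in the text:
    first 10% -> testing, last 10% -> validation, middle -> training."""
    n = len(y)
    n_test, n_valid = int(0.1 * n), int(0.1 * n)
    return y[n_test:n - n_valid], y[:n_test], y[n - n_valid:]

def sequential_mse(y, predict_one_step, p):
    """Predict each validation sample from its p most recent predecessors,
    one step ahead in sequence, and accumulate the mean square error."""
    errors = []
    for n in range(p, len(y)):
        y_hat = predict_one_step(y[n - p:n])   # emulates the real-time condition
        errors.append(y[n] - y_hat)
    return float(np.mean(np.square(errors)))

# Placeholder predictor: persistence model (assumed, for demonstration only).
naive_predictor = lambda window: window[-1]

t = np.arange(0, 100, 0.1)
heave = 0.5 * np.sin(2 * np.pi * 0.1 * t) + 0.005 * np.random.randn(t.size)
train, test, valid = split_heave_series(heave)
print("validation MSE (naive model):", sequential_mse(valid, naive_predictor, p=6))
```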

After training and testing the network, the model was generalized to predict the last 10% of the values of each data series, and the results were compared with the observed heave records. This was done in a sequential manner to emulate the real-time condition. It was concluded that the recurrent neural network in ARMA structure [(6_AR + 6_MA)-(12_hidden)-(1_out)] gives the best results, i.e., has the lowest MSE (see Table 1).

Table 1: Summary of the training and validation mean square errors (MSE-train and MSE-valid) for the AR, ARMA, FFNN, and RNN models on data series MB-1 to MB-5.

To further validate the model, it was compared with the autoregressive approach. The autoregressive (AR) and autoregressive moving average (ARMA) structures were built using Matlab [8]. Each data series was divided into two datasets: training and validation. The first 90% of the heave records were assigned to the training dataset, and the last 10% were assigned to the validation dataset. Several tests were conducted to optimize the AR and ARMA structures for each of the five data series. It was concluded that the ARMA model with the structure [(6_AR + 6_MA)-(1_out)] gives the best results, i.e., has the lowest MSE.

Fig. 7: Histogram of the residuals (in metres) of all models (AR, ARMA, FFNN, and RNN).

The overall conclusion from Table 1 and Figure 7 is that the recurrent neural network with the structure [(6_AR + 6_MA)-(12_hidden)-(1_out)] gives the best results, i.e., has the lowest MSE. Figure 8 shows the predicted heave versus the desired values, and the prediction errors (i.e., residuals), for MBheave-1; similar results were obtained for the other four data series. The maximum prediction error is approximately 1 cm, with the bulk of the residuals falling within the +/-0.5 cm range (see Figure 7). This shows that the developed neural network model has the capability to precisely predict the heave values.

Fig. 8: Predicted heave versus the desired (i.e., actual) values, and the prediction error using the recurrent neural network (RNN).

Conclusion

A sequential heave prediction model using artificial neural networks was developed in this paper. The recurrent neural network structure gave the best performance results (i.e., the minimum MSE) and was therefore used for predicting the heave values. Heave data series of varying lengths, from five different real-time heave data series, were used to verify the model. It is shown that the maximum prediction error is approximately 1 cm, with the bulk of the residuals falling within the +/-0.5 cm range. A performance comparison was made between the developed neural network model and the autoregressive method for heave prediction; the recurrent neural network outperforms the autoregressive method, i.e., it has the lowest MSE. Future development will include a complete attitude and heave solution that takes advantage of GNSS heights.

References

[1]. El-Hawary, F. (1982): Compensation for Source Heave by Use of a Kalman Filter. IEEE Journal of Oceanic Engineering, Vol. OE-7, No. 2.
[2]. Brown, R. G., and P. Hwang (1997): Introduction to Random Signals and Applied Kalman Filtering. John Wiley & Sons, Inc., Toronto, Canada.
[3]. Chiang, K.-W., A. Noureldin, and N. El-Sheimy (2003): Multi-sensors Integration using Neuron Computing for Land Vehicle Navigation. GPS Solutions, Vol. 6, No. 4, John Wiley & Sons, Inc.
[4]. El-Hawary, F., and G. Mbamalu (1993): Refined Autoregressive Moving Average Modeling of Underwater Heave Process. IEEE Journal of Oceanic Engineering, Vol. 18, No. 3.
[5]. Haykin, S. (1999): Neural Networks: A Comprehensive Foundation. Second Edition. Prentice Hall.
[6]. El-Rabbany, A., G. Auda, and S. Abdelazim (2002): Predicting Sea Ice Conditions Using Neural Networks. The Journal of Navigation, Vol. 55, No. 1.
[7]. El-Rabbany, A., and M. El-Diasty (2003): A New Approach to Sequential Tidal Prediction. The Journal of Navigation, Vol. 56.
[8]. MathWorks (2004): Matlab 6.5 Reference Guide. MathWorks Inc., USA.
[9]. Nørgaard, M., O. Ravn, N. K. Poulsen, and L. K. Hansen (2000): Neural Networks for Modelling and Control of Dynamic Systems. Springer-Verlag, London, UK.
