Evolutionary Product-Unit Neural Networks for Classification


F. J. Martínez-Estudillo, C. Hervás-Martínez, P. A. Gutiérrez Peña, A. C. Martínez-Estudillo and S. Ventura-Soto

Department of Management and Quantitative Methods, ETEA, Escritor Castilla Aguayo 4, 14005 Córdoba, Spain (corresponding author); Department of Computing and Numerical Analysis, University of Córdoba, Campus de Rabanales, 14071 Córdoba, Spain

This work has been financed in part by TIN C05-0 projects of the Spanish Inter-Ministerial Commission of Science and Technology (MICYT) and FEDER funds.

Abstract. We propose a classification method based on a special class of feedforward neural network, namely product-unit neural networks. They are based on multiplicative nodes instead of additive ones, where the nonlinear basis functions express the possible strong interactions between variables. We apply an evolutionary algorithm to determine the basic structure of the product-unit model and to estimate the coefficients of the model. We use the softmax transformation as the decision rule and the cross-entropy error function because of its probabilistic interpretation. The empirical results over four benchmark data sets show that the proposed model is very promising in terms of classification accuracy and classifier complexity, yielding state-of-the-art performance.

Keywords: Classification; Product-Unit; Evolutionary Neural Networks

1 Introduction

The simplest method for classification provides the class level of an observation via linear functions in the predictor variables. Frequently, in a real classification problem, we cannot make the stringent assumption of additive and purely linear effects of the variables. A traditional technique to overcome these difficulties is to augment or replace the input vector with new variables, the basis functions, which are transformations of the input variables, and then to use linear models in this new space of derived input features. Once the number and the structure of the basis functions have been determined, the models are linear in these new variables and the fitting is a well-known standard procedure. Methods like sigmoidal feed-forward neural networks, projection pursuit learning, generalized additive models [1], and

PolyMARS [2], a hybrid of multivariate adaptive splines (MARS) specifically designed for classification problems, can be seen as different basis-function models. The major drawback of these approaches is stating the optimal number and typology of the corresponding basis functions. We tackle this problem by proposing a nonlinear model along with an evolutionary algorithm that finds the optimal structure of the model and estimates the corresponding parameters. Concretely, our approach tries to capture the nonlinear effects of the variables by means of a model based on nonlinear basis functions constructed as products of the inputs raised to arbitrary powers. These basis functions express the possible strong interactions between the variables, where the exponents may even take real values and are suitable for automatic adjustment. The proposed model corresponds to a special class of feedforward neural network, namely product-unit neural networks (PUNNs), introduced by Durbin and Rumelhart [3]. They are an alternative to sigmoidal neural networks and are based on multiplicative nodes instead of additive ones. Up to now, PUNNs have been used mainly to solve regression problems [4], [5].

Evolutionary artificial neural networks (EANNs) have been a key research area in the past decade, providing a better platform for optimizing both the weights and the architecture of the network simultaneously. The problem of finding a suitable architecture and the corresponding weights of the network is a very complex task (for a very interesting review on this subject the reader can consult [6]). This problem, together with the complexity of the error surface associated with a product-unit neural network, justifies the use of an evolutionary algorithm to design the structure and train the weights. The evolutionary process determines the number of basis functions of the model, the associated coefficients and the corresponding exponents. In our evolutionary algorithm we encourage parsimony in the evolved networks by attempting different mutations sequentially. Our experimental results show that evolving parsimonious networks by sequentially applying different mutations is an alternative to using a regularization term in the fitness function to penalize large networks.

We use the softmax activation function and the cross-entropy error function. From a statistical point of view, the approach can be seen as a nonlinear multinomial logistic regression in which the log-likelihood is optimized by evolutionary computation. In fact, we attempt to estimate the conditional class probabilities using a multilogistic model, where the nonlinear part is given by a product-unit neural network. We evaluate the performance of our methodology on four data sets taken from the UCI repository [7]. Empirical results show that the proposed method performs well compared to several learning classification techniques.

This paper is organized as follows: Section 2 describes product-unit based neural networks; Section 3 formulates the classification problem; Section 4 describes the evolution of product-unit neural networks; Section 5 explains the experiments carried out; and finally, Section 6 presents the conclusions of our work.

2 Product-Unit Neural Networks

In this section we present the family of product-unit basis functions used in the classification process and its representation by means of a neural network structure. This class of multiplicative neural networks comprises such types as sigma-pi

networks and product-unit networks. A multiplicative node is given by $y_j = \prod_{i=1}^{k} x_i^{w_{ji}}$, where $k$ is the number of inputs. If the exponents are in $\{0,1\}$ we obtain a higher-order unit, also known as a sigma-pi unit. In contrast to the sigma-pi unit, in the product unit the exponents are not fixed and may even take real values. Some advantages of product-unit based neural networks are increased information capacity and the ability to form higher-order combinations of the inputs. Besides that, it is possible to obtain upper bounds for the VC dimension of product-unit neural networks similar to those obtained for sigmoidal neural networks [8]. Moreover, it is a straightforward consequence of the Stone-Weierstrass theorem that product-unit neural networks are universal approximators (observe that polynomial functions in several variables are a subset of product-unit models). Despite these advantages, product-unit based networks have a major drawback: they have more local minima and a higher probability of becoming trapped in them [9]. The main reason for this difficulty is that small changes in the exponents can cause large changes in the total error surface, so their training is more difficult than the training of standard sigmoidal networks. Several efforts have been made to develop learning methods for product units [9], [10]. Studies carried out on PUNNs have not tackled the problem of simultaneously designing the structure and weights of this kind of neural network, either with classic or with evolutionary methods. Moreover, so far, product units have been applied mainly to solve regression problems.

We consider a product-unit neural network with the following structure (Fig. 1): an input layer with $k$ nodes, one node for every input variable; a hidden layer with $m$ nodes; and an output layer with $J$ nodes, one for each class level. There are no connections between the nodes of a layer, and none between the input and output layers either. The activation function of the $j$-th node in the hidden layer is given by $B_j(x, w_j) = \prod_{i=1}^{k} x_i^{w_{ji}}$, where $w_{ji}$ is the weight of the connection between input node $i$ and hidden node $j$, and $w_j = (w_{j1}, \ldots, w_{jk})$ is its weight vector. The activation function of each output node is given by $f_l(x, \theta_l) = \beta_0^l + \sum_{j=1}^{m} \beta_j^l B_j(x, w_j)$, $l = 1, 2, \ldots, J$, where $\beta_j^l$ is the weight of the connection between hidden node $j$ and output node $l$. The transfer function of all hidden and output nodes is the identity function. We consider the softmax activation function given by:

$g_l(x, \theta) = \dfrac{\exp f_l(x, \theta_l)}{\sum_{l'=1}^{J} \exp f_{l'}(x, \theta_{l'})}, \quad l = 1, 2, \ldots, J,$    (1)

where $\theta_l = (\beta^l, w_1, \ldots, w_m)$, $\theta = (\theta_1, \ldots, \theta_J)$ and $\beta^l = (\beta_0^l, \beta_1^l, \ldots, \beta_m^l)$. It is interesting to note that the model can be regarded as the feed-forward computation of a three-layer neural network where the activation function of each hidden node is $\exp(t) = e^t$ and where a logarithmic transformation of the input variables $x_i$ is performed first [11].
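The hidden and output computations just described, together with the softmax decision rule of Eq. (1), can be sketched in a few lines of numpy. This is a minimal illustration, not the authors' code; the function name and array layout are our own choices:

```python
import numpy as np

def punn_forward(x, W, beta):
    """Softmax outputs of a product-unit network.

    x    : (k,) input vector, strictly positive entries
    W    : (m, k) exponent matrix; row j holds the exponents w_j of hidden node j
    beta : (J, m+1) output weights; beta[l, 0] is the bias of output node l
    """
    # Product units B_j(x, w_j) = prod_i x_i ** w_ji, evaluated in log space
    # (this is the exp/log reformulation mentioned in the text).
    b = np.exp(W @ np.log(x))                 # hidden activations, shape (m,)
    f = beta[:, 0] + beta[:, 1:] @ b          # f_l(x, theta_l), shape (J,)
    e = np.exp(f - f.max())                   # softmax, shifted for stability
    return e / e.sum()                        # g_l(x, theta), sums to one
```

The log-space evaluation is also why the inputs must be kept strictly positive, which the linear rescaling used in the experiments guarantees.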

Fig. 1. Model of a product-unit based neural network.

3 Classification problem

In a classification problem, measurements $x_i$, $i = 1, 2, \ldots, k$, are taken on a single individual (or object), and the individuals have to be classified into one of $J$ classes based on these measurements. A training sample $D = \{(x_n, y_n);\ n = 1, 2, \ldots, N\}$ is available, where $x_n = (x_{n1}, \ldots, x_{nk})$ is the random vector of measurements taking values in $\Omega \subset \mathbb{R}^k$, and $y_n$ is the class level of the $n$-th individual. We adopt the common technique of representing the class levels with a 1-of-$J$ encoding vector $y = (y^{(1)}, y^{(2)}, \ldots, y^{(J)})$, such that $y^{(l)} = 1$ if $x$ corresponds to an example belonging to class $l$, and $y^{(l)} = 0$ otherwise. Based on the training sample we try to find a decision function $C: \Omega \to \{1, 2, \ldots, J\}$ for classifying the individuals. A misclassification occurs when the decision rule $C$ assigns an individual to a class it does not actually come from. We define the correctly classified rate by $CCR = \frac{1}{N} \sum_{n=1}^{N} I(C(x_n) = y_n)$, where $I(\cdot)$ is the zero-one loss function. A good classifier tries to achieve the highest possible CCR for a given problem. We define the cross-entropy error function for the training observations as:

$l(\theta) = -\dfrac{1}{N} \sum_{n=1}^{N} \sum_{l=1}^{J} y_n^{(l)} \log g_l(x_n, \theta) = \dfrac{1}{N} \sum_{n=1}^{N} \left[ -\sum_{l=1}^{J} y_n^{(l)} f_l(x_n, \theta_l) + \log \sum_{l=1}^{J} \exp f_l(x_n, \theta_l) \right]$    (2)
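The error measures just defined translate directly into code. Below is a small numpy sketch of the cross-entropy of Eq. (2) (in its first form) and of the CCR, assuming the targets are already in 1-of-J encoding; the helper names are hypothetical:

```python
import numpy as np

def cross_entropy(Y, P):
    """l(theta): mean negative log-likelihood over the sample.

    Y : (N, J) 0/1 matrix of 1-of-J encoded targets
    P : (N, J) predicted probabilities g_l(x_n, theta), rows summing to one
    """
    eps = 1e-12  # guards log(0) for confidently wrong predictions
    return -np.mean(np.sum(Y * np.log(P + eps), axis=1))

def ccr(Y, P):
    """Correctly classified rate: fraction of argmax agreements."""
    return np.mean(np.argmax(P, axis=1) == np.argmax(Y, axis=1))
```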

The Hessian matrix of the error function $l(\theta)$ is, in general, indefinite, and the error surface associated with the model is very convoluted, with numerous local optima. Moreover, the optimal number of basis functions of the model (i.e. the number of hidden nodes in the neural network) is unknown. Thus, the estimation of the parameter vector $\hat{\theta}$ is carried out by means of an evolutionary algorithm (see Section 4). The optimum rule $C(x)$ is the following: $C(x) = \hat{l}$, where $\hat{l} = \arg\max_l g_l(x, \hat{\theta})$, for $l = 1, \ldots, J$. Observe that the softmax transformation produces positive estimates that sum to one, so the outputs can be interpreted as conditional probabilities of class membership and the classification rule coincides with the optimal Bayes rule. On the other hand, the probability of one of the classes need not be estimated, because of the normalization condition. Usually, one activation function is set to zero to reduce the number of parameters to estimate; therefore, we set $f_J(x, \theta_J) = 0$.

4 Evolutionary Product-Unit Neural Networks

In this section we present the evolutionary product-unit neural network algorithm (EPUNN) used to estimate the parameters that minimize the cross-entropy error function. We build an evolutionary algorithm to design the structure and learn the weights of the networks. The search begins with an initial population of product-unit neural networks and, in each iteration, the population is updated using a population-update algorithm. The population is subjected to the operations of replication and mutation. Crossover is not used, due to its potential disadvantages in evolving artificial networks. With these features the algorithm falls into the class of evolutionary programming. The general structure of the EA is the following:

(1) Generate a random population of size $N_P$.
(2) Repeat until the stopping criterion is fulfilled:
  (a) Calculate the fitness of every individual in the population.
  (b) Rank the individuals with respect to their fitness.
  (c) Copy the best individual into the new population.
  (d) Replicate the best 10% of the individuals and substitute them for the worst 10%. Over that intermediate population:
  (e) Apply parametric mutation to the best 10% of individuals.
  (f) Apply structural mutation to the remaining 90% of individuals.

We consider $l(\theta)$ to be the error function of an individual $g$ of the population. Observe that $g$ can be seen as the multivalued function $g(x, \theta) = (g_1(x, \theta), \ldots, g_J(x, \theta))$. The fitness measure is a strictly decreasing transformation of the error function $l(\theta)$ given by $A(g) = 1/(1 + l(\theta))$. Parametric mutation is accomplished for each coefficient $w_{ji}$, $\beta_j^l$ of the model with Gaussian noise:

$w_{ji}(t+1) = w_{ji}(t) + \xi_1(t), \quad \beta_j^l(t+1) = \beta_j^l(t) + \xi_2(t),$

where $\xi_k(t) \sim N(0, \alpha_k(t))$, $k = 1, 2$, represents a one-dimensional normally distributed random variable with mean 0 and variance $\alpha_k(t)$, where $\alpha_1(t) < \alpha_2(t)$, and $t$ is the generation index. Once the mutation is performed, the fitness of the individual is recalculated and the usual simulated annealing criterion is applied. Thus, if $\Delta A$ is the difference in the fitness function after and before the random step, the criterion is: if $\Delta A \geq 0$ the step is accepted; if $\Delta A < 0$, the step is accepted with probability $\exp(\Delta A / T(g))$, where the temperature $T(g)$ of an individual $g$ is given by $T(g) = 1 - A(g)$, $0 \leq T(g) < 1$.

The variance $\alpha_k(t)$ is updated throughout the evolution of the algorithm. There are different methods for updating the variance; we use the 1/5 success rule of Rechenberg, one of the simplest. This rule states that the ratio of successful mutations should be 1/5: if the ratio of successful mutations is larger than 1/5, the mutation deviation should increase; otherwise, it should decrease. Thus:

$\alpha_k(t+s) = \begin{cases} (1+\lambda)\,\alpha_k(t) & \text{if } s_g > 1/5 \\ (1-\lambda)\,\alpha_k(t) & \text{if } s_g < 1/5 \\ \alpha_k(t) & \text{if } s_g = 1/5 \end{cases}$    (3)

where $k = 1, 2$, $s_g$ is the frequency of successful mutations over $s$ generations, and $\lambda = 0.1$. The adaptation tries to avoid getting trapped in local minima and also to speed up the evolutionary process when search conditions are suitable.

Structural mutation implies a modification of the neural network structure; it allows the exploration of different regions of the search space while helping to maintain the diversity of the population. There are five different structural mutations: node deletion, connection deletion, node addition, connection addition and node fusion. These five mutations are applied sequentially to each network. In node fusion, two randomly selected hidden nodes, $a$ and $b$, are replaced by a new node, $c$, which is a combination of both. The connections common to both nodes are kept, with weights given by: $\beta_c = \beta_a + \beta_b$, $w_c = (w_a + w_b)/2$.
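The mutation machinery of this section (Gaussian parametric mutation, the annealing-style acceptance test, Rechenberg's 1/5 rule, and node fusion) can be sketched as follows. This is an illustrative reading of the text, not the authors' implementation; in particular, the averaging factor in the fused exponents is our interpretation of the fusion formula:

```python
import numpy as np

rng = np.random.default_rng(0)

def parametric_mutation(coeffs, variance):
    """Add Gaussian noise xi(t) ~ N(0, alpha_k(t)) to every coefficient."""
    return coeffs + rng.normal(0.0, np.sqrt(variance), size=coeffs.shape)

def anneal_accept(delta_A, temperature):
    """Accept any non-worsening step; accept a worsening one with
    probability exp(delta_A / T(g))."""
    return delta_A >= 0 or rng.random() < np.exp(delta_A / temperature)

def rechenberg_update(variance, success_ratio, lam=0.1):
    """1/5 success rule: widen the variance when more than 1/5 of the
    recent mutations succeeded, narrow it when fewer did."""
    if success_ratio > 1 / 5:
        return (1 + lam) * variance
    if success_ratio < 1 / 5:
        return (1 - lam) * variance
    return variance

def fuse_nodes(beta_a, beta_b, w_a, w_b):
    """Node fusion for shared connections: output weights add,
    exponent vectors are averaged (the /2 factor is assumed)."""
    return beta_a + beta_b, (w_a + w_b) / 2.0
```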
The connections not shared by the nodes are inherited by $c$ with probability 0.5, with their weights unchanged. In our algorithm, node or connection deletion and node fusion are always attempted before addition. If a deletion or fusion mutation is successful, no other mutation is made. If the probability does not select any mutation, one of the mutations is chosen at random and applied to the network.

5 Experiments

The parameters used in the evolutionary algorithm are common to the four problems. We have considered $\alpha_1(0) = 0.5$, $\alpha_2(0) = 1$, $\lambda = 0.1$ and $s = 5$. The exponents $w_{ji}$ are initialized in the $[-5, 5]$ interval, and the coefficients $\beta_j^l$ are initialized in $[-5, 5]$. The maximum number of hidden nodes is $m = 6$. The size of the population is $N_P = 1000$. The number of nodes that can be added or removed in a structural mutation is within the $[1, 2]$ interval; the number of connections that can be added or removed in a structural mutation is within the $[1, 6]$ interval. The stopping criterion is reached when, for 20 generations, there is no improvement either in the average performance of the best 10% of the population or in the fitness of the best individual. We have done a simple linear rescaling of the input variables into the interval $[1, 2]$, $X_i^*$ being the transformed variables.

We evaluate the performance of our method on four data sets taken from the UCI repository [7]. For every data set we performed ten runs of ten-fold stratified cross-validation. This gives one hundred data points for each data set, from which the average classification accuracy and standard deviation are calculated. Table 1 shows the statistical results over the 10 runs for each fold of the evolutionary algorithm for the four data sets. In order to present an empirical evaluation of the performance of the EPUNN method, we compare our approach with the most recent results [12] obtained using different methodologies (see Table 2): logistic model trees, LMT; logistic regression (with attribute selection, SLogistic, and for a full logistic model, MLogistic); induction trees (C4.5 and CART); two logistic tree algorithms, LTreeLin and LTreeLog; the multiple-tree model M5' for classification; and boosted C4.5 trees using AdaBoost.M1 with 10 and 100 boosting iterations. We can see that the results obtained by EPUNN, with architectures (13:2:2), (34:3:2), (4:5:3) and (51:3:2) for each data set, are competitive with the learning schemes mentioned previously.

Table 1. Statistical results of training ($CCR_T$) and testing ($CCR_G$) for 30 executions of the EPUNN model over the four data sets (Heart-stat, Ionosphere, Balance, Australian): mean, standard deviation, best and worst CCR, together with the mean number of connections and nodes.
Table 2. Mean classification accuracy and standard deviation for: LMT, SLogistic, MLogistic, C4.5, CART, NBTree, LTreeLin, LTreeLog, M5', ABoost and the EPUNN method.

Data set     LMT       SLogistic  MLogistic  C4.5   CART   NBTree
Heart-stat   83.±      ±          ±          ±      ±      ±7.
Ionosphere   9.99±     ±          ±          ±      ±      ±5.
Balance      89.7±     ±          ±          ±      ±      ±5.3
Australian   85.04±    ±          ±          ±      ±      ±4.03

Data set     LTreeLin  LTreeLog   M5'        ABoost(10)  ABoost(100)  EPUNN    W/L
Heart-stat   83.5±     ±          ±          ±           ±            ±6.90    5/6
Ionosphere   88.95±    ±          ±          ±           ±            ±5.5     7/4
Balance      9.86±     ±          ±          ±           ±            ±.36     /0
Australian   84.99±    ±          ±          ±           ±            ±3.90    0/
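The evaluation protocol of Section 5 (ten runs of ten-fold stratified cross-validation) relies on splits that preserve the class proportions. A minimal numpy-only way to build such folds (a hypothetical helper, not the authors' tooling) is:

```python
import numpy as np

def stratified_folds(y, n_folds=10, seed=0):
    """Partition sample indices into n_folds folds, each keeping
    roughly the class proportions of the label vector y."""
    rng = np.random.default_rng(seed)
    folds = [[] for _ in range(n_folds)]
    for cls in np.unique(y):
        idx = np.flatnonzero(y == cls)
        rng.shuffle(idx)
        # deal each class's shuffled members round-robin over the folds
        for pos, i in enumerate(idx):
            folds[pos % n_folds].append(int(i))
    return [np.array(f) for f in folds]
```

Each fold serves once as the test set; repeating with ten different seeds gives the hundred accuracy estimates per data set mentioned in the text.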

6 Conclusions

We have proposed a classification method that combines a nonlinear model, based on a special class of feed-forward neural network, namely product-unit neural networks, with an evolutionary algorithm that finds the optimal structure of the model and estimates the corresponding parameters. Up to now, studies on product units have been applied mainly to regression problems and have not addressed the simultaneous design of the structure and weights of this kind of neural network, either with classic or with evolutionary methods. Our approach uses the softmax transformation and the cross-entropy error function. From a statistical point of view, it can be seen as nonlinear multinomial logistic regression in which the log-likelihood is optimized by evolutionary computation. The empirical results show that the evolutionary product-unit model performs well compared with other learning classification techniques. We obtain very promising results in terms of classification accuracy and classifier complexity.

References

[1] T. Hastie and R. Tibshirani, Generalized Additive Models. London: Chapman & Hall, 1990.
[2] C. Kooperberg, S. Bose, and C. Stone, "Polychotomous Regression," Journal of the American Statistical Association, vol. 92, pp. 117-127, 1997.
[3] R. Durbin and D. Rumelhart, "Product Units: A Computationally Powerful and Biologically Plausible Extension to Backpropagation Networks," Neural Computation, vol. 1, pp. 133-142, 1989.
[4] A. C. Martínez-Estudillo, F. J. Martínez-Estudillo, C. Hervás-Martínez, et al., "Evolutionary Product Unit based Neural Networks for Regression," Neural Networks, 2006.
[5] A. C. Martínez-Estudillo, C. Hervás-Martínez, A. C. Martínez-Estudillo, et al., "Hybridization of evolutionary algorithms and local search by means of a clustering method," IEEE Transactions on Systems, Man and Cybernetics, Part B: Cybernetics, vol. 36, 2006.
[6] X. Yao, "Evolving artificial neural networks," Proceedings of the IEEE, vol. 87, no. 9, pp. 1423-1447, 1999.
[7] C. Blake and C. J. Merz, "UCI repository of machine learning databases," mearn/mlrepository.thm, 1998.
[8] M. Schmitt, "On the Complexity of Computing and Learning with Multiplicative Neural Networks," Neural Computation, vol. 14, pp. 241-301, 2002.
[9] A. Ismail and A. P. Engelbrecht, "Global optimization algorithms for training product unit neural networks," presented at the International Joint Conference on Neural Networks IJCNN 2000, Como, Italy, 2000.
[10] A. P. Engelbrecht and A. Ismail, "Training product unit neural networks," Stability and Control: Theory and Applications, 1999.
[11] K. Saito and R. Nakano, "Extracting Regression Rules From Neural Networks," Neural Networks, vol. 15, 2002.
[12] N. Landwehr, M. Hall, and E. Frank, "Logistic Model Trees," Machine Learning, vol. 59, pp. 161-205, 2005.


More information

Support Vector Machine and Its Application to Regression and Classification

Support Vector Machine and Its Application to Regression and Classification BearWorks Institutiona Repository MSU Graduate Theses Spring 2017 Support Vector Machine and Its Appication to Regression and Cassification Xiaotong Hu As with any inteectua project, the content and views

More information

Structural health monitoring of concrete dams using least squares support vector machines

Structural health monitoring of concrete dams using least squares support vector machines Structura heath monitoring of concrete dams using east squares support vector machines *Fei Kang ), Junjie Li ), Shouju Li 3) and Jia Liu ) ), ), ) Schoo of Hydrauic Engineering, Daian University of echnoogy,

More information

Ant Colony Algorithms for Constructing Bayesian Multi-net Classifiers

Ant Colony Algorithms for Constructing Bayesian Multi-net Classifiers Ant Coony Agorithms for Constructing Bayesian Muti-net Cassifiers Khaid M. Saama and Aex A. Freitas Schoo of Computing, University of Kent, Canterbury, UK. {kms39,a.a.freitas}@kent.ac.uk December 5, 2013

More information

Do Schools Matter for High Math Achievement? Evidence from the American Mathematics Competitions Glenn Ellison and Ashley Swanson Online Appendix

Do Schools Matter for High Math Achievement? Evidence from the American Mathematics Competitions Glenn Ellison and Ashley Swanson Online Appendix VOL. NO. DO SCHOOLS MATTER FOR HIGH MATH ACHIEVEMENT? 43 Do Schoos Matter for High Math Achievement? Evidence from the American Mathematics Competitions Genn Eison and Ashey Swanson Onine Appendix Appendix

More information

Unconditional security of differential phase shift quantum key distribution

Unconditional security of differential phase shift quantum key distribution Unconditiona security of differentia phase shift quantum key distribution Kai Wen, Yoshihisa Yamamoto Ginzton Lab and Dept of Eectrica Engineering Stanford University Basic idea of DPS-QKD Protoco. Aice

More information

Learning Fully Observed Undirected Graphical Models

Learning Fully Observed Undirected Graphical Models Learning Fuy Observed Undirected Graphica Modes Sides Credit: Matt Gormey (2016) Kayhan Batmangheich 1 Machine Learning The data inspires the structures we want to predict Inference finds {best structure,

More information

Convergence Property of the Iri-Imai Algorithm for Some Smooth Convex Programming Problems

Convergence Property of the Iri-Imai Algorithm for Some Smooth Convex Programming Problems Convergence Property of the Iri-Imai Agorithm for Some Smooth Convex Programming Probems S. Zhang Communicated by Z.Q. Luo Assistant Professor, Department of Econometrics, University of Groningen, Groningen,

More information

MONTE CARLO SIMULATIONS

MONTE CARLO SIMULATIONS MONTE CARLO SIMULATIONS Current physics research 1) Theoretica 2) Experimenta 3) Computationa Monte Caro (MC) Method (1953) used to study 1) Discrete spin systems 2) Fuids 3) Poymers, membranes, soft matter

More information

Automobile Prices in Market Equilibrium. Berry, Pakes and Levinsohn

Automobile Prices in Market Equilibrium. Berry, Pakes and Levinsohn Automobie Prices in Market Equiibrium Berry, Pakes and Levinsohn Empirica Anaysis of demand and suppy in a differentiated products market: equiibrium in the U.S. automobie market. Oigopoistic Differentiated

More information

Structural Control of Probabilistic Boolean Networks and Its Application to Design of Real-Time Pricing Systems

Structural Control of Probabilistic Boolean Networks and Its Application to Design of Real-Time Pricing Systems Preprints of the 9th Word Congress The Internationa Federation of Automatic Contro Structura Contro of Probabiistic Booean Networks and Its Appication to Design of Rea-Time Pricing Systems Koichi Kobayashi

More information

Scalable Spectrum Allocation for Large Networks Based on Sparse Optimization

Scalable Spectrum Allocation for Large Networks Based on Sparse Optimization Scaabe Spectrum ocation for Large Networks ased on Sparse Optimization innan Zhuang Modem R&D Lab Samsung Semiconductor, Inc. San Diego, C Dongning Guo, Ermin Wei, and Michae L. Honig Department of Eectrica

More information

SVM-based Supervised and Unsupervised Classification Schemes

SVM-based Supervised and Unsupervised Classification Schemes SVM-based Supervised and Unsupervised Cassification Schemes LUMINITA STATE University of Pitesti Facuty of Mathematics and Computer Science 1 Targu din Vae St., Pitesti 110040 ROMANIA state@cicknet.ro

More information

Jan van den Brakel Department of Statistical Methods, Statistics Netherlands

Jan van den Brakel Department of Statistical Methods, Statistics Netherlands Survey Research Methods (2010) Vo.4, No.2, pp. 103-119 ISSN 1864-3361 http://www.surveymethods.org c European Survey Research Association Samping and estimation techniques for the impementation of new

More information

Optimal Control of Assembly Systems with Multiple Stages and Multiple Demand Classes 1

Optimal Control of Assembly Systems with Multiple Stages and Multiple Demand Classes 1 Optima Contro of Assemby Systems with Mutipe Stages and Mutipe Demand Casses Saif Benjaafar Mohsen EHafsi 2 Chung-Yee Lee 3 Weihua Zhou 3 Industria & Systems Engineering, Department of Mechanica Engineering,

More information

Converting Z-number to Fuzzy Number using. Fuzzy Expected Value

Converting Z-number to Fuzzy Number using. Fuzzy Expected Value ISSN 1746-7659, Engand, UK Journa of Information and Computing Science Vo. 1, No. 4, 017, pp.91-303 Converting Z-number to Fuzzy Number using Fuzzy Expected Vaue Mahdieh Akhbari * Department of Industria

More information

Combining reaction kinetics to the multi-phase Gibbs energy calculation

Combining reaction kinetics to the multi-phase Gibbs energy calculation 7 th European Symposium on Computer Aided Process Engineering ESCAPE7 V. Pesu and P.S. Agachi (Editors) 2007 Esevier B.V. A rights reserved. Combining reaction inetics to the muti-phase Gibbs energy cacuation

More information

Notes on Backpropagation with Cross Entropy

Notes on Backpropagation with Cross Entropy Notes on Backpropagation with Cross Entropy I-Ta ee, Dan Gowasser, Bruno Ribeiro Purue University October 3, 07. Overview This note introuces backpropagation for a common neura network muti-cass cassifier.

More information

MINIMAX PROBABILITY MACHINE (MPM) is a

MINIMAX PROBABILITY MACHINE (MPM) is a Efficient Minimax Custering Probabiity Machine by Generaized Probabiity Product Kerne Haiqin Yang, Kaizhu Huang, Irwin King and Michae R. Lyu Abstract Minimax Probabiity Machine (MPM), earning a decision

More information

STA 216 Project: Spline Approach to Discrete Survival Analysis

STA 216 Project: Spline Approach to Discrete Survival Analysis : Spine Approach to Discrete Surviva Anaysis November 4, 005 1 Introduction Athough continuous surviva anaysis differs much from the discrete surviva anaysis, there is certain ink between the two modeing

More information

Two view learning: SVM-2K, Theory and Practice

Two view learning: SVM-2K, Theory and Practice Two view earning: SVM-2K, Theory and Practice Jason D.R. Farquhar jdrf99r@ecs.soton.ac.uk Hongying Meng hongying@cs.york.ac.uk David R. Hardoon drh@ecs.soton.ac.uk John Shawe-Tayor jst@ecs.soton.ac.uk

More information

Probabilistic Graphical Models

Probabilistic Graphical Models Schoo of Computer Science Probabiistic Graphica Modes Gaussian graphica modes and Ising modes: modeing networks Eric Xing Lecture 0, February 0, 07 Reading: See cass website Eric Xing @ CMU, 005-07 Network

More information

A Ridgelet Kernel Regression Model using Genetic Algorithm

A Ridgelet Kernel Regression Model using Genetic Algorithm A Ridgeet Kerne Regression Mode using Genetic Agorithm Shuyuan Yang, Min Wang, Licheng Jiao * Institute of Inteigence Information Processing, Department of Eectrica Engineering Xidian University Xi an,

More information

Collective organization in an adaptative mixture of experts

Collective organization in an adaptative mixture of experts Coective organization in an adaptative mixture of experts Vincent Vigneron, Christine Fuchen, Jean-Marc Martinez To cite this version: Vincent Vigneron, Christine Fuchen, Jean-Marc Martinez. Coective organization

More information

A unified framework for Regularization Networks and Support Vector Machines. Theodoros Evgeniou, Massimiliano Pontil, Tomaso Poggio

A unified framework for Regularization Networks and Support Vector Machines. Theodoros Evgeniou, Massimiliano Pontil, Tomaso Poggio MASSACHUSETTS INSTITUTE OF TECHNOLOGY ARTIFICIAL INTELLIGENCE LABORATORY and CENTER FOR BIOLOGICAL AND COMPUTATIONAL LEARNING DEPARTMENT OF BRAIN AND COGNITIVE SCIENCES A.I. Memo No. 1654 March23, 1999

More information

First-Order Corrections to Gutzwiller s Trace Formula for Systems with Discrete Symmetries

First-Order Corrections to Gutzwiller s Trace Formula for Systems with Discrete Symmetries c 26 Noninear Phenomena in Compex Systems First-Order Corrections to Gutzwier s Trace Formua for Systems with Discrete Symmetries Hoger Cartarius, Jörg Main, and Günter Wunner Institut für Theoretische

More information

How the backpropagation algorithm works Srikumar Ramalingam School of Computing University of Utah

How the backpropagation algorithm works Srikumar Ramalingam School of Computing University of Utah How the backpropagation agorithm works Srikumar Ramaingam Schoo of Computing University of Utah Reference Most of the sides are taken from the second chapter of the onine book by Michae Nieson: neuranetworksanddeepearning.com

More information

Multilayer Kerceptron

Multilayer Kerceptron Mutiayer Kerceptron Zotán Szabó, András Lőrincz Department of Information Systems, Facuty of Informatics Eötvös Loránd University Pázmány Péter sétány 1/C H-1117, Budapest, Hungary e-mai: szzoi@csetehu,

More information

Available online at ScienceDirect. Procedia Computer Science 96 (2016 )

Available online at  ScienceDirect. Procedia Computer Science 96 (2016 ) Avaiabe onine at www.sciencedirect.com ScienceDirect Procedia Computer Science 96 (206 92 99 20th Internationa Conference on Knowedge Based and Inteigent Information and Engineering Systems Connected categorica

More information

Radar/ESM Tracking of Constant Velocity Target : Comparison of Batch (MLE) and EKF Performance

Radar/ESM Tracking of Constant Velocity Target : Comparison of Batch (MLE) and EKF Performance adar/ racing of Constant Veocity arget : Comparison of Batch (LE) and EKF Performance I. Leibowicz homson-csf Deteis/IISA La cef de Saint-Pierre 1 Bd Jean ouin 7885 Eancourt Cede France Isabee.Leibowicz

More information

Nonlinear Analysis of Spatial Trusses

Nonlinear Analysis of Spatial Trusses Noninear Anaysis of Spatia Trusses João Barrigó October 14 Abstract The present work addresses the noninear behavior of space trusses A formuation for geometrica noninear anaysis is presented, which incudes

More information

Asymptotic Properties of a Generalized Cross Entropy Optimization Algorithm

Asymptotic Properties of a Generalized Cross Entropy Optimization Algorithm 1 Asymptotic Properties of a Generaized Cross Entropy Optimization Agorithm Zijun Wu, Michae Koonko, Institute for Appied Stochastics and Operations Research, Caustha Technica University Abstract The discrete

More information

1. Introduction. Keywords: artificial neural networks, model selection, resampling, data snooping.

1. Introduction. Keywords: artificial neural networks, model selection, resampling, data snooping. Resamping techniques and neura networks: some recent deveopments for mode seection Tecniche di ricampionamento e reti neurai: acuni recenti sviuppi per a seezione de modeo Michee La Rocca and Cira Perna

More information

ASummaryofGaussianProcesses Coryn A.L. Bailer-Jones

ASummaryofGaussianProcesses Coryn A.L. Bailer-Jones ASummaryofGaussianProcesses Coryn A.L. Baier-Jones Cavendish Laboratory University of Cambridge caj@mrao.cam.ac.uk Introduction A genera prediction probem can be posed as foows. We consider that the variabe

More information

Stochastic Automata Networks (SAN) - Modelling. and Evaluation. Paulo Fernandes 1. Brigitte Plateau 2. May 29, 1997

Stochastic Automata Networks (SAN) - Modelling. and Evaluation. Paulo Fernandes 1. Brigitte Plateau 2. May 29, 1997 Stochastic utomata etworks (S) - Modeing and Evauation Pauo Fernandes rigitte Pateau 2 May 29, 997 Institut ationa Poytechnique de Grenobe { IPG Ecoe ationae Superieure d'informatique et de Mathematiques

More information

On generalized quantum Turing machine and its language classes

On generalized quantum Turing machine and its language classes Proceedings of the 11th WSEAS Internationa onference on APPLIED MATHEMATIS, Daas, Texas, USA, March -4, 007 51 On generaized quantum Turing machine and its anguage casses SATOSHI IRIYAMA Toyo University

More information

Data Mining Technology for Failure Prognostic of Avionics

Data Mining Technology for Failure Prognostic of Avionics IEEE Transactions on Aerospace and Eectronic Systems. Voume 38, #, pp.388-403, 00. Data Mining Technoogy for Faiure Prognostic of Avionics V.A. Skormin, Binghamton University, Binghamton, NY, 1390, USA

More information

TELECOMMUNICATION DATA FORECASTING BASED ON ARIMA MODEL

TELECOMMUNICATION DATA FORECASTING BASED ON ARIMA MODEL TEECOMMUNICATION DATA FORECASTING BASED ON ARIMA MODE Anjuman Akbar Muani 1, Prof. Sachin Muraraka 2, Prof. K. Sujatha 3 1Student ME E&TC,Shree Ramchandra Coege of Engineering, onikand,pune,maharashtra

More information

arxiv: v2 [stat.ml] 19 Oct 2016

arxiv: v2 [stat.ml] 19 Oct 2016 Sparse Quadratic Discriminant Anaysis and Community Bayes arxiv:1407.4543v2 [stat.ml] 19 Oct 2016 Ya Le Department of Statistics Stanford University ye@stanford.edu Abstract Trevor Hastie Department of

More information

On the evaluation of saving-consumption plans

On the evaluation of saving-consumption plans On the evauation of saving-consumption pans Steven Vanduffe Jan Dhaene Marc Goovaerts Juy 13, 2004 Abstract Knowedge of the distribution function of the stochasticay compounded vaue of a series of future

More information

Maintenance activities planning and grouping for complex structure systems

Maintenance activities planning and grouping for complex structure systems Maintenance activities panning and grouping for compex structure systems Hai Canh u, Phuc Do an, Anne Barros, Christophe Bérenguer To cite this version: Hai Canh u, Phuc Do an, Anne Barros, Christophe

More information

Componentwise Determination of the Interval Hull Solution for Linear Interval Parameter Systems

Componentwise Determination of the Interval Hull Solution for Linear Interval Parameter Systems Componentwise Determination of the Interva Hu Soution for Linear Interva Parameter Systems L. V. Koev Dept. of Theoretica Eectrotechnics, Facuty of Automatics, Technica University of Sofia, 1000 Sofia,

More information

Research Article Optimal Control of Probabilistic Logic Networks and Its Application to Real-Time Pricing of Electricity

Research Article Optimal Control of Probabilistic Logic Networks and Its Application to Real-Time Pricing of Electricity Hindawi Pubishing Corporation Mathematica Probems in Engineering Voume 05, Artice ID 950, 0 pages http://dxdoiorg/055/05/950 Research Artice Optima Contro of Probabiistic Logic Networks and Its Appication

More information

Learning Structural Changes of Gaussian Graphical Models in Controlled Experiments

Learning Structural Changes of Gaussian Graphical Models in Controlled Experiments Learning Structura Changes of Gaussian Graphica Modes in Controed Experiments Bai Zhang and Yue Wang Bradey Department of Eectrica and Computer Engineering Virginia Poytechnic Institute and State University

More information

Safety Evaluation Model of Chemical Logistics Park Operation Based on Back Propagation Neural Network

Safety Evaluation Model of Chemical Logistics Park Operation Based on Back Propagation Neural Network 1513 A pubication of CHEMICAL ENGINEERING TRANSACTIONS VOL. 6, 017 Guest Editors: Fei Song, Haibo Wang, Fang He Copyright 017, AIDIC Servizi S.r.. ISBN 978-88-95608-60-0; ISSN 83-916 The Itaian Association

More information

Lecture Note 3: Stationary Iterative Methods

Lecture Note 3: Stationary Iterative Methods MATH 5330: Computationa Methods of Linear Agebra Lecture Note 3: Stationary Iterative Methods Xianyi Zeng Department of Mathematica Sciences, UTEP Stationary Iterative Methods The Gaussian eimination (or

More information

General Certificate of Education Advanced Level Examination June 2010

General Certificate of Education Advanced Level Examination June 2010 Genera Certificate of Education Advanced Leve Examination June 2010 Human Bioogy HBI6T/Q10/task Unit 6T A2 Investigative Skis Assignment Task Sheet The effect of using one or two eyes on the perception

More information

Study on Fusion Algorithm of Multi-source Image Based on Sensor and Computer Image Processing Technology

Study on Fusion Algorithm of Multi-source Image Based on Sensor and Computer Image Processing Technology Sensors & Transducers 3 by IFSA http://www.sensorsporta.com Study on Fusion Agorithm of Muti-source Image Based on Sensor and Computer Image Processing Technoogy Yao NAN, Wang KAISHENG, 3 Yu JIN The Information

More information

<C 2 2. λ 2 l. λ 1 l 1 < C 1

<C 2 2. λ 2 l. λ 1 l 1 < C 1 Teecommunication Network Contro and Management (EE E694) Prof. A. A. Lazar Notes for the ecture of 7/Feb/95 by Huayan Wang (this document was ast LaT E X-ed on May 9,995) Queueing Primer for Muticass Optima

More information

Ensemble Online Clustering through Decentralized Observations

Ensemble Online Clustering through Decentralized Observations Ensembe Onine Custering through Decentraized Observations Dimitrios Katseis, Caroyn L. Bec and Mihaea van der Schaar Abstract We investigate the probem of onine earning for an ensembe of agents custering

More information

Parsimony Pressure Made Easy

Parsimony Pressure Made Easy Parsimony Pressure Made Easy Riccardo Poi Department of Computing and Eectronic Systems University of Essex, UK rpoi@essex.ac.uk Nichoas Freitag McPhee Division of Science and Mathematics University of

More information

A Comparison Study of the Test for Right Censored and Grouped Data

A Comparison Study of the Test for Right Censored and Grouped Data Communications for Statistica Appications and Methods 2015, Vo. 22, No. 4, 313 320 DOI: http://dx.doi.org/10.5351/csam.2015.22.4.313 Print ISSN 2287-7843 / Onine ISSN 2383-4757 A Comparison Study of the

More information

Global sensitivity analysis using low-rank tensor approximations

Global sensitivity analysis using low-rank tensor approximations Goba sensitivity anaysis using ow-rank tensor approximations K. Konaki 1 and B. Sudret 1 1 Chair of Risk, Safety and Uncertainty Quantification, arxiv:1605.09009v1 [stat.co] 29 May 2016 ETH Zurich, Stefano-Franscini-Patz

More information

Research Article Analysis of Heart Transplant Survival Data Using Generalized Additive Models

Research Article Analysis of Heart Transplant Survival Data Using Generalized Additive Models Hindawi Pubishing Corporation Computationa and Mathematica Methods in Medicine Voume 2013, Artice ID 609857, 7 pages http://dx.doi.org/10.1155/2013/609857 Research Artice Anaysis of Heart Transpant Surviva

More information

Multiway Regularized Generalized Canonical Correlation Analysis

Multiway Regularized Generalized Canonical Correlation Analysis Mutiway Reguarized Generaized Canonica Correation Anaysis Arthur Tenenhaus, Laurent Le Brusquet, Gisea Lechuga To cite this version: Arthur Tenenhaus, Laurent Le Brusquet, Gisea Lechuga. Mutiway Reguarized

More information

Seasonal Time Series Data Forecasting by Using Neural Networks Multiscale Autoregressive Model

Seasonal Time Series Data Forecasting by Using Neural Networks Multiscale Autoregressive Model American ourna of Appied Sciences 7 (): 372-378, 2 ISSN 546-9239 2 Science Pubications Seasona Time Series Data Forecasting by Using Neura Networks Mutiscae Autoregressive Mode Suhartono, B.S.S. Uama and

More information

XSAT of linear CNF formulas

XSAT of linear CNF formulas XSAT of inear CN formuas Bernd R. Schuh Dr. Bernd Schuh, D-50968 Kön, Germany; bernd.schuh@netcoogne.de eywords: compexity, XSAT, exact inear formua, -reguarity, -uniformity, NPcompeteness Abstract. Open

More information

Algorithms to solve massively under-defined systems of multivariate quadratic equations

Algorithms to solve massively under-defined systems of multivariate quadratic equations Agorithms to sove massivey under-defined systems of mutivariate quadratic equations Yasufumi Hashimoto Abstract It is we known that the probem to sove a set of randomy chosen mutivariate quadratic equations

More information

MATH 172: MOTIVATION FOR FOURIER SERIES: SEPARATION OF VARIABLES

MATH 172: MOTIVATION FOR FOURIER SERIES: SEPARATION OF VARIABLES MATH 172: MOTIVATION FOR FOURIER SERIES: SEPARATION OF VARIABLES Separation of variabes is a method to sove certain PDEs which have a warped product structure. First, on R n, a inear PDE of order m is

More information

Model-based Clustering by Probabilistic Self-organizing Maps

Model-based Clustering by Probabilistic Self-organizing Maps EEE TRANATN N NERA NETRK, V. XX, N., ode-based ustering by robabiistic ef-organizing aps hih-ian heng, Hsin-hia u, ember, EEE, and Hsin-in ang, ember, EEE Abstract n this paper, we consider the earning

More information

Power Control and Transmission Scheduling for Network Utility Maximization in Wireless Networks

Power Control and Transmission Scheduling for Network Utility Maximization in Wireless Networks ower Contro and Transmission Scheduing for Network Utiity Maximization in Wireess Networks Min Cao, Vivek Raghunathan, Stephen Hany, Vinod Sharma and. R. Kumar Abstract We consider a joint power contro

More information

Probabilistic Assessment of Total Transfer Capability Using SQP and Weather Effects

Probabilistic Assessment of Total Transfer Capability Using SQP and Weather Effects J Eectr Eng Techno Vo. 9, No. 5: 52-526, 24 http://dx.doi.org/.537/jeet.24.9.5.52 ISSN(Print) 975-2 ISSN(Onine) 293-7423 Probabiistic Assessment of Tota Transfer Capabiity Using SQP and Weather Effects

More information

How the backpropagation algorithm works Srikumar Ramalingam School of Computing University of Utah

How the backpropagation algorithm works Srikumar Ramalingam School of Computing University of Utah How the backpropagation agorithm works Srikumar Ramaingam Schoo of Computing University of Utah Reference Most of the sides are taken from the second chapter of the onine book by Michae Nieson: neuranetworksanddeepearning.com

More information