The Importance of Scaling in Data Mining for Toxicity Prediction
J. Chem. Inf. Comput. Sci. 2002, 42, 1250

The Importance of Scaling in Data Mining for Toxicity Prediction

Paolo Mazzatorta and Emilio Benfenati,* Istituto di Ricerche Farmacologiche Mario Negri Milano, Via Eritrea 62, Milano, Italy
Daniel Neagu and Giuseppina Gini, Dipartimento di Elettronica e Informazione, Politecnico di Milano, Piazza L. da Vinci 32, Milano, Italy

Received March 26, 2002

While mining a data set of 554 chemicals to extract information on their toxicity values, we faced the problem of scaling all the data. There are many different approaches to this procedure, and in most cases the choice greatly influences the results. The aim of this paper is 2-fold. First, we propose a universal scaling procedure for acute toxicity in fish according to Directive 92/32/EEC. Second, we examine how expert preprocessing of the data affects the performance of a quantitative structure-activity relationship (QSAR) approach to toxicity prediction.

INTRODUCTION

Preprocessing is the first step in predictive data mining (i.e., data mining with the goal of constructing a predictive model). It comprises several operations on the data set before a statistical model is developed. Preprocessing can be based on a priori knowledge about the data, on assumptions underlying the statistical model, or on practical experience. Such manipulations can take many forms, reflecting different modeling objectives and levels of knowledge about the properties of the data. 9,14,24 In modeling, a proper choice of transformation of the response variable brings a number of benefits. Notably, it may (i) simplify the response function by linearizing a nonlinear response-factor relationship; (ii) stabilize the variance; and (iii) make the distribution more normal. Many quantitative structure-activity relationship (QSAR) methods require scaling of the original data to extract significant, useful information and to remove unimportant or uninteresting features.
We used a QSAR approach based on artificial neural networks (ANN) to predict acute toxicity, i.e., the lethal concentration for 50% of the test animals (LC 50 ), for the fathead minnow (Pimephales promelas). This approach employs a powerful pattern recognition paradigm, able to analyze various types of data, 1,6,15 but all the data must be scaled between zero and one. Scaling the descriptors is a very delicate procedure because for most of them we do not know the underlying relationship between the descriptor and the toxicity, 3,4 and therefore cannot foresee the influence of these manipulations. We therefore used range scaling, in order to conserve the original distribution. For the LC 50 we relied on EU Directive 92/32/EEC, Annex VI, point 5.1, 10 which classifies chemicals as shown in Table 1.

* Corresponding author phone: ; fax: ; benfenati@marionegri.it.

Table 1. EC Classification for Fish (Directive 92/32/EEC Annex VI Point 5.1)

class   LC 50          dangerous for the environment
I       <1 mg/l        very toxic to aquatic organisms
II      1-10 mg/l      toxic to aquatic organisms
III     10-100 mg/l    harmful to aquatic organisms
IV      >100 mg/l      may cause long-term adverse effects in the aquatic environment

The classification is clearly based on a logarithmic scale, and the directive thus gives a useful guideline for scaling acute toxicity. Furthermore, to be a useful general instrument, the scaling procedure must go beyond the limits of the data set mined. Because there is a natural lower limit, 0 mg/l, but not an upper one, the only solution was a function between 0 and 1 with an asymptote at 1. Thus every possible real value of the LC 50 is represented. The inevitable loss of precision for the highest values is acceptable because they fall in the least toxic class and because, according to the EU directive, less precision is required for high values.
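The class boundaries in Table 1 can be expressed as a simple lookup. This is an illustrative helper, not part of the paper's method; the function name and boundary ownership at the class limits are assumptions:

```python
def ec_class(lc50_mg_per_l: float) -> str:
    """Map a 96-h LC50 (mg/l) to its EC toxicity class per Directive 92/32/EEC."""
    if lc50_mg_per_l < 1:
        return "I"    # very toxic to aquatic organisms
    elif lc50_mg_per_l <= 10:
        return "II"   # toxic to aquatic organisms
    elif lc50_mg_per_l <= 100:
        return "III"  # harmful to aquatic organisms
    return "IV"       # may cause long-term adverse effects
```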
This paper presents a scaling procedure for LC 50 that takes account of the EU regulation and of the need for a universal approach useful for aquatic toxicity.

MATERIALS AND METHODS

Data Set. We mined a data set of 554 organic compounds commonly used in industrial processes. The U.S. Environmental Protection Agency 11-13,28 helped build up this data set, starting from a review of experimental data in the literature on acute toxicity (96-h LC 50 ) for the fathead minnow (Pimephales promelas). Close analysis of a large amount of experimental information led to the association of a mechanism of action (MOA) with each compound. The toxicological data in this set are among the largest available and very reliable, 28 and therefore the information extracted from them has general validity.

American Chemical Society. Published on Web 08/20/2002.
Table 2. Statistical Information about the Toxicity Values in the Data Set: maximum, minimum, range, standard deviation, variance, geometric mean, and arithmetic average of LC 50 (mg/l); numeric values not reproduced.

Table 3. Descriptors Selected for the Models

code   descriptor
QM1    total energy (kcal/mol)
QM3    heat of formation (kcal/mol)
QM6    LUMO (eV)
C9     relative number of N atoms
C24    relative number of single bonds
C35    molecular weight (amu)
T6     Kier & Hall index (order 0)
T22    average information content (order 1)
G2     moment of inertia B
G10    molecular volume
G12    molecular surface area
E13    TMSA total molecular surface area [Zefirov's PC]
E24    FPSA-2 fractional PPSA (PPSA-2/TMSA) [Zefirov's PC]
E28    PPSA-3 atomic charge weighted PPSA [Zefirov's PC]
E31    FPSA-3 fractional PPSA (PPSA-3/TMSA) [Zefirov's PC]
ph9    LogD pH9
LogP   LogP

The data set was randomly partitioned 70-30% into 388 training cases, used to develop the models, and 166 testing cases, used to evaluate the predictive ability of the models. Statistical information about the data set mined is summarized in Table 2.

Descriptors. The descriptors were calculated using different software: Hyperchem 5.0 (Hypercube Inc., Gainesville, FL, U.S.A.), CODESSA (SemiChem Inc., Shawnee, KS, U.S.A.), and Pallas 2.1 (CompuDrug, Budapest, Hungary). Of the hundreds of descriptors proposed by these programs, only 153 gave a nonconstant, nonmissing value for all the objects. The resulting set of descriptors can be split into six categories according to the classification in the software CODESSA: 20 constitutional (34), geometrical (14), topological (38), electrostatic (57), quantum-chemical (6), and physicochemical (4). To ensure a good model it is useful to select the variables that best describe the molecules. Some of these descriptors add no information and only increase the noise, making it more difficult to analyze the results.
Furthermore, with a limited number of variables the risk of overfitting is reduced. 17,29 Descriptor selection was performed through principal components analysis (PCA) with the principal components eigenvalue (Scree) plot method. The descriptors with the highest loadings on the first four principal components were chosen, and the pool was then reduced by eliminating those most closely correlated. A final criterion was to keep a pool of descriptors representing the different aspects of the molecule (physicochemical, electronic, topological, etc.). The selected descriptors are listed in Table 3.

NIKE. ANN are powerful tools, used to develop a wide range of real-world applications, especially where traditional solving methods fail. 2,5,25 They offer advantages such as the ability to learn from data, classification and generalization capabilities, computational speed once trained (with parallel processing), and noise tolerance. Their major shortcoming is low transparency. To develop the ANN models we used the hybrid intelligent system shell NIKE, 22,23 which automates the process from data representation of the toxicity measurements to the prediction of toxicity for a given new input. We used the first NIKE module, IKM-CNN (implicit knowledge module based on crisp neural networks), which models the data set as a multilayer perceptron (MLP). 27 In IKM-CNN the descriptors are the inputs of the neural nets, which are MISO (multi-input, single-output) structures whose output is the normalized toxicity value (LC 50 ). The nets are trained on the specific data and tuned by adjusting the number of hidden-layer neurons and the momentum term.

Scaling. We scaled all descriptors between zero and one, using range scaling.
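The IKM-CNN setup described above, a single-output MLP trained by back-propagation with a momentum term, can be sketched in plain NumPy. This is not NIKE itself; the data are synthetic, and the layer sizes, learning rate, and weight initialization are assumptions (the momentum of 0.9 and the 5000-epoch limit follow the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid = 17, 25                     # 17 descriptors, 25 hidden neurons
X = rng.uniform(size=(100, n_in))        # descriptors scaled to [0, 1] (synthetic)
y = rng.uniform(size=(100, 1))           # toxicity scaled to [0, 1] (synthetic)

W1 = rng.normal(0, 0.1, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.1, (n_hid, 1));    b2 = np.zeros(1)
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)
lr, momentum = 0.1, 0.9                  # lr is an assumption; momentum per the paper

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):                # trained up to 5000 epochs
    h = sigmoid(X @ W1 + b1)             # hidden layer
    out = sigmoid(h @ W2 + b2)           # single output (normalized LC50)
    err = out - y
    d2 = err * out * (1 - out)           # back-propagated deltas
    d1 = (d2 @ W2.T) * h * (1 - h)
    # momentum update: v <- momentum * v - lr * grad; w <- w + v
    vW2 = momentum * vW2 - lr * (h.T @ d2) / len(X); W2 += vW2
    vb2 = momentum * vb2 - lr * d2.mean(0);          b2 += vb2
    vW1 = momentum * vW1 - lr * (X.T @ d1) / len(X); W1 += vW1
    vb1 = momentum * vb1 - lr * d1.mean(0);          b1 += vb1

mse = float(np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2))
print(round(mse, 4))
```

The momentum term damps oscillations of the error function by carrying a fraction of the previous update into the current one, which is why small variations around 0.9 matter little.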
However, the toxicity was scaled with the following six procedures.

Range Scaling (RS):

y_i = (x_i - min(x)) / (max(x) - min(x))   (1)

where y_i is the scaled value, x_i the original value, and min(x) and max(x) the minimum and maximum of the collection of x objects. This is the most common approach because it is simple and preserves the linear distribution of the data.

Range-Logarithmic Scaling (RLS):

y_i = (log10(x_i) - min(log10(x))) / (max(log10(x)) - min(log10(x)))   (2)

Logarithmic transformation is generally used when the coefficient of variation is constant.

Range-Logarithmic Scaling Modified (RLSM):

y_i = (log10(x_i + 1) - min(log10(x + 1))) / (max(log10(x + 1)) - min(log10(x + 1)))   (3)

This is a modification of the previous transformation; it uses log10(x_i + 1) to overcome the limit of log10(x_i) when x_i = 0.

Tangent Hyperbolic Scaling (THS):

y_i = tanh(x_i)   (4)

This transformation is introduced to extrapolate the model beyond the data set limits, because it tends asymptotically to 1.

Tangent Hyperbolic-Logarithmic Scaling (THLS):

y_i = tanh(log10(x_i + 1))   (5)

This is intended to combine the variance stabilization and the generalization beyond the data set given by the previous transformations.

Tangent Hyperbolic-Logarithmic Scaling Modified (THLSM):

y_i = tanh(0.4903 log10(x_i + 1))   (6)
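The six procedures can be written down directly from eqs 1-6. A sketch with NumPy (not the paper's code; the sample LC 50 values, one per EC class, are illustrative):

```python
import numpy as np

def rs(x):                      # (1) range scaling
    return (x - x.min()) / (x.max() - x.min())

def rls(x):                     # (2) range-logarithmic scaling (requires x > 0)
    lx = np.log10(x)
    return (lx - lx.min()) / (lx.max() - lx.min())

def rlsm(x):                    # (3) modified: log10(x + 1) handles x = 0
    lx = np.log10(x + 1)
    return (lx - lx.min()) / (lx.max() - lx.min())

def ths(x):                     # (4) tends asymptotically to 1, so no upper limit
    return np.tanh(x)

def thls(x):                    # (5) logarithm first, then tanh
    return np.tanh(np.log10(x + 1))

def thlsm(x):                   # (6) coefficient fitted to the EC class limits
    return np.tanh(0.4903 * np.log10(x + 1))

lc50 = np.array([0.5, 5.0, 50.0, 500.0])   # one value per EC class, in mg/l
print(np.round(thlsm(lc50), 3))
```

Note that only the tanh-based forms (4)-(6) are defined without reference to the data set's min and max, which is what makes them generalizable beyond the mined data.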
This is a modification of the previous algorithm, intended to achieve the best fit to the EC classification.

Figure 1. Performance validation of the 554 chemicals for: (a) range scaling; (b) range-logarithmic scaling; (c) range-logarithmic scaling modified; (d) tangent hyperbolic scaling; (e) tangent hyperbolic-logarithmic scaling; and (f) tangent hyperbolic-logarithmic scaling modified. On the x-axes are the real values of the scaled LC 50 and on the y-axes the predicted values.

RESULTS AND DISCUSSION

We used a fully connected three-layered crisp neural network with 25 hidden neurons, according to the general rules on the maximum and minimum numbers of hidden neurons. 16,18,21 The back-propagation algorithm with a momentum term of 0.9 was used for training (a high momentum term prevents excessive oscillations of the error function, 27 and small variations around 0.9 did not produce large differences in our case). The networks were trained up to 5000 epochs. Figure 1 shows the predictive ability of the models developed with the six different scaling procedures. The preprocessing operations have a very strong influence. Besides affecting the real performance of the model, uncritical scaling makes it extremely difficult to extract information from the data set. Table 4 shows statistical indices of the models developed. R 2 is the variance explained by the model and represents its ability to correlate the descriptors with the experimental response. Mean squared error (MSE) gives
a general indication of the performance of the model on the data set mined.

Figure 2. Scaling algorithms in the whole range (a) and in the significant interval [0-150 mg/l] (b). The black dots in (b) are landmarks for the ideal transformation. On the x-axes are the original values and on the y-axes the scaled values.

Figure 3. Percentage distribution of the total number of compounds (554) within the EC toxicological classes for the original data set (first column) and in four classes of the same width after each scaling procedure.

Table 4. Statistical Indices (R 2 and MSE) of the Models Developed with RS, RLS, RLSM, THS, THLS, and THLSM; numeric values not reproduced.

RS (Figure 1a) is not able to manage the high variance of the data. Because very few data have high values, it concentrates most of the other data in a small interval and loses important information about them. Worse, it loses information about the most toxic class of compounds, which is, of course, the most important. RLS (Figure 1b) is a good example of preprocessing. The objects are well distributed over the whole interval and the model takes advantage of this. The weakness of this transformation lies in the limits which restrict its application to the data set considered: it needs the min and max values to be computed, so it is confined between them. RLSM (Figure 1c) gives a better distribution over the whole interval, which results in a better ability of the model to describe the relationship between the inputs and the output (R 2 ), but a worse mean error (Table 4). To overcome the problem of the limits of the data set we used a new approach. THS (Figure 1d) responds to our requirement for a generalizable manipulation but has a strong negative effect on the data distribution, with obvious consequences for the model performance; indeed, once again most of the data are compressed into one area.
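The two indices reported in Table 4 can be computed as follows; this is a generic sketch of the standard definitions, not the authors' code:

```python
import numpy as np

def r2(y_true, y_pred):
    """Explained variance: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def mse(y_true, y_pred):
    """Mean squared error of the (scaled) predictions."""
    return np.mean((y_true - y_pred) ** 2)
```

Because both indices are computed on the scaled toxicity, the MSE values of models built with different scalings are not directly comparable in mg/l terms, which is one reason the distribution plots in Figures 2 and 3 matter.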
Using logarithmic transformation first, to keep to the EU guidelines, and then using tangent hyperbolic scaling in order to make it generalizable, we tried to overcome both these
hurdles. There is a useful improvement, but it is not yet the best distribution (Figure 1e).

The idea of modifying the previous transformation comes directly from the EU classification. We had a relatively good solution (THLS) which only needed to be fitted to the ideal distribution given by the directive. We used a nonlinear curve-fitting solver in the least-squares sense (lsqcurvefit). 19 That is, given input data xdata and the observed output ydata, it finds the coefficients x that best fit the equation F(x, xdata):

min_x (1/2) ||F(x, xdata) - ydata||_2^2 = (1/2) sum_i (F(x, xdata_i) - ydata_i)^2   (7)

Lsqcurvefit uses the large-scale algorithm, a subspace trust-region method based on the interior-reflective Newton method described in refs 7 and 8. Each iteration involves the approximate solution of a large linear system using the method of preconditioned conjugate gradients (PCG). Here xdata is the vector of the class limits given by the EC, ydata is the vector of the best ideal distribution, and F(x, xdata) is the vector-valued function:

xdata = [0; 1; 10; 100; inf]
ydata = [0; 0.25; 0.5; 0.75; 1]
F(x, xdata) = tanh(x_1 log10(xdata + 1) + x_2) + x_3   (8)

In this case the algorithm finds the coefficients x that best fit the equation THLSM.

CONCLUSIONS

The strong effect of scaling is easily seen in Figure 2a,b, which shows the distribution of the data after each transformation over the whole interval of the data set (Figure 2a) and over the significant interval [0-150 mg/l] alone, where all the most toxic classes are concentrated (Figure 2b). The ideal transformation is the one that scales the original toxicity classes into classes of the same width, so that each transformed class has the same accuracy and the same original variance. Figure 2 shows that RS does not describe the data set reliably.
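The paper's fit uses MATLAB's lsqcurvefit; the same least-squares problem from eqs 7 and 8 can be sketched with SciPy's least_squares (a substitution, not the original code). The infinite upper class limit is replaced by a large finite stand-in, and the starting point x0 is an assumption:

```python
import numpy as np
from scipy.optimize import least_squares

# EC class limits and the ideal equal-width targets from eq 7;
# 1e12 stands in for the infinite upper limit (an assumption of this sketch)
xdata = np.array([0.0, 1.0, 10.0, 100.0, 1e12])
ydata = np.array([0.0, 0.25, 0.5, 0.75, 1.0])

def F(p, x):
    # eq 8: F(x, xdata) = tanh(x1 * log10(xdata + 1) + x2) + x3
    return np.tanh(p[0] * np.log10(x + 1.0) + p[1]) + p[2]

# minimize (1/2) * sum of squared residuals, as in eq 7
fit = least_squares(lambda p: F(p, xdata) - ydata, x0=[0.5, 0.0, 0.0])
print(np.round(fit.x, 4))
```

With x_2 and x_3 near zero, the leading coefficient plays the role of the 0.4903 in eq 6; the exact value depends on how the fit is constrained, which the paper does not spell out.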
This transformation forces almost every object (99%) into a very small interval (0-0.25), losing important information. THS behaves in much the same way, but puts most of the data (87%) in the last interval [0.75-1]. RLS assigns too few elements (less than 2%) to the first class. RLSM and THLS are easily understood from Figure 2b, in the light of the previous reasoning. They give a better distribution than the previous transformations but are far from ideal. THLSM is the scaling that best fits the characteristics of the ideal transformation. Further considerations can be drawn from Figure 3, where the distribution of the objects into four classes of the same width after every transformation is compared with the original distribution of the data set according to the EU classification. Once more THLSM gives the best approximation. Although this is not the most important characteristic, RLS has quite a different distribution but still gives good models; this is an indicator of the reliability of this scaling technique.

ACKNOWLEDGMENT

This work is partially funded by the EU under contract HPRN-CT. The authors wish to express their appreciation to the reviewers and to thank Prof. A. R. Katritzky (Gainesville, FL) and Prof. M. Karelson (Tartu, Estonia) for the use of CODESSA.

REFERENCES AND NOTES

(1) Bate, A.; Lindquist, M.; Edwards, I. R.; Olsson, S.; Orre, R.; Lansner, A.; De Freitas, R. M. Eur. J. Clin. Pharmacol. 1998, 54, (2) Becraft, R.; Lee, P. L.; Newell, R. B. Integration of Neural Networks and Expert Systems for Process Fault Diagnosis. In Proceedings of the International Joint Conference on Artificial Intelligence; 1991; pp (3) Benfenati, E.; Grasso, P.; Pelagatti, S.; Gini, G. On Variables and Variability in Predictive Toxicology; IV Girona Seminar on Molecular Similarity, 5-7 July 1999, Girona, Spain. (4) Benfenati, E.; Piclin, N.; Roncaglioni, A.; Varì, M. R. Factors influencing predictive models for toxicology. SAR QSAR Environ. Res.
2001, 12, (5) Bishop, C. M. Neural Networks for Pattern Recognition; Clarendon Press: Oxford, (6) Chastrette, M.; Cretin, D.; Aidi, C. E. J. Chem. Inf. Comput. Sci. 1995, 36, (7) Coleman, T. F.; Li, Y. An Interior, Trust Region Approach for Nonlinear Minimization Subject to Bounds. SIAM J. Optimization 1996, 6, (8) Coleman, T. F.; Li, Y. On the Convergence of Reflective Newton Methods for Large-Scale Nonlinear Minimization Subject to Bounds. Mathematical Programming 1994, 67(2), (9) Cuesta Sánchez, F.; Lewi, P. J.; Massart, D. L. Effect of different preprocessing methods for principal component analysis applied to the composition of mixtures: detection of impurities in HPLC-DAD. Chemometrics Intelligent Lab. Systems 1994, 25, (10) Directive 92/32/EEC, the seventh amendment to Directive 67/548/EEC, OJL 154 of 5.VI.92; 1992; p 1. chemical.htm. (11) ECOTOX, ECOTOXicology Database System, Code List; prepared for U.S. Environmental Protection Agency, Office of Research, Laboratory Mid-Continent Division (MED), Duluth, MN, by OAO Corporation, Duluth, MN, February (12) ECOTOX, ECOTOXicology Database System, Data Field Definition; prepared for U.S. Environmental Protection Agency, Office of Research, Laboratory Mid-Continent Division (MED), Duluth, MN, by OAO Corporation, Duluth, MN, February (13) ECOTOX, ECOTOXicology Database System, User Guide; prepared for U.S. Environmental Protection Agency, Office of Research, Laboratory Mid-Continent Division (MED), Duluth, MN, by OAO Corporation, Duluth, MN, February (14) Famili, A.; Wei-Min, S.; Weber, R.; Simoudis, E. Data Pre-processing and Intelligent Data Analysis. Intelligent Data Analysis 1997, 1, (15) Gini, G.; Lorenzini, M.; Benfenati, E.; Grasso, P.; Bruschi, M. Predictive Carcinogenicity: A Model for Aromatic Compounds, with Nitrogen-Containing Substituents, Based on Molecular Descriptors Using an Artificial Neural Network. J. Chem. Inf. Comput. Sci. 1999, 39, (16) Gori, M.; Scarselli, F.
Are multilayer perceptrons adequate for pattern recognition and verification? IEEE Trans. Pattern Anal. Machine Intelligence 1998, 20/11, (17) Hasegawa, K.; Kimura, T.; Funatsu, K. GA strategy for variable selection in QSAR studies: application of GA-based region selection to a 3D-QSAR study of acetylcholinesterase inhibitors. J. Chem. Inf. Comput. Sci. 1999, 39, (18) Hecht-Nielsen, R. Neurocomputing; Addison-Wesley Pub. Co.: (19) lsqcurvefit.shtml. (20) Katritzky, A. R.; Lobanov, V. S.; Karelson, M. CODESSA: Comprehensive Descriptors for Structural and Statistical Analysis; Reference Manual, version 2.0, Gainesville, FL, (21) Kurková, V. Kolmogorov's theorem is relevant. Neural Computation 1991, 2/4, (22) Neagu, C.-D.; Palade, V. Neural Explicit and Implicit Knowledge Representation. In Proceedings of the Fourth International Conference on Knowledge-Based Intelligent Engineering Systems & Allied Technologies KES2000, 30th & 31st Aug, 1st Sept 2000; University of Brighton, Sussex, U.K., 2000; pp (23) Neagu, C.-D.; Palade, V. Modular neuro-fuzzy networks used in explicit and implicit knowledge integration. The 15th International
FLAIRS-02 Conference, Special Track on Integrated Intelligent Systems; AAAI Press: Pensacola, FL, 2002; in press. (24) de Noord, O. E. The influence of data preprocessing on the robustness and parsimony of multivariate calibration models. Chemometrics Intelligent Lab. Systems 1994, 23, (25) Pal, S. K.; Mitra, A. Multilayer Perceptron, Fuzzy Sets, and Classification. IEEE Trans. Neural Networks 1992, 3(5), (26) Piclin, N.; Pintore, M.; Ros, F.; Benfenati, E.; Chrétien, J. R. Data Base Mining and Prediction of Pesticide Toxicity. Presented at Chimiométrie 2000, 6-7 December, 2000, Paris, France. (27) Rumelhart, D. E.; McClelland, J. L. Parallel Distributed Processing: Explorations in the Microstructure of Cognition; MIT Press: (28) Russom, C. L.; Bradbury, S. P.; Broderius, S. J.; Hammermeister, D. E.; Drummond, S. J. Predicting modes of toxic action from chemical structure: acute toxicity in the fathead minnow (Pimephales promelas). Environ. Toxicol. Chem. 1997, 16, (29) Xu, L.; Zhang, W.-J. Comparison of different methods for variable selection. Anal. Chim. Acta 2001, 446,

CI025520N
More informationArtificial Neural Network Method of Rock Mass Blastability Classification
Artificial Neural Network Method of Rock Mass Blastability Classification Jiang Han, Xu Weiya, Xie Shouyi Research Institute of Geotechnical Engineering, Hohai University, Nanjing, Jiangshu, P.R.China
More informationModeling High-Dimensional Discrete Data with Multi-Layer Neural Networks
Modeling High-Dimensional Discrete Data with Multi-Layer Neural Networks Yoshua Bengio Dept. IRO Université de Montréal Montreal, Qc, Canada, H3C 3J7 bengioy@iro.umontreal.ca Samy Bengio IDIAP CP 592,
More informationEstimation of Inelastic Response Spectra Using Artificial Neural Networks
Estimation of Inelastic Response Spectra Using Artificial Neural Networks J. Bojórquez & S.E. Ruiz Universidad Nacional Autónoma de México, México E. Bojórquez Universidad Autónoma de Sinaloa, México SUMMARY:
More informationCorrelation of liquid viscosity with molecular structure for organic compounds using different variable selection methods
Issue in Honor of Professor Dionis Sunko ARKIVOC 2002 (IV) 45-59 Correlation of liquid viscosity with molecular structure for organic compounds using different variable selection methods Bono Lu i, 1*
More informationCLOUD NOWCASTING: MOTION ANALYSIS OF ALL-SKY IMAGES USING VELOCITY FIELDS
CLOUD NOWCASTING: MOTION ANALYSIS OF ALL-SKY IMAGES USING VELOCITY FIELDS Yézer González 1, César López 1, Emilio Cuevas 2 1 Sieltec Canarias S.L. (Santa Cruz de Tenerife, Canary Islands, Spain. Tel. +34-922356013,
More informationSubcellular Localisation of Proteins in Living Cells Using a Genetic Algorithm and an Incremental Neural Network
Subcellular Localisation of Proteins in Living Cells Using a Genetic Algorithm and an Incremental Neural Network Marko Tscherepanow and Franz Kummert Applied Computer Science, Faculty of Technology, Bielefeld
More informationQMRF identifier (ECB Inventory):To be entered by JRC QMRF Title: Nonlinear ANN QSAR model for Chronic toxicity LOAEL (rat) Printing Date:Oct 19, 2010
QMRF identifier (ECB Inventory): QMRF Title: Nonlinear ANN QSAR model for Chronic toxicity LOAEL (rat) Printing Date:Oct 19, 2010 1.QSAR identifier 1.1.QSAR identifier (title): Nonlinear ANN QSAR model
More information1.QSAR identifier 1.1.QSAR identifier (title): QSARINS model for aquatic toxicity of organic chemicals in Pimephales
QMRF identifier (ECB Inventory): QMRF Title: QSARINS model for aquatic toxicity of organic chemicals in Pimephales promelas (Fathead minnow) Printing Date:5-ott-2015 1.QSAR identifier 1.1.QSAR identifier
More informationCS:4420 Artificial Intelligence
CS:4420 Artificial Intelligence Spring 2018 Neural Networks Cesare Tinelli The University of Iowa Copyright 2004 18, Cesare Tinelli and Stuart Russell a a These notes were originally developed by Stuart
More informationWeight Initialization Methods for Multilayer Feedforward. 1
Weight Initialization Methods for Multilayer Feedforward. 1 Mercedes Fernández-Redondo - Carlos Hernández-Espinosa. Universidad Jaume I, Campus de Riu Sec, Edificio TI, Departamento de Informática, 12080
More informationQSAR/QSPR modeling. Quantitative Structure-Activity Relationships Quantitative Structure-Property-Relationships
Quantitative Structure-Activity Relationships Quantitative Structure-Property-Relationships QSAR/QSPR modeling Alexandre Varnek Faculté de Chimie, ULP, Strasbourg, FRANCE QSAR/QSPR models Development Validation
More informationA Novel Activity Detection Method
A Novel Activity Detection Method Gismy George P.G. Student, Department of ECE, Ilahia College of,muvattupuzha, Kerala, India ABSTRACT: This paper presents an approach for activity state recognition of
More informationUser manual Strategies for grouping chemicals to fill data gaps to assess acute aquatic toxicity endpoints
User manual Strategies for grouping chemicals to fill data gaps to assess acute aquatic For the latest news and the most up-todate information, please consult the ECHA website. Document history Version
More informationCSE 352 (AI) LECTURE NOTES Professor Anita Wasilewska. NEURAL NETWORKS Learning
CSE 352 (AI) LECTURE NOTES Professor Anita Wasilewska NEURAL NETWORKS Learning Neural Networks Classifier Short Presentation INPUT: classification data, i.e. it contains an classification (class) attribute.
More information1.3.Software coding the model: QSARModel Molcode Ltd., Turu 2, Tartu, 51014, Estonia
QMRF identifier (JRC Inventory): QMRF Title: QSAR model for Bioaccumulation, BAF smelt Printing Date:16.12.2010 1.QSAR identifier 1.1.QSAR identifier (title): QSAR model for Bioaccumulation, BAF smelt
More information1.3.Software coding the model: QSARModel Molcode Ltd., Turu 2, Tartu, 51014, Estonia
QMRF identifier (ECB Inventory):Q8-10-14-176 QMRF Title: QSAR for acute oral toxicity (in vitro) Printing Date:Mar 29, 2011 1.QSAR identifier 1.1.QSAR identifier (title): QSAR for acute oral toxicity (in
More informationIntroduction to Neural Networks
CUONG TUAN NGUYEN SEIJI HOTTA MASAKI NAKAGAWA Tokyo University of Agriculture and Technology Copyright by Nguyen, Hotta and Nakagawa 1 Pattern classification Which category of an input? Example: Character
More informationQuantitative Structure-Activity Relationship (QSAR) computational-drug-design.html
Quantitative Structure-Activity Relationship (QSAR) http://www.biophys.mpg.de/en/theoretical-biophysics/ computational-drug-design.html 07.11.2017 Ahmad Reza Mehdipour 07.11.2017 Course Outline 1. 1.Ligand-
More informationQuantitative Structure Activity Relationships: An overview
Quantitative Structure Activity Relationships: An overview Prachi Pradeep Oak Ridge Institute for Science and Education Research Participant National Center for Computational Toxicology U.S. Environmental
More informationI D I A P. Online Policy Adaptation for Ensemble Classifiers R E S E A R C H R E P O R T. Samy Bengio b. Christos Dimitrakakis a IDIAP RR 03-69
R E S E A R C H R E P O R T Online Policy Adaptation for Ensemble Classifiers Christos Dimitrakakis a IDIAP RR 03-69 Samy Bengio b I D I A P December 2003 D a l l e M o l l e I n s t i t u t e for Perceptual
More informationArtificial Neural Network
Artificial Neural Network Contents 2 What is ANN? Biological Neuron Structure of Neuron Types of Neuron Models of Neuron Analogy with human NN Perceptron OCR Multilayer Neural Network Back propagation
More informationActive Sonar Target Classification Using Classifier Ensembles
International Journal of Engineering Research and Technology. ISSN 0974-3154 Volume 11, Number 12 (2018), pp. 2125-2133 International Research Publication House http://www.irphouse.com Active Sonar Target
More informationComparative Application of Radial Basis Function and Multilayer Perceptron Neural Networks to Predict Traffic Noise Pollution in Tehran Roads
Journal of Ecological Engineering Received: 2017.10.02 Accepted: 2017.10.28 Published: 2018.01.01 Volume 19, Issue 1, January 2018, pages 113 121 https://doi.org/10.12911/22998993/79411 Comparative Application
More informationEntropy Manipulation of Arbitrary Non I inear Map pings
Entropy Manipulation of Arbitrary Non I inear Map pings John W. Fisher I11 JosC C. Principe Computational NeuroEngineering Laboratory EB, #33, PO Box 116130 University of Floridaa Gainesville, FL 326 1
More informationRegulatory use of (Q)SARs under REACH
Regulatory use of (Q)SARs under REACH Webinar on Information requirements 10 December 2009 http://echa.europa.eu 1 Using (Q)SAR models Application under REACH to fulfill information requirements Use of
More informationPart 8: Neural Networks
METU Informatics Institute Min720 Pattern Classification ith Bio-Medical Applications Part 8: Neural Netors - INTRODUCTION: BIOLOGICAL VS. ARTIFICIAL Biological Neural Netors A Neuron: - A nerve cell as
More informationArtificial Neural Network Based Approach for Design of RCC Columns
Artificial Neural Network Based Approach for Design of RCC Columns Dr T illai, ember I Karthekeyan, Non-member Recent developments in artificial neural network have opened up new possibilities in the field
More informationLecture 4: Perceptrons and Multilayer Perceptrons
Lecture 4: Perceptrons and Multilayer Perceptrons Cognitive Systems II - Machine Learning SS 2005 Part I: Basic Approaches of Concept Learning Perceptrons, Artificial Neuronal Networks Lecture 4: Perceptrons
More informationDesign Collocation Neural Network to Solve Singular Perturbed Problems with Initial Conditions
Article International Journal of Modern Engineering Sciences, 204, 3(): 29-38 International Journal of Modern Engineering Sciences Journal homepage:www.modernscientificpress.com/journals/ijmes.aspx ISSN:
More informationCombination of M-Estimators and Neural Network Model to Analyze Inside/Outside Bark Tree Diameters
Combination of M-Estimators and Neural Network Model to Analyze Inside/Outside Bark Tree Diameters Kyriaki Kitikidou, Elias Milios, Lazaros Iliadis, and Minas Kaymakis Democritus University of Thrace,
More informationLecture 4: Feed Forward Neural Networks
Lecture 4: Feed Forward Neural Networks Dr. Roman V Belavkin Middlesex University BIS4435 Biological neurons and the brain A Model of A Single Neuron Neurons as data-driven models Neural Networks Training
More informationMultilayer Perceptron
Outline Hong Chang Institute of Computing Technology, Chinese Academy of Sciences Machine Learning Methods (Fall 2012) Outline Outline I 1 Introduction 2 Single Perceptron 3 Boolean Function Learning 4
More informationIterative face image feature extraction with Generalized Hebbian Algorithm and a Sanger-like BCM rule
Iterative face image feature extraction with Generalized Hebbian Algorithm and a Sanger-like BCM rule Clayton Aldern (Clayton_Aldern@brown.edu) Tyler Benster (Tyler_Benster@brown.edu) Carl Olsson (Carl_Olsson@brown.edu)
More informationNumerical Interpolation of Aircraft Aerodynamic Data through Regression Approach on MLP Network
, pp.12-17 http://dx.doi.org/10.14257/astl.2018.151.03 Numerical Interpolation of Aircraft Aerodynamic Data through Regression Approach on MLP Network Myeong-Jae Jo 1, In-Kyum Kim 1, Won-Hyuck Choi 2 and
More informationSerious limitations of (single-layer) perceptrons: Cannot learn non-linearly separable tasks. Cannot approximate (learn) non-linear functions
BACK-PROPAGATION NETWORKS Serious limitations of (single-layer) perceptrons: Cannot learn non-linearly separable tasks Cannot approximate (learn) non-linear functions Difficult (if not impossible) to design
More informationPATTERN CLASSIFICATION
PATTERN CLASSIFICATION Second Edition Richard O. Duda Peter E. Hart David G. Stork A Wiley-lnterscience Publication JOHN WILEY & SONS, INC. New York Chichester Weinheim Brisbane Singapore Toronto CONTENTS
More informationA Simple Algorithm for Learning Stable Machines
A Simple Algorithm for Learning Stable Machines Savina Andonova and Andre Elisseeff and Theodoros Evgeniou and Massimiliano ontil Abstract. We present an algorithm for learning stable machines which is
More informationPATTERN RECOGNITION FOR PARTIAL DISCHARGE DIAGNOSIS OF POWER TRANSFORMER
PATTERN RECOGNITION FOR PARTIAL DISCHARGE DIAGNOSIS OF POWER TRANSFORMER PO-HUNG CHEN 1, HUNG-CHENG CHEN 2, AN LIU 3, LI-MING CHEN 1 1 Department of Electrical Engineering, St. John s University, Taipei,
More informationOn the complexity of shallow and deep neural network classifiers
On the complexity of shallow and deep neural network classifiers Monica Bianchini and Franco Scarselli Department of Information Engineering and Mathematics University of Siena Via Roma 56, I-53100, Siena,
More information22c145-Fall 01: Neural Networks. Neural Networks. Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1
Neural Networks Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1 Brains as Computational Devices Brains advantages with respect to digital computers: Massively parallel Fault-tolerant Reliable
More informationData Mining Part 5. Prediction
Data Mining Part 5. Prediction 5.5. Spring 2010 Instructor: Dr. Masoud Yaghini Outline How the Brain Works Artificial Neural Networks Simple Computing Elements Feed-Forward Networks Perceptrons (Single-layer,
More informationArtificial Neural Networks. Edward Gatt
Artificial Neural Networks Edward Gatt What are Neural Networks? Models of the brain and nervous system Highly parallel Process information much more like the brain than a serial computer Learning Very
More informationNeural Networks and the Back-propagation Algorithm
Neural Networks and the Back-propagation Algorithm Francisco S. Melo In these notes, we provide a brief overview of the main concepts concerning neural networks and the back-propagation algorithm. We closely
More informationRole of Assembling Invariant Moments and SVM in Fingerprint Recognition
56 Role of Assembling Invariant Moments SVM in Fingerprint Recognition 1 Supriya Wable, 2 Chaitali Laulkar 1, 2 Department of Computer Engineering, University of Pune Sinhgad College of Engineering, Pune-411
More informationARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD
ARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD WHAT IS A NEURAL NETWORK? The simplest definition of a neural network, more properly referred to as an 'artificial' neural network (ANN), is provided
More informationMulti-Layer Boosting for Pattern Recognition
Multi-Layer Boosting for Pattern Recognition François Fleuret IDIAP Research Institute, Centre du Parc, P.O. Box 592 1920 Martigny, Switzerland fleuret@idiap.ch Abstract We extend the standard boosting
More informationSupervised (BPL) verses Hybrid (RBF) Learning. By: Shahed Shahir
Supervised (BPL) verses Hybrid (RBF) Learning By: Shahed Shahir 1 Outline I. Introduction II. Supervised Learning III. Hybrid Learning IV. BPL Verses RBF V. Supervised verses Hybrid learning VI. Conclusion
More informationResearch Article Stacked Heterogeneous Neural Networks for Time Series Forecasting
Hindawi Publishing Corporation Mathematical Problems in Engineering Volume 21, Article ID 373648, 2 pages doi:1.1155/21/373648 Research Article Stacked Heterogeneous Neural Networks for Time Series Forecasting
More informationKATE2017 on NET beta version https://kate2.nies.go.jp/nies/ Operating manual
KATE2017 on NET beta version http://kate.nies.go.jp https://kate2.nies.go.jp/nies/ Operating manual 2018.03.29 KATE2017 on NET was developed to predict the following ecotoxicity values: 50% effective concentration
More informationA New Hybrid System for Recognition of Handwritten-Script
computing@tanet.edu.te.ua www.tanet.edu.te.ua/computing ISSN 177-69 A New Hybrid System for Recognition of Handwritten-Script Khalid Saeed 1) and Marek Tabdzki ) Faculty of Computer Science, Bialystok
More informationPortugaliae Electrochimica Acta 26/4 (2008)
Portugaliae Electrochimica Acta 6/4 (008) 6-68 PORTUGALIAE ELECTROCHIMICA ACTA Comparison of Regression Model and Artificial Neural Network Model for the Prediction of Volume Percent of Diamond Deposition
More informationMulti-Layer Perceptrons for Functional Data Analysis: a Projection Based Approach 0
Multi-Layer Perceptrons for Functional Data Analysis: a Projection Based Approach 0 Brieuc Conan-Guez 1 and Fabrice Rossi 23 1 INRIA, Domaine de Voluceau, Rocquencourt, B.P. 105 78153 Le Chesnay Cedex,
More informationGas Detection System Based on Multi-Sensor Fusion with BP Neural Network
Sensors & Transducers 2013 by IFSA http://www.sensorsportal.com Gas Detection System Based on Multi-Sensor Fusion with BP Neural Network Qiu-Xia LIU Department of Physics, Heze University, Heze Shandong
More informationModified Learning for Discrete Multi-Valued Neuron
Proceedings of International Joint Conference on Neural Networks, Dallas, Texas, USA, August 4-9, 2013 Modified Learning for Discrete Multi-Valued Neuron Jin-Ping Chen, Shin-Fu Wu, and Shie-Jue Lee Department
More informationCOMP-4360 Machine Learning Neural Networks
COMP-4360 Machine Learning Neural Networks Jacky Baltes Autonomous Agents Lab University of Manitoba Winnipeg, Canada R3T 2N2 Email: jacky@cs.umanitoba.ca WWW: http://www.cs.umanitoba.ca/~jacky http://aalab.cs.umanitoba.ca
More informationA recursive algorithm based on the extended Kalman filter for the training of feedforward neural models. Isabelle Rivals and Léon Personnaz
In Neurocomputing 2(-3): 279-294 (998). A recursive algorithm based on the extended Kalman filter for the training of feedforward neural models Isabelle Rivals and Léon Personnaz Laboratoire d'électronique,
More informationDreem Challenge report (team Bussanati)
Wavelet course, MVA 04-05 Simon Bussy, simon.bussy@gmail.com Antoine Recanati, arecanat@ens-cachan.fr Dreem Challenge report (team Bussanati) Description and specifics of the challenge We worked on the
More informationArtifical Neural Networks
Neural Networks Artifical Neural Networks Neural Networks Biological Neural Networks.................................. Artificial Neural Networks................................... 3 ANN Structure...........................................
More informationQMRF identifier (JRC Inventory): QMRF Title: QSAR model for acute oral toxicity-in vitro method (IC50) Printing Date:
QMRF identifier (JRC Inventory): QMRF Title: QSAR model for acute oral toxicityin vitro method (IC50) Printing Date:21.01.2010 1.QSAR identifier 1.1.QSAR identifier (title): QSAR model for acute oral toxicityin
More information