Investigation of 3D copper grade changeability by Neural Networks in metasomatic ore deposit
STANISLAV TOPALOV 1, VESSELIN HRISTOV 2
1 Mine Surveying and Geodesy Dept., 2 Computer Science Dept.
University of Mining and Geology "St. Ivan Rilski", Sofia 1700, Studentski grad, BULGARIA
stopalov@gmail.com, veso@mgu.bg

Abstract: - In earlier work the following neural network (NN) types were examined: Radial Basis Function (RBF), Generalized Regression Neural Networks (GRNN), and two/three-layer Multilayer Perceptrons (MLP 2, MLP 3); MLP 2 and MLP 3 were determined to be suitable for copper grade prognostication on a single level of an ore deposit. In this paper, following the same approach, we attempt to determine the potential of the same NN types for copper grade prognostication at the deeper levels of the open pit mine. This approach of 3D training and testing of NN prognostication efficiency is realized with raw exploration data from a metasomatic ore deposit.

Key words: ore deposit, exploration, 3D (spatial) exploration data, open pit mining method, neural networks, prognosis.

1 Introduction
In this investigation the STATISTICA Release 7 software is used. It is an advanced, comprehensive and integrated data analysis package. The system includes not only general-purpose statistical and graphics procedures, but also universal implementations of specialized modules (e.g., for social scientists, biomedical researchers, or engineers). One of its modules is STATISTICA Neural Networks (SNN), a powerful and extremely fast neural network data analysis package. It allows selection among several neural network models and testing of their efficiency by varying their parameters. SNN consists of two main tools. The Intelligent Problem Solver (IPS) is a tool that guides the user through the selection process.
STATISTICA supports the most important classes of neural networks, including: Multilayer Perceptron, Radial Basis Function networks, Kohonen Self-Organizing Feature Maps, Probabilistic (Bayesian) Neural Networks, Generalized Regression Neural Networks, and Linear Modeling. The IPS can create networks using data whose cases are independent (standard networks) as well as networks that predict future observations based on previous observations of the same variable (time series networks). A significant amount of time during the design of a neural network is spent on the selection of appropriate variables and on optimizing the network architecture by heuristic search. The IPS makes it possible to train different types of NN automatically on the same data, which eases the choice of the best one (StatSoft, Inc.). The other tool is the Custom Network Designer (CND), which allows the user to construct an individual network architecture and to specify training algorithms. The best NN type determined by the IPS, with its parameters, can be used as the basis for network design in the CND. The training process can be observed in real time and improved. An approach using these two SNN tools for prognostication of geological parameter values within the limits of a single production level only (2D) is described in [5, 6].

2. Problem Formulation
The degree of ore body exploration and the geological complexity of the ore body are closely interrelated for well-known reasons. The geological complexity and the grade changeability determine the exploration methods [5]. Unfortunately, there is no normative standard regulating the geological complexity and changeability character of Bulgarian ore deposits. Such a standard would solve the problems associated with: the ratio between drilling and mining exploration activities, the geometry and density of the exploration grid, the sampling methods and techniques, etc. [1, 3].
The difficulty in assessing quantitatively the geological complexity and changeability character of an ore body (or its parts) arises from the hypothetical character of the input geological data. The methods of interpolation and extrapolation, limited by the location of the sampling points, are essential for the geological and mining solutions.
The Elatsite porphyry copper deposit is situated about km east of the city of Sofia and about 6 km south of the town of Etropole. The ore body of the deposit is an ore stock. A total of 61 minerals have been identified in the ore; 48 of them are ore minerals and the rest are non-metallic. They are classified as essential, accessory and rare. The main ore minerals are chalcopyrite, pyrite, bornite and molybdenite. The ore body has been prospected in detail with exploratory drill holes. Now, during the mining process, exploitation exploration is realized by systematic blast hole sampling. The samples are located in a grid of approximately m. The deposit is mined by opencast methods. Most of the extracting levels are already worked out, and all the information about the copper grade from these levels is available.

3. Problem Solution
The information from the exploitation exploration stage is organized in MS Excel format and contains the sample number, the X, Y and H coordinates (3D), the copper grade value, the sampling date, etc. for every extracting level. The sample coordinates X, Y and H are used as input variables (three input parameters). The copper grade is the required output variable of the neural network (one output parameter). All sampling data from four contiguous extracting levels are divided into two main groups. The first contains the information (2540 sample points) from three consecutive levels; these data serve for neural network training. The data (690 sample points) of the second group, from the fourth extracting level (the deepest in our case), serve for testing the already trained neural networks. The IPS tool is used at the first stage of the investigation. Neural networks of the Radial Basis Function (RBF), Generalized Regression Neural Network (GRNN) and Multilayer Perceptron (MLP, with two or three hidden layers) types have been trained and tested.
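The level-based train/test split described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the field names, coordinate values and level numbering are invented for the example.

```python
# Hypothetical sketch of the 3D train/test split: samples from the three
# upper extracting levels train the network, samples from the deepest
# level test it. All field names and values below are invented.

def split_by_level(samples, test_level):
    """samples: list of dicts with keys 'x', 'y', 'h', 'cu', 'level'."""
    train = [s for s in samples if s["level"] != test_level]
    test = [s for s in samples if s["level"] == test_level]

    def to_xy(rows):
        # inputs are the 3D coordinates; output is the copper grade
        return [[r["x"], r["y"], r["h"]] for r in rows], [r["cu"] for r in rows]

    return to_xy(train), to_xy(test)

samples = [
    {"x": 100.0, "y": 200.0, "h": 940.0, "cu": 0.41, "level": 1},
    {"x": 105.0, "y": 210.0, "h": 925.0, "cu": 0.38, "level": 2},
    {"x": 110.0, "y": 205.0, "h": 910.0, "cu": 0.35, "level": 3},
    {"x": 115.0, "y": 215.0, "h": 895.0, "cu": 0.33, "level": 4},  # deepest
]
(train_X, train_y), (test_X, test_y) = split_by_level(samples, test_level=4)
```

In the paper's data this split yields 2540 training points from the three upper levels and 690 test points from the deepest level.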
To avoid unnecessarily elaborate approximation, the number of nodes in the hidden layers is specified between five and ten [2, 4]. Several trained neural networks of different types and with different parameters resulted from the IPS runs. The best network is chosen by the criteria of lowest training error and lowest testing error. After several IPS runs, the best results were given by a Multilayer Perceptron with three hidden layers (MLP3): 10 nodes in the first and 7 in the second, trained for 100 epochs by the Back Propagation algorithm in the first phase, and for 20 and 26 epochs by the Conjugate Gradient Descent algorithm in the second and third phases respectively. The parameters of this neural network are given in Table 1. Its architecture is shown graphically in Figure 1.

[Table 1. IPS network tests: profile MLP 3, training BP100, CG20, CG26b, with train and test errors.]

[Fig. 1. The Multilayer Perceptron chosen by IPS as the best: inputs x, y, H; output Cu.]

The second investigation stage involves neural network construction using the STATISTICA Custom Network Designer tool. All initial characteristics correspond to those recommended by the IPS, i.e. analogous to those given in Figure 1. The chosen transfer function was logistic, with a normal distribution of the initial weight values and three training phases. The Back Propagation algorithm with 100 epochs was used in the first phase and the Conjugate Gradient Descent algorithm with 500 epochs in the second and third phases. As a result of the training process, the value of the mean square training error is 0.07; the mean square test error value is . The graph in Figure 2 presents the dynamics of the training error decrease during the training process: it drops rapidly in the first phase and then decreases smoothly. Table 2 gives the weights of the trained three-layer perceptron.

[Fig. 2. CND training process: training error versus epochs.]

[Table 2. The weights and thresholds of the nodes of the trained three-layer perceptron.]

Figure 3 shows the behavior of the NN performance on the test data. The similarity between the variation of Cu (raw data) and Cu (NN output) can be clearly noted. The correlation coefficients are estimated as a measure of the relation between the raw copper grade data and the neural network output (predicted data). The scatterplots with the linear relationship between all raw copper grade data, the train and test data, and the neural network predictions are given in Figure 4. The significance of the correlation coefficient in the training stage (0.81) denotes that the quality of training is quite good. The value of the correlation coefficient between the raw test data and the neural network predictions is lower (0.67) but significant too, which guarantees a reliable prognosis of the copper grade.

[Fig. 3. Behavior of the NN performance on the test data: Cu (raw data) versus Cu (NN output).]

[Fig. 4. Correlation scatterplots and coefficients: a) all Cu raw data versus all NN output, R = 0.79; b) train Cu raw data versus train NN output, R = 0.81; c) test Cu raw data versus test NN output, R = 0.62.]

4. Conclusion
Three types of neural network (MLP, RBF and GRNN) were tried using the IPS software tool of STATISTICA on real data from the exploration stage of a porphyry copper deposit. The information includes the coordinates (X, Y and H) of the prospect holes and the copper grade values for four consecutive levels. As the best neural network model, the IPS indicated a Multilayer Perceptron (MLP) with two hidden layers: 10 nodes in the first and 7 in the second. Using the STATISTICA Custom Network Designer tool with the initial characteristics recommended by the IPS, a prognostication model for the copper grade value in the deepest extracting level of the open pit mine was constructed. The reliability of the training process and of the testing is estimated by the correlation coefficients between the raw copper grade data and the neural network predictions. So far, neural networks have not been employed extensively in estimations and prognoses concerning mineral deposits. Besides, the unique character of the natural conditions, varying even within the same type of deposit, requires experimenting with different types of prognostication model. Narrowing the scope of the different types of neural network (even those with different parameters) was accomplished using the Intelligent Problem Solver module of the STATISTICA 7.0 software. The explorer can choose the best type of model from those suggested by the IPS, a model whose parameters are the basis for designing the actual prognostication model using the STATISTICA 7.0 Custom Network Designer module.
The module can trace the quality of training in the process of prognostication problem solving.
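The correlation check used above to judge training and test quality can be sketched in plain Python. This is a minimal stand-in for the STATISTICA output, not the authors' code; the grade values below are invented, not the paper's data.

```python
import math

def pearson_r(raw, predicted):
    """Pearson correlation coefficient between raw Cu grades and NN output."""
    n = len(raw)
    mr, mp = sum(raw) / n, sum(predicted) / n
    cov = sum((a - mr) * (b - mp) for a, b in zip(raw, predicted))
    sr = math.sqrt(sum((a - mr) ** 2 for a in raw))
    sp = math.sqrt(sum((b - mp) ** 2 for b in predicted))
    return cov / (sr * sp)

# hypothetical Cu grades and corresponding NN predictions
raw_cu = [0.41, 0.38, 0.35, 0.33, 0.45, 0.29]
nn_cu = [0.40, 0.36, 0.37, 0.31, 0.44, 0.30]
r = pearson_r(raw_cu, nn_cu)
```

A coefficient near 1 (such as the 0.81 obtained in the paper's training stage) indicates that the network reproduces the grade variation well; lower but still significant values (the 0.67 on the test level) support the reliability of the prognosis.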
References:
[1] Denby B., C. C. H. Burnet, GEMNet - Using Neural Networks to Approximate the Location-Grade Relationship in Mineral Deposits. AIMS Research Unit, Department of Mineral Resources Engineering, University of Nottingham, UK.
[2] Kapageridis I. K., B. Denby, G. Hunter, GEMNet II - A Neural Ore Grade Estimation System. In: 29th International Symposium on the Application of Computers and Operations Research in the Minerals Industries (APCOM '99), Denver, Colorado.
[3] Kapageridis I. K., B. Denby, D. Scofield, GEMNet II - An Alternative Method for Grade Estimation. uk.geocities.com/adaptive_geoservices/mpes2000_GEMNET.pdf
[4] Johansson E. M., F. U. Dowla, D. M. Goodman, "Backpropagation learning for multilayer feed-forward neural networks using the conjugate gradient method", International Journal of Neural Systems, Vol. 2, No. 4.
[5] Topalov S., K. Boev, I. Koshev, Experiment of Artificial Intelligence (Neural Network) Use for Geological Parameter Prognostication. X-th National Mine Surveying Conference, St. Konstantin and Helena, June 2003. (Топалов Ст., К. Боев, Ив. Кошев, Опит за използване на изкуствен интелект (невронни мрежи) за прогнозиране на изучаван геоложки показател. Х-та Национална Маркшайдерска конференция, Св. Конст. и Елена, юни 2003.)
[6] Topalov S., V. Hristov, Prognostication of Geological Parameters Value Using Neural Networks. 13th International Congress of the International Society for Mine Surveying, Budapest, Hungary, September 2007 (No 48).
More informationMachine Learning of Environmental Spatial Data Mikhail Kanevski 1, Alexei Pozdnoukhov 2, Vasily Demyanov 3
1 3 4 5 6 7 8 9 10 11 1 13 14 15 16 17 18 19 0 1 3 4 5 6 7 8 9 30 31 3 33 International Environmental Modelling and Software Society (iemss) 01 International Congress on Environmental Modelling and Software
More informationKeywords- Source coding, Huffman encoding, Artificial neural network, Multilayer perceptron, Backpropagation algorithm
Volume 4, Issue 5, May 2014 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Huffman Encoding
More informationResearch Article NEURAL NETWORK TECHNIQUE IN DATA MINING FOR PREDICTION OF EARTH QUAKE K.Karthikeyan, Sayantani Basu
ISSN: 0975-766X CODEN: IJPTFI Available Online through Research Article www.ijptonline.com NEURAL NETWORK TECHNIQUE IN DATA MINING FOR PREDICTION OF EARTH QUAKE K.Karthikeyan, Sayantani Basu 1 Associate
More informationCOMP 551 Applied Machine Learning Lecture 14: Neural Networks
COMP 551 Applied Machine Learning Lecture 14: Neural Networks Instructor: Ryan Lowe (ryan.lowe@mail.mcgill.ca) Slides mostly by: Class web page: www.cs.mcgill.ca/~hvanho2/comp551 Unless otherwise noted,
More informationMachine Learning and Data Mining. Multi-layer Perceptrons & Neural Networks: Basics. Prof. Alexander Ihler
+ Machine Learning and Data Mining Multi-layer Perceptrons & Neural Networks: Basics Prof. Alexander Ihler Linear Classifiers (Perceptrons) Linear Classifiers a linear classifier is a mapping which partitions
More informationPattern Classification
Pattern Classification All materials in these slides were taen from Pattern Classification (2nd ed) by R. O. Duda,, P. E. Hart and D. G. Stor, John Wiley & Sons, 2000 with the permission of the authors
More informationMultitask Learning of Environmental Spatial Data
9th International Congress on Environmental Modelling and Software Brigham Young University BYU ScholarsArchive 6th International Congress on Environmental Modelling and Software - Leipzig, Germany - July
More informationMultilayer Neural Networks
Multilayer Neural Networks Multilayer Neural Networks Discriminant function flexibility NON-Linear But with sets of linear parameters at each layer Provably general function approximators for sufficient
More informationCSE446: Neural Networks Spring Many slides are adapted from Carlos Guestrin and Luke Zettlemoyer
CSE446: Neural Networks Spring 2017 Many slides are adapted from Carlos Guestrin and Luke Zettlemoyer Human Neurons Switching time ~ 0.001 second Number of neurons 10 10 Connections per neuron 10 4-5 Scene
More informationMachine Learning: Multi Layer Perceptrons
Machine Learning: Multi Layer Perceptrons Prof. Dr. Martin Riedmiller Albert-Ludwigs-University Freiburg AG Maschinelles Lernen Machine Learning: Multi Layer Perceptrons p.1/61 Outline multi layer perceptrons
More informationPREDICTION THE JOMINY CURVES BY MEANS OF NEURAL NETWORKS
Tomislav Filetin, Dubravko Majetić, Irena Žmak Faculty of Mechanical Engineering and Naval Architecture, University of Zagreb, Croatia PREDICTION THE JOMINY CURVES BY MEANS OF NEURAL NETWORKS ABSTRACT:
More informationComparison learning algorithms for artificial neural network model for flood forecasting, Chiang Mai, Thailand
22nd International Congress on Modelling and Simulation, Hobart, Tasmania, Australia, 3 to 8 December 2017 mssanz.org.au/modsim2017 Comparison learning algorithms for artificial neural network model for
More informationTameapa Regional Geology
Tameapa Project 1 Tameapa Regional Geology History San Francisco Mines of Mexico Ltd. (San Francisco), which completed an exploration program, including six drill holes (1,157 m) between 1956 and 1959.
More informationNeural networks. Chapter 19, Sections 1 5 1
Neural networks Chapter 19, Sections 1 5 Chapter 19, Sections 1 5 1 Outline Brains Neural networks Perceptrons Multilayer perceptrons Applications of neural networks Chapter 19, Sections 1 5 2 Brains 10
More informationARTIFICIAL INTELLIGENCE. Artificial Neural Networks
INFOB2KI 2017-2018 Utrecht University The Netherlands ARTIFICIAL INTELLIGENCE Artificial Neural Networks Lecturer: Silja Renooij These slides are part of the INFOB2KI Course Notes available from www.cs.uu.nl/docs/vakken/b2ki/schema.html
More informationLab 5: 16 th April Exercises on Neural Networks
Lab 5: 16 th April 01 Exercises on Neural Networks 1. What are the values of weights w 0, w 1, and w for the perceptron whose decision surface is illustrated in the figure? Assume the surface crosses the
More informationSerious limitations of (single-layer) perceptrons: Cannot learn non-linearly separable tasks. Cannot approximate (learn) non-linear functions
BACK-PROPAGATION NETWORKS Serious limitations of (single-layer) perceptrons: Cannot learn non-linearly separable tasks Cannot approximate (learn) non-linear functions Difficult (if not impossible) to design
More informationLecture 5: Logistic Regression. Neural Networks
Lecture 5: Logistic Regression. Neural Networks Logistic regression Comparison with generative models Feed-forward neural networks Backpropagation Tricks for training neural networks COMP-652, Lecture
More informationWeather based forecasting model for crops yield using neural network approach
Statistics and Applications Volume 9, Nos. &2, 20 (New Series), pp. 55-69 Weather based forecasting model for crops yield using neural network approach Ratna Ra Laxmi and Amrender Kumar 2 Reader, Department
More informationJournal of of Computer Applications Research Research and Development and Development (JCARD), ISSN (Print), ISSN
JCARD Journal of of Computer Applications Research Research and Development and Development (JCARD), ISSN 2248-9304(Print), ISSN 2248-9312 (JCARD),(Online) ISSN 2248-9304(Print), Volume 1, Number ISSN
More informationMultilayer Perceptrons (MLPs)
CSE 5526: Introduction to Neural Networks Multilayer Perceptrons (MLPs) 1 Motivation Multilayer networks are more powerful than singlelayer nets Example: XOR problem x 2 1 AND x o x 1 x 2 +1-1 o x x 1-1
More informationepochs epochs
Neural Network Experiments To illustrate practical techniques, I chose to use the glass dataset. This dataset has 214 examples and 6 classes. Here are 4 examples from the original dataset. The last values
More informationNeural Networks. Chapter 18, Section 7. TB Artificial Intelligence. Slides from AIMA 1/ 21
Neural Networks Chapter 8, Section 7 TB Artificial Intelligence Slides from AIMA http://aima.cs.berkeley.edu / 2 Outline Brains Neural networks Perceptrons Multilayer perceptrons Applications of neural
More informationIncreased geometallurgical performance in industrial mineral operations through multivariate analysis of MWD-data
ISSN 1893-1170 (online utgave) ISSN 1893-1057 (trykt utgave) www.mineralproduksjon.no Note Increased geometallurgical performance in industrial mineral operations through multivariate analysis of MWD-data
More information