Prediction of p-wave velocity and anisotropic property of rock using artificial neural network technique

Journal of Scientific & Industrial Research, Vol. 63, January 2004

Prediction of p-wave velocity and anisotropic property of rock using artificial neural network technique

T N Singh*, R Kanchan**, K Saigal** and A K Verma**
*Department of Earth Science, Indian Institute of Technology, Powai, Bombay

Received: 30 June 2003; accepted: 03 November 2003

Physico-mechanical properties of rocks are significant in all operational parts of mining activity, from exploration to the final dispatch of material. Compressional wave velocity (p-wave velocity) and the anisotropic behaviour of rocks are two such properties, important for understanding the response of rock under stress conditions. They also influence the breakage mechanism of rock. There are known methods to determine p-wave velocity and anisotropy, both in situ and in the laboratory, but these methods are cumbersome and time consuming. In the present investigation, the artificial neural network (ANN) technique is used for the prediction of p-wave velocity and anisotropy, taking chemical composition and other physico-mechanical properties of rocks as input parameters. Because the number of data sets is limited, the cross-validation technique termed leave-one-out is used. A network with six input neurons, one hidden layer with five neurons, and two output neurons is designed for sandstone; a similar network with four hidden neurons is designed for marble. To deal with the problem of overfitting of data, Bayesian regularization is used, and the network is trained for 1500 epochs. The coefficients of correlation between the predicted and observed values are high and encouraging, and the mean absolute percentage errors (MAPE) obtained are very low.

Keywords: p-wave velocity, Anisotropic property, Rocks, Artificial neural network technique

Introduction

The physico-mechanical properties of rock are important ingredients for the planning and design of mining and civil excavations. The anisotropy present in the rock mass adversely influences the strength of the rock mass.
Long-term stability can only be achieved once the compressional wave velocity and the anisotropic behaviour of the rock mass are completely understood. The compressional wave velocity depends on the density, chemical composition, and hardness of the rock material, and it is difficult to determine both in the field and in the laboratory. The anisotropic behaviour of rock controls the bearing capacity of the rock mass. When field and laboratory geophysical data are considered together with geological and geochemical data, it is possible to obtain much information about the rocks and minerals that are likely to occur in the different deeper layers of the earth. Hence, it is necessary to study and understand the physical properties of the rocks and minerals of which the layers of the earth are composed.

** Author for correspondence: Department of Mining Engineering, Institute of Technology, Banaras Hindu University, Varanasi

Laboratory measurements of the elastic constants, or of the elastic wave velocities in rocks, are needed for the interpretation of seismic velocities. Numerous measurements of elastic wave velocities have been made in laboratories around the world1, and measurements of wave velocities in rocks, as well as in many other materials, are available in the literature. The relation of the seismic velocities of rocks in the western region of central Asia to density and other physical parameters is discussed by Yudborovsky and Vilenskaya2. The advantages and disadvantages of the static and dynamic methods have been discussed by Volarovich and Fan3, who suggested the use of field seismic observations and the data obtained by static methods in geodynamics. Aveline et al.4 found anisotropy in granite from the Sidobre massif. Studies, both in the laboratory and in the field, revealed some anisotropy, which is said to be due to microfissuration. These studies also revealed a lower velocity in weathered granite, as compared to fresh granite.
Berezkin and Mikhaylov5 found a linear correlation between the density and the elastic wave velocities of rocks in the central and eastern regions of the Russian platform.

Artificial neural network (ANN) is a simulation of the biological brain. A neural network has the ability to learn from patterns presented to it earlier. Once the network has been trained with a sufficient number of sample data sets, it can make predictions, on the basis of its previous learning, about the output for a new input data set of similar pattern. Due to its multidisciplinary nature, ANN is becoming popular among researchers, planners, and designers as an effective tool for the accomplishment of their work, and it has been used successfully in industry as well. Maulenkamp and Grima6 developed a model by which uniaxial compressive strength can be predicted from Equotip hardness. They also found that the network's predictions of the uniaxial compressive strength of a rock sample are more accurate than statistical predictions, as indicated by the consistency of the correlation coefficient across the different test sets. Yang and Zhang7 analysed the results of point load testing with an artificial neural network. Cai and Zhao8 used ANN in tunnelling design, for the optimal selection of rock support measures to ensure the stability of the tunnel. These applications show that neural network models are superior for problems in which many parameters influence the process and results, where the process and results are not fully understood, and where historical or experimental data are available. Prediction of the physical properties of rock is of a similar nature, and such properties can be predicted more accurately with a neural network than with correlation methods. The present study aims at predicting p-wave velocity and anisotropy with an artificial neural network model of high prediction accuracy, taking physico-mechanical properties and chemical composition as inputs.
Data Set

As already stated, the present investigation aims at predicting the elastic properties of rocks (p-wave velocity and anisotropy), taking the physico-mechanical properties and chemical composition of the rocks as inputs. It is, however, uneconomical to obtain all possible parameters, because their determination is expensive and time consuming. On the other hand, some of the parameters are strongly correlated9. Hence, it is not necessary to use all the variables as input parameters. The following parameters have been taken as inputs for the networks:

Physico-mechanical properties: compressive strength, density, hardness
Chemical composition: for sandstone, quartz, feldspar, and iron-oxide; for marble, calcite, rock fragments, and iron-oxide

Thus, in all, six parameters have been taken as input parameters for both networks. The p-wave velocity and anisotropy have been taken as the output parameters. All input and output parameters were scaled between 0 and 1. This was done to utilize the most sensitive part of the neuron; moreover, since the sigmoid output neuron can only give an output between 0 and 1, scaling of the output parameters was necessary:

Scaled value = (max. value - unscaled value) / (max. value - min. value)

Artificial Neural Network

Neural networks are a branch of artificial intelligence, alongside case-based reasoning, expert systems, and genetic algorithms; classical statistics, fuzzy logic, and chaos theory are also considered related fields. An artificial neural network is a highly interconnected structure that consists of many simple processing elements (called neurons) capable of performing massively parallel computation for data processing and knowledge representation. The paradigms in this field are based on direct modelling of the human neuronal system10. A neural network can be considered a black box that is able to predict an output pattern when it recognizes a given input pattern.
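As a brief aside, the scaling rule given in the Data Set section can be sketched in a few lines; the function name and example values below are illustrative assumptions, not taken from the paper's data:

```python
# Scaling rule from the text: scaled = (max - x) / (max - min).
# Note it maps the largest raw value to 0 and the smallest to 1.
def scale(values):
    """Scale a list of raw parameter values into [0, 1]."""
    vmax, vmin = max(values), min(values)
    return [(vmax - v) / (vmax - vmin) for v in values]

densities = [2.1, 2.4, 2.7]  # example raw densities (g/cc), invented
print(scale(densities))      # smallest raw value maps to 1.0, largest to 0.0
```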
The neural network must first be trained by having it process a large number of input patterns and being shown what output results from each. Once trained, the network is able to recognize similarities when presented with a new input pattern and, as a result, to produce a predicted output pattern. Neural networks are able to detect similarities in inputs even though a particular input may never have been seen previously. This property allows for excellent interpolation capabilities, especially when the input data are noisy (not exact). Neural networks may be used as direct substitutes for auto-correlation, multi-variable regression, linear regression, trigonometric and other statistical analysis techniques.

When a data stream is analyzed using a neural network, it is possible to detect important predictive patterns that were not previously apparent to a non-expert; thus the neural network can act as an expert. A particular network can be defined using three fundamental components: transfer function, network architecture, and learning law11. One has to define these components depending upon the problem to be solved.

Training a Network with Back-Propagation

A network first needs to be trained before interpreting new information. Several different algorithms are available for training neural networks, but the back-propagation algorithm is the most versatile and robust technique, as it provides the most efficient learning procedure for multilayer neural networks. The fact that back-propagation algorithms are especially capable of solving prediction problems also makes them popular6. The feed-forward back-propagation neural network always consists of at least three layers: an input layer, a hidden layer, and an output layer. Each layer consists of many elementary processing units, the neurons, and each unit is connected to each unit in the next layer through weights; i.e., each neuron in the input layer sends its output as input to the neurons in the hidden layer, and the connection between the hidden and output layers is similar. The number of hidden layers, and of neurons in the hidden layer, changes according to the problem to be solved; the numbers of input and output neurons are the same as the numbers of input and output variables. To differentiate between the different processing units, values called biases are introduced in the transfer functions; these biases are referred to as the temperature of a neuron. Except for the input layer, all neurons in the back-propagation network are associated with a bias neuron and a transfer function.
The bias is much like a weight, except that it has a constant input of 1, while the transfer function filters the summed signals received from the neuron. Transfer functions are designed to map a neuron's or layer's net output to its actual output, and they are simple step functions, either linear or non-linear. The application of these transfer functions depends on the purpose of the neural network. The output layer produces the computed output vectors corresponding to the solution. During training of the network, data are processed through the network until they reach the output layer (forward pass). In this layer the output is compared to the measured values (the true output). The difference, or error, between the two is processed back through the network (backward pass), updating the individual weights of the connections and the biases of the individual neurons. The input and output data are mostly represented as vectors called training pairs. The process described above is repeated for all the training pairs in the data set, until the network error converges to a threshold minimum defined by a corresponding cost function, usually the root mean squared error (RMS) or the summed squared error (SSE).

In Figure 1, the j-th neuron is connected with a number of inputs

  X_i = (x_1, x_2, x_3, ..., x_n).

The net input to the j-th neuron in the hidden layer is

  Net_j = Σ_(i=1..n) x_i W_ij + θ_j,

where x_i = input units, W_ij = weight on the connection of the i-th input to the j-th neuron, θ_j = bias neuron (optional), and n = number of input units. The net output from the j-th hidden neuron is calculated using a logarithmic sigmoid function:

  O_j = f(Net_j) = 1 / (1 + e^(-Net_j)).

The total input to the l-th output unit is

  Net_l = Σ_(k=1..n) W_kl i_k + θ_l,

where θ_l = bias neuron and W_kl = weight between the k-th hidden neuron and the l-th output. The total output from the l-th unit is then

  O_l = f(Net_l).
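The forward pass described by these equations can be sketched as follows; the layer sizes, weights, and function names are illustrative assumptions, not the paper's trained network:

```python
import math

# Forward pass for one layer of sigmoid neurons:
# Net_j = sum_i x_i * W_ij + theta_j, then the logistic sigmoid.
def neuron_output(inputs, weights, bias):
    net = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-net))   # output always lies in (0, 1)

def layer_output(inputs, weight_matrix, biases):
    """Outputs of one fully connected layer (one weight row per neuron)."""
    return [neuron_output(inputs, w, b) for w, b in zip(weight_matrix, biases)]

x = [0.2, 0.8, 0.5]                        # a scaled input pattern (invented)
W = [[0.1, -0.4, 0.3], [0.7, 0.2, -0.1]]   # weights for two hidden neurons
theta = [0.0, 0.1]                         # biases
print(layer_output(x, W, theta))           # two activations in (0, 1)
```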
Figure 1 Back-propagation neural network

In the learning process the network is presented with a pair of patterns: an input pattern and a corresponding desired output pattern. The network computes its own output pattern using its (mostly incorrect) weights and thresholds, and the actual output is compared with the desired output. Hence, the error at the l-th output neuron is

  e_l = t_l - O_l,

where t_l is the desired output and O_l is the actual output. The total error function is given by

  E = 0.5 Σ_(l=1..n) (t_l - O_l)^2.

Training of the network is basically a process of arriving at an optimum weight space for the network. The descent down the error surface is made using the rule

  ΔW_jk = -η (∂E/∂W_jk),

where η is the learning rate parameter and E is the error function. The update of the weights for the (n+1)-th pattern is given as

  W_jk(n+1) = W_jk(n) + ΔW_jk(n).

Similar logic applies to the connections between the hidden and output layers. This procedure is repeated with each pattern pair assigned for training the network. Each pass through all the training patterns is called a cycle or epoch. The process is repeated for as many epochs as needed until the error comes within the user-specified goal. This quantity is a measure of how well the network has learned.

Network Architecture

A feed-forward network is adopted here, as this architecture is reported to be suitable for problems based on pattern identification. Pattern matching is basically an input/output mapping problem: the closer the mapping, the better the performance of the network. For a small sample such as the one analysed here, a cross-validation technique termed leave-one-out is more appropriate12. Out of 53 data sets, 52 were taken to train the network, which was then tested on the remaining data set. The procedure is repeated 53 times, leaving a different observation out each time.
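The leave-one-out procedure just described can be sketched as follows; `train_fn` and `predict_fn` are stand-in placeholders rather than the paper's ANN:

```python
# Leave-one-out cross-validation: train on all data sets but one, test on
# the held-out set, and repeat so that every set is held out exactly once.
def leave_one_out(data, train_fn, predict_fn):
    errors = []
    for i in range(len(data)):
        inputs, target = data[i]
        training = data[:i] + data[i + 1:]    # e.g. 52 of the 53 data sets
        model = train_fn(training)
        errors.append(abs(predict_fn(model, inputs) - target))
    return errors

# Toy demonstration with a mean predictor standing in for the network:
data = [(x, 2.0 * x) for x in range(5)]
mean_of_targets = lambda rows: sum(t for _, t in rows) / len(rows)
errors = leave_one_out(data, mean_of_targets, lambda m, x: m)
print(len(errors))   # one test error per held-out observation
```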
In the experiment, the first observation was predicted using the outcome of an analysis based on observations 2, 3, ..., 53, and the 53rd observation was predicted from observations 1, 2, 3, ..., 52. This method is advantageous as it uses nearly the entire data set for training the network. This cross-validation technique was used for both networks, the one for sandstone and the one for marble. For both networks, the input layer consists of 6 neurons and the output layer of 2 neurons. The number of hidden layers was decided by training the network and predicting the training and testing data while varying the number of hidden layers and of neurons in the hidden layer; a suitable configuration has to be chosen for the best performance of the network. Out of the different configurations tested, a single hidden layer with 5 hidden neurons produced the best result for sandstone, and a single hidden layer with 4 neurons for marble. Hence, the final configuration chosen for the sandstone network was 6 input neurons, 1 hidden layer with 5 hidden neurons, and 2 output neurons; the final configuration of the marble network was 6 input neurons, a single hidden layer with 4 neurons, and 2 output neurons. A suitable number of epochs has to be assigned to overcome the problems of overfitting and underfitting the data. In the present paper, Bayesian regularization13 was used to deal with this problem. Bayesian regularization is an automated regularization whose use removes the danger of overfitting, as it never lets the network overfit the data. This eliminates the guesswork required in determining the optimum number of epochs for the network.
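Putting the pieces together, a single-neuron version of the training loop (forward pass, squared error, delta-rule weight update, repeated over a fixed number of epochs with an error goal) might look like the sketch below; all names and values are assumptions, not the paper's implementation:

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

# One sigmoid neuron trained with the delta rule: Delta W = -eta * dE/dW,
# where E = 0.5 * (t - O)^2, so dE/dW_i = -(t - O) * O * (1 - O) * x_i.
def train(patterns, weights, bias, eta=0.5, epochs=1500, goal=1e-3):
    for _ in range(epochs):
        sse = 0.0
        for xs, target in patterns:
            out = sigmoid(sum(x * w for x, w in zip(xs, weights)) + bias)
            delta = (target - out) * out * (1.0 - out)
            weights = [w + eta * delta * x for w, x in zip(weights, xs)]
            bias += eta * delta
            sse += 0.5 * (target - out) ** 2
        if sse < goal:          # stop once the error goal is reached
            break
    return weights, bias

# Toy data: the target roughly tracks the first input.
patterns = [([0.0, 0.2], 0.1), ([1.0, 0.4], 0.9)]
w, b = train(patterns, [0.0, 0.0], 0.0)
```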
In the network, the learning rate assigned was 10^-2, the number of training epochs was 1500, and an error goal was set.

Results and Discussion

To demonstrate the performance of the network, the mean absolute percentage error (MAPE) and the coefficient of correlation between the predicted and observed values are taken as the performance measures. The prediction was based on the input data sets discussed earlier. For sandstone, the observed and predicted values of p-wave velocity and anisotropy, along with the percentage errors, are given in Table 1. Training for this network was done using 1 hidden layer with 5 neurons. As Bayesian regularization13 was used, there was no danger of overfitting, so the network was trained for 1500 epochs. The correlation coefficients between predicted and observed values are high for both anisotropy and p-wave velocity (Figure 2a-b).
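The two performance measures can be written out in a minimal form; the definitions below (MAPE as the mean of |observed - predicted|/|observed| x 100, and the Pearson correlation coefficient) are the standard ones, assumed to match the paper's usage, and the example values are invented:

```python
import math

def mape(observed, predicted):
    """Mean absolute percentage error over paired observations."""
    return sum(abs(o - p) / abs(o) for o, p in zip(observed, predicted)) * 100.0 / len(observed)

def pearson_r(xs, ys):
    """Pearson coefficient of correlation between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

obs = [3500.0, 4200.0, 3900.0]   # example p-wave velocities, not paper data
pred = [3450.0, 4260.0, 3880.0]
print(mape(obs, pred), pearson_r(obs, pred))
```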

Table 1 Observed and predicted values of p-wave velocity and anisotropy with percentage error for sandstone (columns: compressive strength (kgf/cm^2), hardness, quartz, feldspar, iron-oxide, density (g/cc), observed and predicted values, and percentage errors)

Figure 2a Correlation between predicted and observed values of anisotropy for sandstone

Figure 2b Correlation between predicted and observed values of p-wave velocity for sandstone

Figure 2c Performance graph of the neural network used for sandstone

The mean absolute percentage errors (MAPE) for these variables are low, with 0.115 for p-wave velocity. Figure 2c shows the performance of the network during the training process. The number of parameters on which the output depends is 18, which is more than the 6 input parameters given to this network; this shows that not all relevant parameters were included.

For marble, training was done using a single hidden layer with 4 hidden neurons. As Bayesian regularization13 was used in this network as well, it was trained for 1500 epochs. The observed and predicted values of p-wave velocity and anisotropy, along with the percentage errors, are given in Table 2. The correlation coefficients and training performance are shown in Figure 3a-c.

Figure 3a Correlation between predicted and observed values of anisotropy for marble

Figure 3b Correlation between predicted and observed values of p-wave velocity for marble

Figure 3c Performance graph of the neural network used for marble

The coefficients of correlation between predicted and observed values (0.687 for p-wave velocity) are not as high as those observed for sandstone, but they still show a good relationship between predicted and observed values. The MAPE for these variables are 4.98 and 0.44, respectively. It can be seen in Figure 3c that the network depends on 21 parameters; in the present case, only six important parameters were included. This may be the reason for the more moderate relation between predicted and observed values for marble. After considering the MAPE, the coefficients of correlation, and the number of parameters on which the output depends for both networks, it can be said that the predictions made for sandstone are closer to the observed values than those made for marble.

Conclusions

Using Bayesian regularization and an optimum number of neurons in the hidden layer, the mean absolute percentage errors (MAPE) obtained for anisotropy and p-wave velocity are very low: 0.115 for p-wave velocity in sandstone, and 4.98 and 0.44, respectively, for marble. The corresponding coefficients of correlation are high for sandstone and moderate for marble. Considering the complexity of the relationship between the inputs and outputs, the results obtained are encouraging. Since neural networks can learn new patterns that were not previously available in the training data sets, can update their knowledge over time as more training data sets are presented, and process information in a parallel way, they provide a greater degree of accuracy, robustness, and fault tolerance than other analysis techniques.

Table 2 Observed and predicted values of p-wave velocity and anisotropy with percentage error for marble (columns: compressive strength (kgf/cm^2), hardness, calcite, rock fragments, iron-oxide, density (g/cc), observed and predicted values, and percentage errors)

References
1 Balakrishna S & Ramana Y V, Integrated studies of the elastic properties of some Indian rocks, Geophysical Monograph 12, American Geophysical Union (1968).
2 Yudborovsky I & Vilenskaya S M, Some results of study of the elastic properties of rocks of western central Asia, Akad Nauk Turkmen SSR Ser Fiz-Tekh Khim Geol Nauk, 3 (1962).
3 Volarovich M P & Vey-tsin Fan, Investigation of the elastic properties of rocks by static and dynamic methods at high hydrostatic pressure, Akad Nauk SSSR Inst Fiz Zemli Tr, 23(190) (1962).
4 Aveline M, Experimental results on the relation between micro-fissuration and speed of propagation of ultrasounds in the granites of Sidobre, Sci Terre, 9(4) (1964).
5 Berezkin V M & Mikhaylov I N, On the correlational relationship between the density of rocks and their velocities of elastic wave propagation for the central and eastern regions of the Russian platform, Geofiz Razvedka, 16 (1964).
6 Maulenkamp F & Grima M A, Application of neural networks for the prediction of the unconfined compressive strength (UCS) from Equotip hardness, Int J Rock Mech Min Sci, 36 (1999).
7 Yang Y & Zhang Q, Analysis for the results of point load testing with artificial neural network, Comput Methods Adv Geomech, China (1997).
8 Cai J G & Zhao J, Use of neural networks in rock tunnelling, Proc Computer Methods and Advances in Geomechanics, China (1997).
9 Hogstrom K, A study on strength parameter for aggregate from south western Sweden rocks, Res Rep (Chalmers Univ of Technology, Goteborg, Sweden).
10 Kosko B, Neural Networks and Fuzzy Systems: A Dynamical Systems Approach to Machine Intelligence (Prentice Hall of India, New Delhi).
11 Simpson P K, Artificial Neural Systems: Foundations, Paradigms, Applications and Implementations (Pergamon Press, New York).
12 Weiss S M & Kapouleas I, An empirical comparison of pattern recognition, neural nets and machine learning classification methods, 11th Int Joint Conf on Artificial Intelligence, Kaufmann (1989).
13 MacKay D J C, Bayesian interpolation, Neural Computation, 4(3) (1992).


More information

An artificial neural networks (ANNs) model is a functional abstraction of the

An artificial neural networks (ANNs) model is a functional abstraction of the CHAPER 3 3. Introduction An artificial neural networs (ANNs) model is a functional abstraction of the biological neural structures of the central nervous system. hey are composed of many simple and highly

More information

Artifical Neural Networks

Artifical Neural Networks Neural Networks Artifical Neural Networks Neural Networks Biological Neural Networks.................................. Artificial Neural Networks................................... 3 ANN Structure...........................................

More information

Lecture 4: Perceptrons and Multilayer Perceptrons

Lecture 4: Perceptrons and Multilayer Perceptrons Lecture 4: Perceptrons and Multilayer Perceptrons Cognitive Systems II - Machine Learning SS 2005 Part I: Basic Approaches of Concept Learning Perceptrons, Artificial Neuronal Networks Lecture 4: Perceptrons

More information

Neural Nets in PR. Pattern Recognition XII. Michal Haindl. Outline. Neural Nets in PR 2

Neural Nets in PR. Pattern Recognition XII. Michal Haindl. Outline. Neural Nets in PR 2 Neural Nets in PR NM P F Outline Motivation: Pattern Recognition XII human brain study complex cognitive tasks Michal Haindl Faculty of Information Technology, KTI Czech Technical University in Prague

More information

CS:4420 Artificial Intelligence

CS:4420 Artificial Intelligence CS:4420 Artificial Intelligence Spring 2018 Neural Networks Cesare Tinelli The University of Iowa Copyright 2004 18, Cesare Tinelli and Stuart Russell a a These notes were originally developed by Stuart

More information

Computational statistics

Computational statistics Computational statistics Lecture 3: Neural networks Thierry Denœux 5 March, 2016 Neural networks A class of learning methods that was developed separately in different fields statistics and artificial

More information

Lecture 4: Feed Forward Neural Networks

Lecture 4: Feed Forward Neural Networks Lecture 4: Feed Forward Neural Networks Dr. Roman V Belavkin Middlesex University BIS4435 Biological neurons and the brain A Model of A Single Neuron Neurons as data-driven models Neural Networks Training

More information

Neural Networks with Applications to Vision and Language. Feedforward Networks. Marco Kuhlmann

Neural Networks with Applications to Vision and Language. Feedforward Networks. Marco Kuhlmann Neural Networks with Applications to Vision and Language Feedforward Networks Marco Kuhlmann Feedforward networks Linear separability x 2 x 2 0 1 0 1 0 0 x 1 1 0 x 1 linearly separable not linearly separable

More information

ARTIFICIAL INTELLIGENCE. Artificial Neural Networks

ARTIFICIAL INTELLIGENCE. Artificial Neural Networks INFOB2KI 2017-2018 Utrecht University The Netherlands ARTIFICIAL INTELLIGENCE Artificial Neural Networks Lecturer: Silja Renooij These slides are part of the INFOB2KI Course Notes available from www.cs.uu.nl/docs/vakken/b2ki/schema.html

More information

Investigation of complex modulus of asphalt mastic by artificial neural networks

Investigation of complex modulus of asphalt mastic by artificial neural networks Indian Journal of Engineering & Materials Sciences Vol. 1, August 014, pp. 445-450 Investigation of complex modulus of asphalt mastic by artificial neural networks Kezhen Yan* & Lingyun You College of

More information

Portugaliae Electrochimica Acta 26/4 (2008)

Portugaliae Electrochimica Acta 26/4 (2008) Portugaliae Electrochimica Acta 6/4 (008) 6-68 PORTUGALIAE ELECTROCHIMICA ACTA Comparison of Regression Model and Artificial Neural Network Model for the Prediction of Volume Percent of Diamond Deposition

More information

Application of Artificial Neural Network for Short Term Load Forecasting

Application of Artificial Neural Network for Short Term Load Forecasting aerd Scientific Journal of Impact Factor(SJIF): 3.134 e-issn(o): 2348-4470 p-issn(p): 2348-6406 International Journal of Advance Engineering and Research Development Volume 2,Issue 4, April -2015 Application

More information

2015 Todd Neller. A.I.M.A. text figures 1995 Prentice Hall. Used by permission. Neural Networks. Todd W. Neller

2015 Todd Neller. A.I.M.A. text figures 1995 Prentice Hall. Used by permission. Neural Networks. Todd W. Neller 2015 Todd Neller. A.I.M.A. text figures 1995 Prentice Hall. Used by permission. Neural Networks Todd W. Neller Machine Learning Learning is such an important part of what we consider "intelligence" that

More information

Civil and Environmental Research ISSN (Paper) ISSN (Online) Vol.8, No.1, 2016

Civil and Environmental Research ISSN (Paper) ISSN (Online) Vol.8, No.1, 2016 Developing Artificial Neural Network and Multiple Linear Regression Models to Predict the Ultimate Load Carrying Capacity of Reactive Powder Concrete Columns Prof. Dr. Mohammed Mansour Kadhum Eng.Ahmed

More information

International Journal of Advanced Research in Computer Science and Software Engineering

International Journal of Advanced Research in Computer Science and Software Engineering Volume 3, Issue 4, April 2013 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Application of

More information

22c145-Fall 01: Neural Networks. Neural Networks. Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1

22c145-Fall 01: Neural Networks. Neural Networks. Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1 Neural Networks Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1 Brains as Computational Devices Brains advantages with respect to digital computers: Massively parallel Fault-tolerant Reliable

More information

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others)

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others) Machine Learning Neural Networks (slides from Domingos, Pardo, others) Human Brain Neurons Input-Output Transformation Input Spikes Output Spike Spike (= a brief pulse) (Excitatory Post-Synaptic Potential)

More information

Short Term Load Forecasting Based Artificial Neural Network

Short Term Load Forecasting Based Artificial Neural Network Short Term Load Forecasting Based Artificial Neural Network Dr. Adel M. Dakhil Department of Electrical Engineering Misan University Iraq- Misan Dr.adelmanaa@gmail.com Abstract Present study develops short

More information

This is the published version.

This is the published version. Li, A.J., Khoo, S.Y., Wang, Y. and Lyamin, A.V. 2014, Application of neural network to rock slope stability assessments. In Hicks, Michael A., Brinkgreve, Ronald B.J.. and Rohe, Alexander. (eds), Numerical

More information

Neural networks. Chapter 19, Sections 1 5 1

Neural networks. Chapter 19, Sections 1 5 1 Neural networks Chapter 19, Sections 1 5 Chapter 19, Sections 1 5 1 Outline Brains Neural networks Perceptrons Multilayer perceptrons Applications of neural networks Chapter 19, Sections 1 5 2 Brains 10

More information

A Wavelet Neural Network Forecasting Model Based On ARIMA

A Wavelet Neural Network Forecasting Model Based On ARIMA A Wavelet Neural Network Forecasting Model Based On ARIMA Wang Bin*, Hao Wen-ning, Chen Gang, He Deng-chao, Feng Bo PLA University of Science &Technology Nanjing 210007, China e-mail:lgdwangbin@163.com

More information

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others)

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others) Machine Learning Neural Networks (slides from Domingos, Pardo, others) For this week, Reading Chapter 4: Neural Networks (Mitchell, 1997) See Canvas For subsequent weeks: Scaling Learning Algorithms toward

More information

Evaluation of the relationships between Schmidt rebound number and strength of rocks

Evaluation of the relationships between Schmidt rebound number and strength of rocks Evaluation of the relationships between Schmidt rebound number and strength of rocks H. Suha Aksoy Assistant Professor Firat University Engineering Faculty Civil Engineering Dept. saksoy@firat.edu.tr Ufuk

More information

Neural Networks biological neuron artificial neuron 1

Neural Networks biological neuron artificial neuron 1 Neural Networks biological neuron artificial neuron 1 A two-layer neural network Output layer (activation represents classification) Weighted connections Hidden layer ( internal representation ) Input

More information

Introduction to Natural Computation. Lecture 9. Multilayer Perceptrons and Backpropagation. Peter Lewis

Introduction to Natural Computation. Lecture 9. Multilayer Perceptrons and Backpropagation. Peter Lewis Introduction to Natural Computation Lecture 9 Multilayer Perceptrons and Backpropagation Peter Lewis 1 / 25 Overview of the Lecture Why multilayer perceptrons? Some applications of multilayer perceptrons.

More information

ESTIMATION OF HOURLY MEAN AMBIENT TEMPERATURES WITH ARTIFICIAL NEURAL NETWORKS 1. INTRODUCTION

ESTIMATION OF HOURLY MEAN AMBIENT TEMPERATURES WITH ARTIFICIAL NEURAL NETWORKS 1. INTRODUCTION Mathematical and Computational Applications, Vol. 11, No. 3, pp. 215-224, 2006. Association for Scientific Research ESTIMATION OF HOURLY MEAN AMBIENT TEMPERATURES WITH ARTIFICIAL NEURAL NETWORKS Ömer Altan

More information

Rainfall Prediction using Back-Propagation Feed Forward Network

Rainfall Prediction using Back-Propagation Feed Forward Network Rainfall Prediction using Back-Propagation Feed Forward Network Ankit Chaturvedi Department of CSE DITMR (Faridabad) MDU Rohtak (hry). ABSTRACT Back propagation is most widely used in neural network projects

More information

ECE521 Lectures 9 Fully Connected Neural Networks

ECE521 Lectures 9 Fully Connected Neural Networks ECE521 Lectures 9 Fully Connected Neural Networks Outline Multi-class classification Learning multi-layer neural networks 2 Measuring distance in probability space We learnt that the squared L2 distance

More information

Neural Networks Introduction

Neural Networks Introduction Neural Networks Introduction H.A Talebi Farzaneh Abdollahi Department of Electrical Engineering Amirkabir University of Technology Winter 2011 H. A. Talebi, Farzaneh Abdollahi Neural Networks 1/22 Biological

More information

Neuro -Finite Element Static Analysis of Structures by Assembling Elemental Neuro -Modelers

Neuro -Finite Element Static Analysis of Structures by Assembling Elemental Neuro -Modelers Neuro -Finite Element Static Analysis of Structures by Assembling Elemental Neuro -Modelers Abdolreza Joghataie Associate Prof., Civil Engineering Department, Sharif University of Technology, Tehran, Iran.

More information

Mining Classification Knowledge

Mining Classification Knowledge Mining Classification Knowledge Remarks on NonSymbolic Methods JERZY STEFANOWSKI Institute of Computing Sciences, Poznań University of Technology SE lecture revision 2013 Outline 1. Bayesian classification

More information

Feedforward Neural Nets and Backpropagation

Feedforward Neural Nets and Backpropagation Feedforward Neural Nets and Backpropagation Julie Nutini University of British Columbia MLRG September 28 th, 2016 1 / 23 Supervised Learning Roadmap Supervised Learning: Assume that we are given the features

More information

MODELING OF A HOT AIR DRYING PROCESS BY USING ARTIFICIAL NEURAL NETWORK METHOD

MODELING OF A HOT AIR DRYING PROCESS BY USING ARTIFICIAL NEURAL NETWORK METHOD MODELING OF A HOT AIR DRYING PROCESS BY USING ARTIFICIAL NEURAL NETWORK METHOD Ahmet DURAK +, Ugur AKYOL ++ + NAMIK KEMAL UNIVERSITY, Hayrabolu, Tekirdag, Turkey. + NAMIK KEMAL UNIVERSITY, Çorlu, Tekirdag,

More information

Lab 5: 16 th April Exercises on Neural Networks

Lab 5: 16 th April Exercises on Neural Networks Lab 5: 16 th April 01 Exercises on Neural Networks 1. What are the values of weights w 0, w 1, and w for the perceptron whose decision surface is illustrated in the figure? Assume the surface crosses the

More information

Temperature Prediction based on Artificial Neural Network and its Impact on Rice Production, Case Study: Bangladesh

Temperature Prediction based on Artificial Neural Network and its Impact on Rice Production, Case Study: Bangladesh erature Prediction based on Artificial Neural Network and its Impact on Rice Production, Case Study: Bangladesh Tushar Kanti Routh Lecturer, Department of Electronics & Telecommunication Engineering, South

More information

Neural Networks and Ensemble Methods for Classification

Neural Networks and Ensemble Methods for Classification Neural Networks and Ensemble Methods for Classification NEURAL NETWORKS 2 Neural Networks A neural network is a set of connected input/output units (neurons) where each connection has a weight associated

More information

Artificial Neural Networks Francesco DI MAIO, Ph.D., Politecnico di Milano Department of Energy - Nuclear Division IEEE - Italian Reliability Chapter

Artificial Neural Networks Francesco DI MAIO, Ph.D., Politecnico di Milano Department of Energy - Nuclear Division IEEE - Italian Reliability Chapter Artificial Neural Networks Francesco DI MAIO, Ph.D., Politecnico di Milano Department of Energy - Nuclear Division IEEE - Italian Reliability Chapter (Chair) STF - China Fellow francesco.dimaio@polimi.it

More information

COMP-4360 Machine Learning Neural Networks

COMP-4360 Machine Learning Neural Networks COMP-4360 Machine Learning Neural Networks Jacky Baltes Autonomous Agents Lab University of Manitoba Winnipeg, Canada R3T 2N2 Email: jacky@cs.umanitoba.ca WWW: http://www.cs.umanitoba.ca/~jacky http://aalab.cs.umanitoba.ca

More information

Neural network modelling of reinforced concrete beam shear capacity

Neural network modelling of reinforced concrete beam shear capacity icccbe 2010 Nottingham University Press Proceedings of the International Conference on Computing in Civil and Building Engineering W Tizani (Editor) Neural network modelling of reinforced concrete beam

More information

Simple Neural Nets For Pattern Classification

Simple Neural Nets For Pattern Classification CHAPTER 2 Simple Neural Nets For Pattern Classification Neural Networks General Discussion One of the simplest tasks that neural nets can be trained to perform is pattern classification. In pattern classification

More information

ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92

ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92 ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92 BIOLOGICAL INSPIRATIONS Some numbers The human brain contains about 10 billion nerve cells (neurons) Each neuron is connected to the others through 10000

More information

Ch.8 Neural Networks

Ch.8 Neural Networks Ch.8 Neural Networks Hantao Zhang http://www.cs.uiowa.edu/ hzhang/c145 The University of Iowa Department of Computer Science Artificial Intelligence p.1/?? Brains as Computational Devices Motivation: Algorithms

More information

Course 395: Machine Learning - Lectures

Course 395: Machine Learning - Lectures Course 395: Machine Learning - Lectures Lecture 1-2: Concept Learning (M. Pantic) Lecture 3-4: Decision Trees & CBC Intro (M. Pantic & S. Petridis) Lecture 5-6: Evaluating Hypotheses (S. Petridis) Lecture

More information

Artificial Neural Networks Examination, June 2005

Artificial Neural Networks Examination, June 2005 Artificial Neural Networks Examination, June 2005 Instructions There are SIXTY questions. (The pass mark is 30 out of 60). For each question, please select a maximum of ONE of the given answers (either

More information

Artificial Intelligence

Artificial Intelligence Artificial Intelligence Jeff Clune Assistant Professor Evolving Artificial Intelligence Laboratory Announcements Be making progress on your projects! Three Types of Learning Unsupervised Supervised Reinforcement

More information

Empirical Estimation of Unconfined Compressive Strength and Modulus of Elasticity Using ANN

Empirical Estimation of Unconfined Compressive Strength and Modulus of Elasticity Using ANN Pak. J. Engg. & Appl. Sci. Vol. 18 January, 2016 (p. 98 110) Empirical Estimation of Unconfined Compressive Strength and Modulus of Elasticity Using ANN Hasan Gul 1, Khalid Farooq 2 and Hassan Mujtaba

More information

Neural Networks. Chapter 18, Section 7. TB Artificial Intelligence. Slides from AIMA 1/ 21

Neural Networks. Chapter 18, Section 7. TB Artificial Intelligence. Slides from AIMA   1/ 21 Neural Networks Chapter 8, Section 7 TB Artificial Intelligence Slides from AIMA http://aima.cs.berkeley.edu / 2 Outline Brains Neural networks Perceptrons Multilayer perceptrons Applications of neural

More information

Lecture 5: Logistic Regression. Neural Networks

Lecture 5: Logistic Regression. Neural Networks Lecture 5: Logistic Regression. Neural Networks Logistic regression Comparison with generative models Feed-forward neural networks Backpropagation Tricks for training neural networks COMP-652, Lecture

More information

Artificial Neural Networks" and Nonparametric Methods" CMPSCI 383 Nov 17, 2011!

Artificial Neural Networks and Nonparametric Methods CMPSCI 383 Nov 17, 2011! Artificial Neural Networks" and Nonparametric Methods" CMPSCI 383 Nov 17, 2011! 1 Todayʼs lecture" How the brain works (!)! Artificial neural networks! Perceptrons! Multilayer feed-forward networks! Error

More information

Neural Networks: Introduction

Neural Networks: Introduction Neural Networks: Introduction Machine Learning Fall 2017 Based on slides and material from Geoffrey Hinton, Richard Socher, Dan Roth, Yoav Goldberg, Shai Shalev-Shwartz and Shai Ben-David, and others 1

More information

Effect of number of hidden neurons on learning in large-scale layered neural networks

Effect of number of hidden neurons on learning in large-scale layered neural networks ICROS-SICE International Joint Conference 009 August 18-1, 009, Fukuoka International Congress Center, Japan Effect of on learning in large-scale layered neural networks Katsunari Shibata (Oita Univ.;

More information

4. Multilayer Perceptrons

4. Multilayer Perceptrons 4. Multilayer Perceptrons This is a supervised error-correction learning algorithm. 1 4.1 Introduction A multilayer feedforward network consists of an input layer, one or more hidden layers, and an output

More information

CMSC 421: Neural Computation. Applications of Neural Networks

CMSC 421: Neural Computation. Applications of Neural Networks CMSC 42: Neural Computation definition synonyms neural networks artificial neural networks neural modeling connectionist models parallel distributed processing AI perspective Applications of Neural Networks

More information

Speaker Representation and Verification Part II. by Vasileios Vasilakakis

Speaker Representation and Verification Part II. by Vasileios Vasilakakis Speaker Representation and Verification Part II by Vasileios Vasilakakis Outline -Approaches of Neural Networks in Speaker/Speech Recognition -Feed-Forward Neural Networks -Training with Back-propagation

More information

Neural networks. Chapter 20. Chapter 20 1

Neural networks. Chapter 20. Chapter 20 1 Neural networks Chapter 20 Chapter 20 1 Outline Brains Neural networks Perceptrons Multilayer networks Applications of neural networks Chapter 20 2 Brains 10 11 neurons of > 20 types, 10 14 synapses, 1ms

More information

A Particle Swarm Optimization (PSO) Primer

A Particle Swarm Optimization (PSO) Primer A Particle Swarm Optimization (PSO) Primer With Applications Brian Birge Overview Introduction Theory Applications Computational Intelligence Summary Introduction Subset of Evolutionary Computation Genetic

More information

A Logarithmic Neural Network Architecture for Unbounded Non-Linear Function Approximation

A Logarithmic Neural Network Architecture for Unbounded Non-Linear Function Approximation 1 Introduction A Logarithmic Neural Network Architecture for Unbounded Non-Linear Function Approximation J Wesley Hines Nuclear Engineering Department The University of Tennessee Knoxville, Tennessee,

More information

Keywords: neural networks; training; Elmen; acoustic parameters; Backpropagation

Keywords: neural networks; training; Elmen; acoustic parameters; Backpropagation Acoustics Parameters Estimation by Artificial Neural Networks Noureddine Djarfour*, Kamel Baddari*, Lab. physique de la terre,. Université de Boumerdés 35000, Algérie, E.mail:djarfour@usa.net Abstract

More information

Artificial Neural Networks Examination, June 2004

Artificial Neural Networks Examination, June 2004 Artificial Neural Networks Examination, June 2004 Instructions There are SIXTY questions (worth up to 60 marks). The exam mark (maximum 60) will be added to the mark obtained in the laborations (maximum

More information

Materials Science Forum Online: ISSN: , Vols , pp doi: /

Materials Science Forum Online: ISSN: , Vols , pp doi: / Materials Science Forum Online: 2004-12-15 ISSN: 1662-9752, Vols. 471-472, pp 687-691 doi:10.4028/www.scientific.net/msf.471-472.687 Materials Science Forum Vols. *** (2004) pp.687-691 2004 Trans Tech

More information

Multilayer Perceptron Tutorial

Multilayer Perceptron Tutorial Multilayer Perceptron Tutorial Leonardo Noriega School of Computing Staffordshire University Beaconside Staffordshire ST18 0DG email: l.a.noriega@staffs.ac.uk November 17, 2005 1 Introduction to Neural

More information

Deep Learning. Basics and Intuition. Constantin Gonzalez Principal Solutions Architect, Amazon Web Services

Deep Learning. Basics and Intuition. Constantin Gonzalez Principal Solutions Architect, Amazon Web Services Deep Learning Basics and Intuition Constantin Gonzalez Principal Solutions Architect, Amazon Web Services glez@amazon.de September 2017 2017, Amazon Web Services, Inc. or its Affiliates. All rights reserved.

More information

A SEASONAL FUZZY TIME SERIES FORECASTING METHOD BASED ON GUSTAFSON-KESSEL FUZZY CLUSTERING *

A SEASONAL FUZZY TIME SERIES FORECASTING METHOD BASED ON GUSTAFSON-KESSEL FUZZY CLUSTERING * No.2, Vol.1, Winter 2012 2012 Published by JSES. A SEASONAL FUZZY TIME SERIES FORECASTING METHOD BASED ON GUSTAFSON-KESSEL * Faruk ALPASLAN a, Ozge CAGCAG b Abstract Fuzzy time series forecasting methods

More information

Pattern Recognition Prof. P. S. Sastry Department of Electronics and Communication Engineering Indian Institute of Science, Bangalore

Pattern Recognition Prof. P. S. Sastry Department of Electronics and Communication Engineering Indian Institute of Science, Bangalore Pattern Recognition Prof. P. S. Sastry Department of Electronics and Communication Engineering Indian Institute of Science, Bangalore Lecture - 27 Multilayer Feedforward Neural networks with Sigmoidal

More information

Neural Network Based Response Surface Methods a Comparative Study

Neural Network Based Response Surface Methods a Comparative Study . LS-DYNA Anwenderforum, Ulm Robustheit / Optimierung II Neural Network Based Response Surface Methods a Comparative Study Wolfram Beyer, Martin Liebscher, Michael Beer, Wolfgang Graf TU Dresden, Germany

More information

Mining Classification Knowledge

Mining Classification Knowledge Mining Classification Knowledge Remarks on NonSymbolic Methods JERZY STEFANOWSKI Institute of Computing Sciences, Poznań University of Technology COST Doctoral School, Troina 2008 Outline 1. Bayesian classification

More information

Introduction to Neural Networks

Introduction to Neural Networks Introduction to Neural Networks What are (Artificial) Neural Networks? Models of the brain and nervous system Highly parallel Process information much more like the brain than a serial computer Learning

More information

Artificial Neural Networks

Artificial Neural Networks Artificial Neural Networks Stephan Dreiseitl University of Applied Sciences Upper Austria at Hagenberg Harvard-MIT Division of Health Sciences and Technology HST.951J: Medical Decision Support Knowledge

More information