The Application of Back Propagation Neural Network of Multi-channel Piezoelectric Quartz Crystal Sensor for Mixed Organic Vapours


Tamkang Journal of Science and Engineering, Vol. 5, No. 4 (2002)

Ping Chang and Jeng-Shong Shih*
Department of Chemistry, National Taiwan Normal University, Taipei, Taiwan 116, R.O.C.
sshih@cc.ntnu.edu.tw

Abstract

A multi-channel piezoelectric quartz crystal sensor with a home-made computer interface was prepared and employed in the present study to detect mixtures of organic molecules. A back propagation neural network (BPN) was used to distinguish the species in the mixture of organic molecules, and multivariate linear regression analysis (MLR) was used to compute the concentrations of the species. A six-channel piezoelectric sensor detecting organic molecules in a static system was investigated and discussed. Amine, carboxylic acid, alcohol and aromatic molecules can easily be distinguished by this system with the back propagation neural network. Furthermore, the concentrations of the organic compounds were computed with an error of about 10% by multivariate linear regression analysis (MLR). Detection of organic mixtures containing amine, carboxylic acid, alcohol and aromatic molecules by this method also gave good qualitative and quantitative results. In order to achieve better distinguishability, the effect of fault-tolerance in the back propagation neural network was also investigated and discussed in this study.

Key Words: Piezoelectric Crystal, Multichannel Sensor, Organic Vapours, Back Propagation Neural Network, Linear Regression Analysis

1. Introduction

It is well known that the use of a gas sensor array with pattern recognition analysis has an advantage in identifying odors, organic molecules and gases, because many other gas sensors have poor selectivity [1-5]. The qualitative and quantitative analysis of a gas mixture with non-selective sensor elements is achievable with a combination of several different sensor elements in an array.
Piezoelectric quartz crystals are very sensitive to changes in mass [6-12]. By coating special materials on the surface, the frequency of the crystal is decreased by the adsorption of target molecules on the crystal's surface. The relationship derived for quartz crystals (AT-cut) vibrating in the thickness shear-mode is as follows [13,14]:

ΔF = -2.3 x 10^6 f^2 Δm/A  (1)

where ΔF (Hz) is the frequency shift due to the coating, f (MHz) is the fundamental frequency of the quartz crystal, Δm (gram) is the mass of the deposited coating and A (cm^2) is the coated area. Principal component analysis (PCA), a well-known statistical technique, is useful in selecting the most independent of all the coating
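Equation (1) is straightforward to evaluate; a small sketch follows (the example coating mass and electrode area are hypothetical values, not data from the paper):

```python
def sauerbrey_shift(f_mhz, dm_g, area_cm2):
    """Frequency shift (Hz) of an AT-cut quartz crystal per Eq. (1):
    dF = -2.3e6 * f^2 * dm / A, with f in MHz, dm in g, A in cm^2."""
    return -2.3e6 * f_mhz ** 2 * dm_g / area_cm2

# hypothetical example: 10 MHz crystal, 1 microgram of coating on 0.5 cm^2
print(sauerbrey_shift(10.0, 1e-6, 0.5))  # about -460 Hz
```

The negative sign reflects that added mass lowers the resonant frequency, which is why adsorption of the analyte is observed as a frequency decrease.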

materials [15]. We can easily differentiate different analytes by viewing the discrimination profile of the responses of the several channels in a plot. However, in many cases it is necessary to distinguish different gases with the sensor array without subjective judgment. Therefore, using a computer for this task is the best choice. Can a computer be as smart as a human being? An artificial neural network (ANN) gives a computer the ability of learning and thinking. Chemical sensors were developed for an artificial nose in the past decade [16]. The ANN method proves particularly advantageous when the measured property is not connected exactly to the signal of the sensor transducers. The optimum structure of the neural network is determined by trial and error. The back propagation neural network (BPN) is the most popular technique for chemical sensor arrays. The back propagation method is part of the parallel distributed processing framework [17]. One-layer networks such as the Hopfield and Kohonen structures, and multi-layer systems such as counter propagation and back propagation of errors, can be used for chemical applications. Backpropagation refers to the method for computing the gradient of the case-wise error function with respect to the weights of a feedforward network, which is a straightforward and elegant application of the chain rule of elementary calculus. By definition, backpropagation or backprop also refers to a training method that uses backpropagation to compute the gradient. In other words, a backprop network is a feedforward network trained by backpropagation [18]. In this study, qualitative analysis of the analytes (gas mixtures) was done by using the PZ sensor array and BPN. A three-layer BPN structure with three hidden units was established. Furthermore, the concentration of each compound in the gases was computed by MLR.
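Coating selection by PCA, as cited above, can be sketched with an SVD-based projection. This is a generic NumPy sketch, not the authors' SAS procedure, and the response matrix below is hypothetical (rows are gas samples, columns are channel frequency shifts):

```python
import numpy as np

def pca(responses, n_components=2):
    """Project sensor responses onto principal components.
    rows = gas samples, columns = channel (coating) responses in Hz."""
    X = responses - responses.mean(axis=0)           # mean-centre each channel
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ Vt[:n_components].T                 # coordinates in PC space
    explained = s ** 2 / np.sum(s ** 2)              # variance ratio per PC
    return scores, explained[:n_components]

# hypothetical 4 samples x 3 channels frequency-shift matrix
R = np.array([[120.0,  80.0, 15.0],
              [130.0,  85.0, 14.0],
              [ 40.0, 200.0, 90.0],
              [ 45.0, 210.0, 95.0]])
scores, ratio = pca(R)
```

Plotting the scores of the first two components gives the kind of discrimination profile described in the text: well-separated clusters indicate channels whose responses jointly discriminate the analytes.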
2. Experimental

2.1 Crystal Coating

The piezoelectric crystals used were AT-cut spherical quartz crystals, with a radius of 4.0 mm and a thickness of 0.18 mm, a basic resonant frequency of 10 MHz, and silver-plated metal electrodes on both sides (Taiwan Crystal Co.). The crystals were coated with the prepared solution via a dropping method with a microsyringe. An aliquot of 2 µl of the prepared coating solution was dropped onto one side of the quartz crystal. After evaporation of the solvent, differently functionalized PZ crystals were obtained. The coating materials selected with PCA were polyvinyl alcohol, fullerene, polystyrene, stearic acid, polyethylene adipate and polyvinyl pyrrolidone [8].

2.2 Apparatus

Figure 1 depicts the experimental setup of the piezoelectric quartz crystal detection system with an assembled computer interface. The multi-channel PZ sensor, connected to an oscillator system, was placed in a glass cell. Organic liquid was injected into the injection port, which includes a heating plate. A home-made computer interface, including an oscillator, Altera programmable logic devices, a standard crystal and a programmable peripheral interface (PPI 8255), was prepared for frequency-to-digital conversion. The Altera device was designed with a 24-bit counter and a 24-bit register to handle the 10 MHz frequency, and no frequency mixing was used. Therefore, the true frequencies were obtained and hence the errors were reduced [15]. Data processing and signal acquisition were automatically performed on a microcomputer (PC/AT) with a program in QBasic. The MLR was done with the commercial statistical software package SAS, and the back propagation neural network (BPN) is a program written in QBasic.

Figure 1. Experimental setup of the detection system with an assembled computer interface (oscillation circuit and frequency counter, multi-channel sensor, sample injection port with heater, nitrogen flow controller)

BPN program

Training process:

Step 1. Design the structure of the neural network and input the parameters of the network.
Step 2. Get initial weights W and initial threshold values θ by randomizing.
Step 3. Input the training data matrix X and the target output matrix T.
Step 4. Compute the output vector of each neural unit.
(a) Compute the output vector H of the hidden layer:
    net_j = Σ_i W_ij X_i - θ_j  (2)
    H_j = f(net_j)  (3)
(b) Compute the output vector Y of the output layer:
    net_k = Σ_j W_jk H_j - θ_k  (4)
    Y_k = f(net_k)  (5)
Step 5. Compute the error signals δ.
(a) Compute the error signals δ of the output layer:
    δ_k = (T_k - Y_k) f'(net_k)  (6)
(b) Compute the error signals δ of the hidden layer:
    δ_j = (Σ_k δ_k W_jk) f'(net_j)  (7)
Step 6. Compute the modifications of W and θ (η is the learning rate).
(a) For the output layer:
    ΔW_jk = η δ_k H_j  (8)
    Δθ_k = -η δ_k  (9)
(b) For the hidden layer:
    ΔW_ij = η δ_j X_i  (10)
    Δθ_j = -η δ_j  (11)
Step 7. Renew W and θ.
(a) For the output layer:
    W_jk = W_jk + ΔW_jk  (12)
    θ_k = θ_k + Δθ_k  (13)
(b) For the hidden layer:
    W_ij = W_ij + ΔW_ij  (14)
    θ_j = θ_j + Δθ_j  (15)
Step 8. Repeat Step 3 to Step 7 until convergence.

Testing process:

Step 1. Input the parameters of the network.
Step 2. Input the trained W and θ.
Step 3. Input an unknown data matrix X.
Step 4. Compute the output vectors:
(a) Compute the output vector H of the hidden layer:
    net_j = Σ_i W_ij X_i - θ_j  (16)
    H_j = f(net_j)  (17)
(b) Compute the output vector Y of the output layer:
    net_k = Σ_j W_jk H_j - θ_k  (18)
    Y_k = f(net_k)  (19)
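The training steps above map directly onto a few lines of NumPy. This is a sketch, not the authors' QBasic program: the transfer function f = tanh, the learning rate, the 6-3-4 layer sizes and the toy training pairs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.tanh                                    # assumed transfer function
df = lambda net: 1.0 - np.tanh(net) ** 2       # its derivative f'(net)

n_in, n_hid, n_out, eta = 6, 3, 4, 0.5         # Step 1: 6-3-4 structure
W1 = rng.uniform(-0.5, 0.5, (n_hid, n_in))     # Step 2: random initial weights
t1 = rng.uniform(-0.5, 0.5, n_hid)             # and thresholds (theta)
W2 = rng.uniform(-0.5, 0.5, (n_out, n_hid))
t2 = rng.uniform(-0.5, 0.5, n_out)

def forward(x):
    net_h = W1 @ x - t1                        # Eq. (2)
    H = f(net_h)                               # Eq. (3)
    net_o = W2 @ H - t2                        # Eq. (4)
    return net_h, H, net_o, f(net_o)           # Eq. (5)

def train_step(x, t):
    """One pass of Steps 4-7 for a single (pattern, target) pair."""
    global W1, t1, W2, t2
    net_h, H, net_o, y = forward(x)
    d_o = (t - y) * df(net_o)                  # Eq. (6)
    d_h = (W2.T @ d_o) * df(net_h)             # Eq. (7)
    W2 += eta * np.outer(d_o, H); t2 += -eta * d_o   # Eqs. (8), (9), (12), (13)
    W1 += eta * np.outer(d_h, x); t1 += -eta * d_h   # Eqs. (10), (11), (14), (15)

# Step 3: toy training data, targets in {-0.9, +0.9} as in the text
X = [rng.uniform(-1, 1, n_in) for _ in range(2)]
T = [np.array([0.9, -0.9, -0.9, 0.9]), np.array([-0.9, 0.9, 0.9, -0.9])]
for _ in range(2000):                          # Step 8: repeat until converged
    for x, t in zip(X, T):
        train_step(x, t)
err = max(np.max(np.abs(t - forward(x)[3])) for x, t in zip(X, T))
```

The testing process of Eqs. (16)-(19) is exactly the `forward` pass with the trained weights and thresholds held fixed.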

3. Results and Discussion

In this study, the network structure shown in Figure 2 was used. The responses of the six coated (polyvinyl alcohol, fullerene, polystyrene, stearic acid, polyethylene adipate and polyvinyl pyrrolidone) PZ crystals formed the input layer matrix of the network. First, pure organic gases such as toluene, butanol, butyl amine and acetic acid were analyzed by this system. 40 training examples and 20 testing examples were prepared for training and testing the network, respectively. With the BPN network, the classification problem was solved. In other words, using the BPN network with the PZ multichannel sensor can distinguish the different organic gases toluene, butanol, butyl amine and acetic acid rapidly and accurately.

Figure 2. The structure of the neural network adopted in this study: an input layer of 6 units (the responses, in Hz, of channels 1-6), a hidden layer of 3 units, and an output layer of 4 units (toluene, butyl alcohol, acetic acid, butyl amine), each coding presence (+1) or absence (-1)

In the calculation of the BPN, input data processing and output data reprocessing were done as in equations (20) and (21):

D_New = (D_Old - µ) / (κσ)  (20)

µ: mean of the data of the unit
σ: standard deviation of the data of the unit
κ = 2.58 (99%)

D_New = (D_Old - Min) / (Max - Min)  (21)

Min: minimum of the data of the unit
Max: maximum of the data of the unit

In this network, each unit of the output layer stands for the presence (+1) or absence (-1) of the detected molecule. +0.9 as the target value for presence and -0.9 as the target value for the unexpected answer were used in computing the back propagation of error algorithm. When the value of an output is larger than 0.9 or smaller than -0.9, we define the analyte as present or absent. However, a +0.8 bias of the processing data was used for the error algorithm.
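The two preprocessing transforms of Eqs. (20) and (21) can be sketched as follows; the sample frequency shifts are hypothetical values for illustration:

```python
import numpy as np

def scale_z(d, kappa=2.58):
    """Eq. (20): (D_old - mean) / (kappa * std); kappa = 2.58 covers 99%."""
    return (d - d.mean()) / (kappa * d.std())

def scale_minmax(d):
    """Eq. (21): (D_old - min) / (max - min), mapping the data onto [0, 1]."""
    return (d - d.min()) / (d.max() - d.min())

shifts = np.array([120.0, 80.0, 15.0, 45.0, 210.0, 95.0])  # hypothetical Hz
z = scale_z(shifts)
m = scale_minmax(shifts)
```

Scaling by κσ with κ = 2.58 keeps roughly 99% of normally distributed inputs inside [-1, 1], which suits a tanh-like transfer function; the min-max form instead pins the data to the unit interval.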
In this study, batch learning (weights and threshold limits changed after all training samples were computed), a learning rate of 0.5, a minimum learning rate of 0.1, and random initial weights and threshold limits were used. Table 1 shows that overfitting does not occur in the training process of the BPN, because the testing result has the smaller error. NNs (neural networks), like other flexible nonlinear estimation methods such as kernel regression and smoothing splines, can suffer from either underfitting or overfitting. A network that is not sufficiently complex can fail to detect fully the signal in a complicated data set, leading to underfitting. A network that is too complex may fit the noise, not just the signal, leading to overfitting. Overfitting is especially dangerous because, with many of the common types of NNs, it can easily lead to predictions that are far beyond the range of the training data. Overfitting can also produce wild predictions in multilayer perceptrons, even with noise-free data [18].

Table 1. The error rate for the BPN of the organic molecule detection system (learning cycles: 5000)

           Number of examples   Error rate/units   Error rate/examples
Training   40                                      7.5 %
Testing    20                   0 %                0 %

error-rate/units = (units_total - units_correct) / units_total
error-rate/examples = (examples_total - examples_correct) / examples_total

The linear relationship between response and concentration is shown in Figure 3. Regression analysis was used to compute the concentration of the organic molecules, and Table 2 was obtained by linear regression analysis. Although quite high sensitivity was found in the case of stearic acid, a relatively low correlation coefficient (R²) was also observed for stearic acid, as shown in Table 2, which implied that stearic acid was not a good adsorbent for the analysis of butyl amine. A different experimental data set was used to test the regression equations, as shown in Table 3.
Quite good linear responses to toluene were found for all adsorbents, which indicated that toluene in organic mixtures could be analyzed by multivariate linear regression analysis with these adsorbents on multi-channel quartz crystals.
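The single-channel calibration described here (fitting a regression equation as in Table 2, then computing a testing error as in Table 3) can be sketched as follows; the calibration points and the test sample are hypothetical values, not data from the paper:

```python
import numpy as np

# hypothetical calibration: toluene concentration (mg/L) vs. frequency shift (Hz)
conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
shift = np.array([36.0, 70.0, 139.0, 278.0, 552.0])

# inverse calibration line: conc = slope * shift + intercept
slope, intercept = np.polyfit(shift, conc, 1)

def predict_conc(resp_hz):
    return slope * resp_hz + intercept

def error_pct(true_conc, resp_hz):
    """Testing error as in Table 3: (found - true) / true * 100."""
    return (predict_conc(resp_hz) - true_conc) / true_conc * 100.0

# a test sample of nominally 30 mg/L giving a 215 Hz response
err = error_pct(30.0, 215.0)
```

The same pattern, with an independent data set for testing, gives the per-analyte error columns of Table 3.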

Figure 3. The response of the multi-channel PZ sensor for toluene

Table 2. Regression equations of linear regression analysis for organic gases

Analytes        Regression channel      Regression equation
Toluene         Polystyrene             Y = 6.9075X
Butyl alcohol   Polyvinyl pyrrolidone   Y = 7.1517X
Acetic acid     Polyvinyl pyrrolidone   Y = 28.003X
Butyl amine     Polystyrene             Y = 7.2733X
Butyl amine     Stearic acid*           Y = 10.386X

*High sensitivity, but relatively low correlation coefficient (R²)

Furthermore, the qualitative and quantitative analysis of a gas mixture was also investigated and discussed. Five types of gas mixtures with different concentrations (40 examples for training and calibration, 20 examples for testing), as shown in Table 4, were detected (three times for each sample) by the PZ multichannel sensor. As in the previous study, the same network structure was used, as shown in Figure 2. Also, a positive output as the target value for presence and a negative output as the target value for the unexpected answer were used in computing the back propagation of error algorithm. Figure 4 shows the relationship between the error rate and the number of training cycles. Choosing an appropriate target value in the error algorithm is very important. By comparing lines A, B, C and D, it can be seen that ±0.9 is not an appropriate target value in this case. However, the target value "zero" may be dangerous for predictions and should be tested carefully. Table 5 shows the error rates (units) of the various target values in training and testing. In Table 5, both the training and testing error rates for the target value "zero" are smaller than the others. This indicates that learning fault-tolerance is helpful for distinguishing organic molecules in a gas mixture. In this case, the BPN is confused about whether butyl alcohol is in the gas mixture or not, as evident in Table 6. However, toluene, butyl amine and acetic acid could easily be distinguished.
Table 3. Testing results of linear regression

Analytes: toluene, butyl alcohol, acetic acid and butyl amine; for each analyte, five test samples are listed with the theoretical concentration (mg/L), the concentration found from the regression equation (mg/L), and the error (%).

Table 4. Samples of gas mixtures (mg/L)

Twenty samples (A1-A4, B1-B4, C1-C4, D1-D4, E1-E4), with columns for the concentrations of toluene, butyl alcohol, acetic acid and butyl amine.

Figure 4. Error rate curves in the training process (error rate/units, %, versus training cycles). A: target value ±0.9; B: target value ±0.5; C: target value ±0.2; D: target value ±0

Table 5. The error rate/units of the BPN for gas mixtures

Target value   Error rate/units
Training ±     Testing 0.75
Training ±     Testing
Training ±     Testing 0.15
Training ±     Testing

Number of examples of training: 40 (160 output units)
Number of examples of testing: 20 (80 output units)
Learning cycles: 8000

Table 6. Testing results of the BPN for gas mixtures

Examples   Theoretic output*   Output              Judgement
A1         [1, -1, -1, 1]      [1, -1, -1, 1]      toluene and butyl amine
A2         [1, -1, -1, 1]      [1, -1, -1, 1]      toluene and butyl amine
A3         [1, -1, -1, 1]      [1, -1, -1, 1]      toluene and butyl amine
A4         [1, -1, -1, 1]      [1, -1, -1, 1]      toluene and butyl amine
B1         [1, -1, 1, -1]      [1, -1, 1, -1]      toluene and acetic acid
B2         [1, -1, 1, -1]      [1, -1, 1, -1]      toluene and acetic acid
B3         [1, -1, 1, -1]      [1, -1, 1, -1]      toluene and acetic acid
B4         [1, -1, 1, -1]      [1, -1, 1, -1]      toluene and acetic acid
C1         [1, 1, -1, -1]      [1, -1**, -1, -1]   toluene
C2         [1, 1, -1, -1]      [1, 1, -1, -1]      toluene and butyl alcohol
C3         [1, 1, -1, -1]      [1, 1, -1, -1]      toluene and butyl alcohol
C4         [1, 1, -1, -1]      [1, 1, -1, -1]      toluene and butyl alcohol
D1         [1, 1, 1, -1]       [1, -1**, 1, -1]    toluene and acetic acid
D2         [1, 1, 1, -1]       [1, -1**, 1, -1]    toluene and acetic acid
D3         [1, 1, 1, -1]       [1, -1**, 1, -1]    toluene and acetic acid
D4         [1, 1, 1, -1]       [1, -1**, 1, -1]    toluene and acetic acid
E1         [1, 1, -1, 1]       [1, 1, -1, 1]       toluene, butyl alcohol and butyl amine
E2         [1, 1, -1, 1]       [1, 1, -1, 1]       toluene, butyl alcohol and butyl amine
E3         [1, 1, -1, 1]       [1, 1, -1, 1]       toluene, butyl alcohol and butyl amine
E4         [1, 1, -1, 1]       [1, 1, -1, 1]       toluene, butyl alcohol and butyl amine

*[toluene, butyl alcohol, acetic acid, butyl amine]
**Incorrect output
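The judgement column of Table 6 amounts to decoding the four output units back into component names, using the ±0.9 decision threshold mentioned earlier. A minimal sketch (the function name and the sample output vector are illustrative, not from the paper):

```python
COMPONENTS = ["toluene", "butyl alcohol", "acetic acid", "butyl amine"]

def judge(output, threshold=0.9):
    """Map the four BPN output units to detected components:
    values above +threshold mean presence, below -threshold mean absence."""
    present = [name for name, y in zip(COMPONENTS, output) if y > threshold]
    return " and ".join(present) if present else "none detected"

# an A1-style output vector: toluene and butyl amine present
print(judge([0.97, -0.95, -0.99, 0.96]))
```

A C1-style failure, where the butyl alcohol unit settles near -1 instead of +1, would simply drop that name from the judgement, exactly as in Table 6.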

Table 7. Regression equations of MLR of gas mixtures

Each regression equation is linear in the six channel responses X1-X6 (Y = b0 + b1X1 + b2X2 + b3X3 + b4X4 + b5X5 + b6X6), with an R² value listed for each organic molecule: toluene, butyl alcohol, acetic acid and butyl amine.

X1: the response of polystyrene
X2: the response of polyvinyl alcohol
X3: the response of stearic acid
X4: the response of fullerene
X5: the response of polyethylene adipate
X6: the response of polyvinyl pyrrolidone

Table 8. MLR testing results of organic mixtures

For each sample, the true concentration, the MLR result without butyl alcohol (Type 1), the MLR result with butyl alcohol as an interferent (Type 2), the difference, and the error (%) are listed for toluene, acetic acid and butyl amine.

A1  (error %)  -13.1%  -14.0%  8.7%  9.5%
A2  (error %)  -19.4%  -9.5%  -8.8%
A3  (error %)  -29.1%  -26.8%  -26.0%
A4  (error %)  -22.9%  -22.7%  22.0%
B1  (error %)  0.9%  6.7%  8.6%
B2  (error %)  -0.2%  -7.5%  -5.8%
B3  (error %)  -7.5%  -8.1%  -6.3%
B4  (error %)  -1.4%  0.2%  -1.9%
C1  (error %)  -12%
C2  (error %)  -14.7%
C3  (error %)  -15.5%
C4  (error %)  -18.6%

Table 9. MLR testing results of organic mixtures (continued from Table 8)

D1  (error %)  62.8%  15.4%  19.5%
D2  (error %)  24.1%  -9.5%  5.8%
D3  (error %)  19.8%  -15.6%  -12.4%
D4  (error %)  29.8%  -15.7%  -12.6%
E1  (error %)  13.8%  -3.1%  -1.3%
E2  (error %)  -10.4%  -10.4%  -7.6%
E3  (error %)  4.1%  -21.3%  -18.4%
E4  (error %)  2.8%  -24.3%  -21.4%

Quantitative analysis of the organic gas mixtures, including toluene, butyl amine, acetic acid and butyl alcohol (the interferent in this case), was carried out by multivariate linear regression analysis (MLR). The regression equations are shown in Table 7 and the testing results in Table 8 and Table 9. The MLR results for the gas mixtures with butyl alcohol are shown as Type 2 (butyl alcohol is an interferent). No obvious difference was observed between the two types. In other words, the system can work even in the presence of an interferent.

In conclusion, various organic molecules, an amine (butyl amine), a carboxylic acid (acetic acid), an alcohol (butyl alcohol) and an aromatic molecule (toluene), can be distinguished clearly and determined by the six-channel (polyvinyl alcohol, fullerene, polystyrene, stearic acid, polyethylene adipate and polyvinyl pyrrolidone) piezoelectric detection system. A back propagation neural network was used to recognize the organic molecules, which could be distinguished very clearly. The network also worked in gas mixture cases: toluene, acetic acid and butyl amine were distinguished from gas mixtures with the interferent, butyl alcohol. With an error of about 5-20%, we suggest that the method can be used to detect the components of organic mixtures both qualitatively and quantitatively.

Acknowledgment

The authors would like to thank the National Science Council of the Republic of China in Taiwan for financial support.

References

[1] Persaud, K.; Dodd, G. Nature 1982, 299, 352.
[2] Carey, W. P.; Beebe, K. R.; Kowalski, B. R. Anal. Chem. 1986, 58, 149.
[3] Sundgren, H.; Lundstrom, I.; Winquist, F.
Sensors and Actuators B 1990, 2, 115.
[4] Krebs, P.; Grisel, A. Sensors and Actuators B 1993, 13, 155.
[5] Wang, X.; Yee, S.; Carey, P. Sensors and Actuators B 1993, 13-14, 458.
[6] Lu, C. J.; Shih, J. S. Analytica Chimica Acta 1995, 306, 129.
[7] Sheng, H. J.; Shih, J. S. Analytica Chimica

Acta 1997, 350, 109.
[8] Jane, Y. S.; Shih, J. S. Analyst 1995, 120, 517.
[9] Chao, Y. C.; Shih, J. S. Analytica Chimica Acta 1998, 374, 39.
[10] Chiou, C. S.; Shih, J. S. Analytica Chimica Acta 1998, 360, 69.
[11] Chang, P.; Shih, J. S. Analytica Chimica Acta 1999, 380, 55.
[12] Chang, P.; Shih, J. S. Analytica Chimica Acta 1998, 360, 61.
[13] Sauerbrey, G. Z. Z. Phys. 1959, 155, 206.
[14] Sauerbrey, G. Z. Z. Phys. 1964, 178, 457.
[15] Chang, P.; Shih, J. S. Analytica Chimica Acta 2000, 403, 39.
[16] Taylor, M.; Lisboa, P. Techniques and Applications of Neural Networks; Ellis Horwood, 1993.
[17] Rumelhart, D. E.; McClelland, J. L. Parallel Distributed Processing; MIT Press, U.S.A.
[18] ftp://ftp.sas.com/pub/neural/faq.html

Manuscript Received: Sept. 11, 2002 and Accepted: Oct. 11, 2002


More information

Gas Detection System Based on Multi-Sensor Fusion with BP Neural Network

Gas Detection System Based on Multi-Sensor Fusion with BP Neural Network Sensors & Transducers 2013 by IFSA http://www.sensorsportal.com Gas Detection System Based on Multi-Sensor Fusion with BP Neural Network Qiu-Xia LIU Department of Physics, Heze University, Heze Shandong

More information

Slide04 Haykin Chapter 4: Multi-Layer Perceptrons

Slide04 Haykin Chapter 4: Multi-Layer Perceptrons Introduction Slide4 Hayin Chapter 4: Multi-Layer Perceptrons CPSC 636-6 Instructor: Yoonsuc Choe Spring 28 Networs typically consisting of input, hidden, and output layers. Commonly referred to as Multilayer

More information

o0, and since the period of vibration of the scale is related to the mass,

o0, and since the period of vibration of the scale is related to the mass, 459 THE USE OF RESONATING DEVICES TO MAKE SMALL MASS MEASUREMENTS* W. H. KING, JR. Esso Research and Engineering Company Linden, N.J. W^i THEN we wish to weigh something most of us think of a scale or

More information

Modeling and Compensation for Capacitive Pressure Sensor by RBF Neural Networks

Modeling and Compensation for Capacitive Pressure Sensor by RBF Neural Networks 21 8th IEEE International Conference on Control and Automation Xiamen, China, June 9-11, 21 ThCP1.8 Modeling and Compensation for Capacitive Pressure Sensor by RBF Neural Networks Mahnaz Hashemi, Jafar

More information

Introduction to Natural Computation. Lecture 9. Multilayer Perceptrons and Backpropagation. Peter Lewis

Introduction to Natural Computation. Lecture 9. Multilayer Perceptrons and Backpropagation. Peter Lewis Introduction to Natural Computation Lecture 9 Multilayer Perceptrons and Backpropagation Peter Lewis 1 / 25 Overview of the Lecture Why multilayer perceptrons? Some applications of multilayer perceptrons.

More information

Artificial Neural Network Method of Rock Mass Blastability Classification

Artificial Neural Network Method of Rock Mass Blastability Classification Artificial Neural Network Method of Rock Mass Blastability Classification Jiang Han, Xu Weiya, Xie Shouyi Research Institute of Geotechnical Engineering, Hohai University, Nanjing, Jiangshu, P.R.China

More information

Lecture 3: Pattern Classification

Lecture 3: Pattern Classification EE E6820: Speech & Audio Processing & Recognition Lecture 3: Pattern Classification 1 2 3 4 5 The problem of classification Linear and nonlinear classifiers Probabilistic classification Gaussians, mixtures

More information

ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92

ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92 ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92 BIOLOGICAL INSPIRATIONS Some numbers The human brain contains about 10 billion nerve cells (neurons) Each neuron is connected to the others through 10000

More information

Neural Networks and Deep Learning

Neural Networks and Deep Learning Neural Networks and Deep Learning Professor Ameet Talwalkar November 12, 2015 Professor Ameet Talwalkar Neural Networks and Deep Learning November 12, 2015 1 / 16 Outline 1 Review of last lecture AdaBoost

More information

SPSS, University of Texas at Arlington. Topics in Machine Learning-EE 5359 Neural Networks

SPSS, University of Texas at Arlington. Topics in Machine Learning-EE 5359 Neural Networks Topics in Machine Learning-EE 5359 Neural Networks 1 The Perceptron Output: A perceptron is a function that maps D-dimensional vectors to real numbers. For notational convenience, we add a zero-th dimension

More information

Detection of Volatile S- and N-containing Compounds Based on SAW Array Sensor

Detection of Volatile S- and N-containing Compounds Based on SAW Array Sensor American Journal of Chemical and Biochemical Engineering 2017; 1(1: 35-39 http://www.sciencepublishinggroup.com/j/ajcbe doi: 10.11648/j.ajcbe.20170101.15 Detection of Volatile S- and N-containing Compounds

More information

Back-Propagation Algorithm. Perceptron Gradient Descent Multilayered neural network Back-Propagation More on Back-Propagation Examples

Back-Propagation Algorithm. Perceptron Gradient Descent Multilayered neural network Back-Propagation More on Back-Propagation Examples Back-Propagation Algorithm Perceptron Gradient Descent Multilayered neural network Back-Propagation More on Back-Propagation Examples 1 Inner-product net =< w, x >= w x cos(θ) net = n i=1 w i x i A measure

More information

Multilayer Feedforward Networks. Berlin Chen, 2002

Multilayer Feedforward Networks. Berlin Chen, 2002 Multilayer Feedforard Netors Berlin Chen, 00 Introduction The single-layer perceptron classifiers discussed previously can only deal ith linearly separable sets of patterns The multilayer netors to be

More information

Artificial Neural Networks Examination, June 2005

Artificial Neural Networks Examination, June 2005 Artificial Neural Networks Examination, June 2005 Instructions There are SIXTY questions. (The pass mark is 30 out of 60). For each question, please select a maximum of ONE of the given answers (either

More information

Artificial Neural Network : Training

Artificial Neural Network : Training Artificial Neural Networ : Training Debasis Samanta IIT Kharagpur debasis.samanta.iitgp@gmail.com 06.04.2018 Debasis Samanta (IIT Kharagpur) Soft Computing Applications 06.04.2018 1 / 49 Learning of neural

More information

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others)

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others) Machine Learning Neural Networks (slides from Domingos, Pardo, others) Human Brain Neurons Input-Output Transformation Input Spikes Output Spike Spike (= a brief pulse) (Excitatory Post-Synaptic Potential)

More information

ARTIFICIAL INTELLIGENCE. Artificial Neural Networks

ARTIFICIAL INTELLIGENCE. Artificial Neural Networks INFOB2KI 2017-2018 Utrecht University The Netherlands ARTIFICIAL INTELLIGENCE Artificial Neural Networks Lecturer: Silja Renooij These slides are part of the INFOB2KI Course Notes available from www.cs.uu.nl/docs/vakken/b2ki/schema.html

More information

Neural Network Based Density Measurement

Neural Network Based Density Measurement Bulg. J. Phys. 31 (2004) 163 169 P. Neelamegam 1, A. Rajendran 2 1 PG and Research Department of Physics, AVVM Sri Pushpam College (Autonomous), Poondi, Thanjavur, Tamil Nadu-613 503, India 2 PG and Research

More information

Multilayer Neural Networks. (sometimes called Multilayer Perceptrons or MLPs)

Multilayer Neural Networks. (sometimes called Multilayer Perceptrons or MLPs) Multilayer Neural Networks (sometimes called Multilayer Perceptrons or MLPs) Linear separability Hyperplane In 2D: w x + w 2 x 2 + w 0 = 0 Feature x 2 = w w 2 x w 0 w 2 Feature 2 A perceptron can separate

More information

Multilayer Perceptrons (MLPs)

Multilayer Perceptrons (MLPs) CSE 5526: Introduction to Neural Networks Multilayer Perceptrons (MLPs) 1 Motivation Multilayer networks are more powerful than singlelayer nets Example: XOR problem x 2 1 AND x o x 1 x 2 +1-1 o x x 1-1

More information

22c145-Fall 01: Neural Networks. Neural Networks. Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1

22c145-Fall 01: Neural Networks. Neural Networks. Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1 Neural Networks Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1 Brains as Computational Devices Brains advantages with respect to digital computers: Massively parallel Fault-tolerant Reliable

More information

Application of Artificial Neural Networks in Evaluation and Identification of Electrical Loss in Transformers According to the Energy Consumption

Application of Artificial Neural Networks in Evaluation and Identification of Electrical Loss in Transformers According to the Energy Consumption Application of Artificial Neural Networks in Evaluation and Identification of Electrical Loss in Transformers According to the Energy Consumption ANDRÉ NUNES DE SOUZA, JOSÉ ALFREDO C. ULSON, IVAN NUNES

More information

Multilayer Neural Networks. (sometimes called Multilayer Perceptrons or MLPs)

Multilayer Neural Networks. (sometimes called Multilayer Perceptrons or MLPs) Multilayer Neural Networks (sometimes called Multilayer Perceptrons or MLPs) Linear separability Hyperplane In 2D: w 1 x 1 + w 2 x 2 + w 0 = 0 Feature 1 x 2 = w 1 w 2 x 1 w 0 w 2 Feature 2 A perceptron

More information

Artificial Neural Networks Francesco DI MAIO, Ph.D., Politecnico di Milano Department of Energy - Nuclear Division IEEE - Italian Reliability Chapter

Artificial Neural Networks Francesco DI MAIO, Ph.D., Politecnico di Milano Department of Energy - Nuclear Division IEEE - Italian Reliability Chapter Artificial Neural Networks Francesco DI MAIO, Ph.D., Politecnico di Milano Department of Energy - Nuclear Division IEEE - Italian Reliability Chapter (Chair) STF - China Fellow francesco.dimaio@polimi.it

More information

CS 179: LECTURE 16 MODEL COMPLEXITY, REGULARIZATION, AND CONVOLUTIONAL NETS

CS 179: LECTURE 16 MODEL COMPLEXITY, REGULARIZATION, AND CONVOLUTIONAL NETS CS 179: LECTURE 16 MODEL COMPLEXITY, REGULARIZATION, AND CONVOLUTIONAL NETS LAST TIME Intro to cudnn Deep neural nets using cublas and cudnn TODAY Building a better model for image classification Overfitting

More information

ECE 471/571 - Lecture 17. Types of NN. History. Back Propagation. Recurrent (feedback during operation) Feedforward

ECE 471/571 - Lecture 17. Types of NN. History. Back Propagation. Recurrent (feedback during operation) Feedforward ECE 47/57 - Lecture 7 Back Propagation Types of NN Recurrent (feedback during operation) n Hopfield n Kohonen n Associative memory Feedforward n No feedback during operation or testing (only during determination

More information

Artificial Neural Networks

Artificial Neural Networks Introduction ANN in Action Final Observations Application: Poverty Detection Artificial Neural Networks Alvaro J. Riascos Villegas University of los Andes and Quantil July 6 2018 Artificial Neural Networks

More information

A novel intelligent predictive maintenance procedure for electrical machines

A novel intelligent predictive maintenance procedure for electrical machines A novel intelligent predictive maintenance procedure for electrical machines D.-M. Yang Department of Automation Engineering, Kao-Yuan University, No.1821 Chung-Shan Road, Loju Hsiang, Kaohsiung County,

More information

Computational Intelligence Winter Term 2017/18

Computational Intelligence Winter Term 2017/18 Computational Intelligence Winter Term 207/8 Prof. Dr. Günter Rudolph Lehrstuhl für Algorithm Engineering (LS ) Fakultät für Informatik TU Dortmund Plan for Today Single-Layer Perceptron Accelerated Learning

More information

Neural networks. Chapter 20. Chapter 20 1

Neural networks. Chapter 20. Chapter 20 1 Neural networks Chapter 20 Chapter 20 1 Outline Brains Neural networks Perceptrons Multilayer networks Applications of neural networks Chapter 20 2 Brains 10 11 neurons of > 20 types, 10 14 synapses, 1ms

More information

ARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD

ARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD ARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD WHAT IS A NEURAL NETWORK? The simplest definition of a neural network, more properly referred to as an 'artificial' neural network (ANN), is provided

More information

A Reservoir Sampling Algorithm with Adaptive Estimation of Conditional Expectation

A Reservoir Sampling Algorithm with Adaptive Estimation of Conditional Expectation A Reservoir Sampling Algorithm with Adaptive Estimation of Conditional Expectation Vu Malbasa and Slobodan Vucetic Abstract Resource-constrained data mining introduces many constraints when learning from

More information

LIMITATIONS OF RECEPTRON. XOR Problem The failure of the perceptron to successfully simple problem such as XOR (Minsky and Papert).

LIMITATIONS OF RECEPTRON. XOR Problem The failure of the perceptron to successfully simple problem such as XOR (Minsky and Papert). LIMITATIONS OF RECEPTRON XOR Problem The failure of the ercetron to successfully simle roblem such as XOR (Minsky and Paert). x y z x y z 0 0 0 0 0 0 Fig. 4. The exclusive-or logic symbol and function

More information

AI Programming CS F-20 Neural Networks

AI Programming CS F-20 Neural Networks AI Programming CS662-2008F-20 Neural Networks David Galles Department of Computer Science University of San Francisco 20-0: Symbolic AI Most of this class has been focused on Symbolic AI Focus or symbols

More information

Learning and Memory in Neural Networks

Learning and Memory in Neural Networks Learning and Memory in Neural Networks Guy Billings, Neuroinformatics Doctoral Training Centre, The School of Informatics, The University of Edinburgh, UK. Neural networks consist of computational units

More information

Introduction Neural Networks - Architecture Network Training Small Example - ZIP Codes Summary. Neural Networks - I. Henrik I Christensen

Introduction Neural Networks - Architecture Network Training Small Example - ZIP Codes Summary. Neural Networks - I. Henrik I Christensen Neural Networks - I Henrik I Christensen Robotics & Intelligent Machines @ GT Georgia Institute of Technology, Atlanta, GA 30332-0280 hic@cc.gatech.edu Henrik I Christensen (RIM@GT) Neural Networks 1 /

More information

Neural Networks biological neuron artificial neuron 1

Neural Networks biological neuron artificial neuron 1 Neural Networks biological neuron artificial neuron 1 A two-layer neural network Output layer (activation represents classification) Weighted connections Hidden layer ( internal representation ) Input

More information

Selection of the Appropriate Lag Structure of Foreign Exchange Rates Forecasting Based on Autocorrelation Coefficient

Selection of the Appropriate Lag Structure of Foreign Exchange Rates Forecasting Based on Autocorrelation Coefficient Selection of the Appropriate Lag Structure of Foreign Exchange Rates Forecasting Based on Autocorrelation Coefficient Wei Huang 1,2, Shouyang Wang 2, Hui Zhang 3,4, and Renbin Xiao 1 1 School of Management,

More information

dissolved into methanol (20 ml) to form a solution. 2-methylimidazole (263 mg) was dissolved in

dissolved into methanol (20 ml) to form a solution. 2-methylimidazole (263 mg) was dissolved in Experimental section Synthesis of small-sized ZIF-8 particles (with average diameter of 50 nm): Zn(NO 3 ) 2 (258 mg) was dissolved into methanol (20 ml) to form a solution. 2-methylimidazole (263 mg) was

More information

Neural Networks DWML, /25

Neural Networks DWML, /25 DWML, 2007 /25 Neural networks: Biological and artificial Consider humans: Neuron switching time 0.00 second Number of neurons 0 0 Connections per neuron 0 4-0 5 Scene recognition time 0. sec 00 inference

More information

Neural Networks. Nicholas Ruozzi University of Texas at Dallas

Neural Networks. Nicholas Ruozzi University of Texas at Dallas Neural Networks Nicholas Ruozzi University of Texas at Dallas Handwritten Digit Recognition Given a collection of handwritten digits and their corresponding labels, we d like to be able to correctly classify

More information

ECE521 Lecture 7/8. Logistic Regression

ECE521 Lecture 7/8. Logistic Regression ECE521 Lecture 7/8 Logistic Regression Outline Logistic regression (Continue) A single neuron Learning neural networks Multi-class classification 2 Logistic regression The output of a logistic regression

More information

Neural Networks. CSE 6363 Machine Learning Vassilis Athitsos Computer Science and Engineering Department University of Texas at Arlington

Neural Networks. CSE 6363 Machine Learning Vassilis Athitsos Computer Science and Engineering Department University of Texas at Arlington Neural Networks CSE 6363 Machine Learning Vassilis Athitsos Computer Science and Engineering Department University of Texas at Arlington 1 Perceptrons x 0 = 1 x 1 x 2 z = h w T x Output: z x D A perceptron

More information

Introduction Biologically Motivated Crude Model Backpropagation

Introduction Biologically Motivated Crude Model Backpropagation Introduction Biologically Motivated Crude Model Backpropagation 1 McCulloch-Pitts Neurons In 1943 Warren S. McCulloch, a neuroscientist, and Walter Pitts, a logician, published A logical calculus of the

More information

Computational Intelligence Lecture 3: Simple Neural Networks for Pattern Classification

Computational Intelligence Lecture 3: Simple Neural Networks for Pattern Classification Computational Intelligence Lecture 3: Simple Neural Networks for Pattern Classification Farzaneh Abdollahi Department of Electrical Engineering Amirkabir University of Technology Fall 2011 arzaneh Abdollahi

More information

Introduction to Machine Learning

Introduction to Machine Learning Introduction to Machine Learning Neural Networks Varun Chandola x x 5 Input Outline Contents February 2, 207 Extending Perceptrons 2 Multi Layered Perceptrons 2 2. Generalizing to Multiple Labels.................

More information

Computational statistics

Computational statistics Computational statistics Lecture 3: Neural networks Thierry Denœux 5 March, 2016 Neural networks A class of learning methods that was developed separately in different fields statistics and artificial

More information

Introduction to Artificial Neural Networks

Introduction to Artificial Neural Networks Facultés Universitaires Notre-Dame de la Paix 27 March 2007 Outline 1 Introduction 2 Fundamentals Biological neuron Artificial neuron Artificial Neural Network Outline 3 Single-layer ANN Perceptron Adaline

More information

sensors ISSN by MDPI

sensors ISSN by MDPI Sensors 26, 6, 324-334 sensors ISSN 1424-822 26 by MDPI http://www.mdpi.org/sensors Quartz Crystal Nanobalance in Conjunction with Principal Component Analysis for Identification of Volatile Organic Compounds

More information

INTRODUCTION TO SCA\ \I\G TUNNELING MICROSCOPY

INTRODUCTION TO SCA\ \I\G TUNNELING MICROSCOPY INTRODUCTION TO SCA\ \I\G TUNNELING MICROSCOPY SECOND EDITION C. JULIAN CHEN Department of Applied Physics and Applied Mathematics, Columbia University, New York OXFORD UNIVERSITY PRESS Contents Preface

More information

Artificial Neural Network

Artificial Neural Network Artificial Neural Network Contents 2 What is ANN? Biological Neuron Structure of Neuron Types of Neuron Models of Neuron Analogy with human NN Perceptron OCR Multilayer Neural Network Back propagation

More information

Multilayer Neural Networks and the Backpropagation Algorithm

Multilayer Neural Networks and the Backpropagation Algorithm Module 3 Multilayer Neural Networs and the Bacpropagation Algorithm Prof. Marzui Bin Khalid CAIRO Faulti Keuruteraan Eletri Universiti Tenologi Malaysia marzui@utml.utm.my 1 UTM Module 3 Obectives To understand

More information

Lecture 4: Perceptrons and Multilayer Perceptrons

Lecture 4: Perceptrons and Multilayer Perceptrons Lecture 4: Perceptrons and Multilayer Perceptrons Cognitive Systems II - Machine Learning SS 2005 Part I: Basic Approaches of Concept Learning Perceptrons, Artificial Neuronal Networks Lecture 4: Perceptrons

More information

Feedforward Neural Nets and Backpropagation

Feedforward Neural Nets and Backpropagation Feedforward Neural Nets and Backpropagation Julie Nutini University of British Columbia MLRG September 28 th, 2016 1 / 23 Supervised Learning Roadmap Supervised Learning: Assume that we are given the features

More information

Lecture 7 Artificial neural networks: Supervised learning

Lecture 7 Artificial neural networks: Supervised learning Lecture 7 Artificial neural networks: Supervised learning Introduction, or how the brain works The neuron as a simple computing element The perceptron Multilayer neural networks Accelerated learning in

More information

ULTRASONIC CHEMICAL SENSOR FOR DETECTION OF ALIPHATIC AND AROMATIC HYDROCARBONS IN AIR

ULTRASONIC CHEMICAL SENSOR FOR DETECTION OF ALIPHATIC AND AROMATIC HYDROCARBONS IN AIR ARCHIVES OF ACOUSTICS 32, 4 (Supplement), 53 58 (2007) ULTRASONIC CHEMICAL SENSOR FOR DETECTION OF ALIPHATIC AND AROMATIC HYDROCARBONS IN AIR Andrzej BALCERZAK (1), Genady ZHAVNERKO (2) (1) Institute of

More information

CS:4420 Artificial Intelligence

CS:4420 Artificial Intelligence CS:4420 Artificial Intelligence Spring 2018 Neural Networks Cesare Tinelli The University of Iowa Copyright 2004 18, Cesare Tinelli and Stuart Russell a a These notes were originally developed by Stuart

More information

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others)

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others) Machine Learning Neural Networks (slides from Domingos, Pardo, others) For this week, Reading Chapter 4: Neural Networks (Mitchell, 1997) See Canvas For subsequent weeks: Scaling Learning Algorithms toward

More information

STA 414/2104: Lecture 8

STA 414/2104: Lecture 8 STA 414/2104: Lecture 8 6-7 March 2017: Continuous Latent Variable Models, Neural networks Delivered by Mark Ebden With thanks to Russ Salakhutdinov, Jimmy Ba and others Outline Continuous latent variable

More information

Last update: October 26, Neural networks. CMSC 421: Section Dana Nau

Last update: October 26, Neural networks. CMSC 421: Section Dana Nau Last update: October 26, 207 Neural networks CMSC 42: Section 8.7 Dana Nau Outline Applications of neural networks Brains Neural network units Perceptrons Multilayer perceptrons 2 Example Applications

More information