Application of Hopfield network in improvement of fingerprint recognition process Mahmoud Alborzi 1, Abbas Toloie-Eshlaghy 1 and Dena Bazazian 2
Available online at Computer Science and Engineering, Elixir Comp. Sci. & Engg. 41 (2011)

Application of Hopfield network in improvement of the fingerprint recognition process
Mahmoud Alborzi 1, Abbas Toloie-Eshlaghy 1 and Dena Bazazian 2
1 Industrial Management Department, Science and Research Branch, Islamic Azad University, Tehran, Iran
2 Information Technology Management Department, Science and Research Branch, Islamic Azad University, Tehran, Iran

ARTICLE INFO
Article history: Received: 25 September 2011; Received in revised form: 18 November 2011; Accepted: 29 November 2011.
Keywords: Fingerprint image, Noise, Hopfield Network, Hamming Distance.

ABSTRACT
Because it can act as a content-addressable memory, a Hopfield network can convert noisy input data into noiseless data, or data with minimal noise, after being trained on a set of reference patterns. In this research, fingerprint recognition is performed using the Hamming distance, and the effect of applying a Hopfield network before recognition is examined. When the Hamming distance comparison is performed after Hopfield network processing, the recognition error is reduced. © 2011 Elixir All rights reserved.

Introduction
The Hopfield network is one of the neural networks used in the present research, and it has useful applications in fingerprint recognition. Fingerprint images used for identification may contain noise; for example, the fingertip may be injured. In the present research we study to what extent a Hopfield network can improve noisy or defective fingerprint images and thus be applied in the recognition process. Fingerprint images are binarized after being read in, so that they can be processed by the Hopfield network. The Hamming distance is regarded as the best option because it applies naturally to binary data: it finds the closest matching image by measuring the difference between the input image and each stored image. Each person's fingerprint is a unique biological trait and can be used effectively for identification.
An identification system based on a biological trait requires that the trait be distinctive, collectable and stable; the fingerprint has these properties, which is why it is accepted as valid evidence by judicial authorities (Daliri et al., 2007). Fingerprint images record the ridge pattern of the fingertip tissue; the unevenness of the fingertip is due to its wrinkles (Jain et al., 1997). By analyzing and studying the wrinkles of the fingertip, an identification system can be built on this biological trait.

Preprocessing operations on fingerprint images
In the present research, 40 fingerprint images were used as the model set for Hopfield network learning. All fingerprint images are converted to black-and-white format and resized so that all images have equal dimensions. The original fingerprint images are large and therefore expensive to process, so the images are reduced to a smaller size. The background of each fingerprint image is removed and the contrast of the image is increased, so that the ridges become more evident. The stages of the preprocessing operations on one of the images are shown in Figure 1. After preprocessing the model set, the Hopfield operations are performed.

Figure 1: Preprocessing operations on a fingerprint image

Recall
In the present research, all image operations were performed in MATLAB. When an image is read in, its matrix is obtained; the matrix entries are the gray levels, from 0 to 255, of the individual image pixels. In the next stage the images are binarized, and because the studied method is the Hopfield network, in which every state must be +1 or -1, all zeros are then converted to -1 while the +1 values remain unchanged. Figure 2 shows the matrices of an image at each stage: first all entries lie between 0 and 255, then they are 0 and 1, and finally they are -1 and +1.

Figure 2: Fingerprint matrices

Corresponding author e-mail: toloie@gmail.com, toloie@srbiau.ac.ir
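The paper performs these steps in MATLAB; purely as an illustration, the two-stage binarization described above (gray levels 0..255, then {0, 1}, then {-1, +1}) might be sketched in Python as follows, where the threshold value of 128 is an assumption, not a parameter given in the paper:

```python
def binarize(gray, threshold=128):
    """Convert a grayscale matrix (values 0..255) into a bipolar matrix.

    Stage 1: threshold each pixel to 0 or 1.
    Stage 2: map 0 -> -1 and keep the 1s, since every state in the
    Hopfield network must be +1 or -1.
    """
    return [[1 if pixel >= threshold else -1 for pixel in row] for row in gray]

print(binarize([[0, 200], [255, 100]]))  # [[-1, 1], [1, -1]]
```

Any pixel at or above the (hypothetical) threshold becomes +1 and all others become -1, which combines the two stages of Figure 2 into a single pass.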
Hopfield Network
The Hopfield network is a kind of recurrent artificial neural network invented by John Hopfield. It can act as a system with content-dependent memory built from two-state components. A Hopfield network is composed of a number of nodes, each connected to all other nodes; as the figures show, it is a fully connected network.

Figure 3: Different Hopfield network components

The nodes of a Hopfield network are two-state components: each can be in one of two defined states, and its state is determined by whether the total input to the component is less than or greater than its threshold value. In a Hopfield network all neurons act alike, and there is no division into input and output neurons. The Hopfield network is a recurrent network in which the output of each neuron is connected to the inputs of all other neurons, and updating is repeated until the output of the network converges to a fixed point.

Hopfield Network Algorithm
This section gives the algorithm that determines the function of the Hopfield network; it is taken from the book Introduction to Neural Networks (translated by Alborzi, 2010).

Stages of the algorithm. The weight coefficients are determined by the formula

    w_ij = sum over s = 0 .. M-1 of x_i^s x_j^s   for i != j,  with w_ii = 0     (1)

where w_ij is the weight coefficient from node i to node j, x_i^s is the i-th element of sample pattern s with value +1 or -1, the number of patterns is M (numbered 0 to M-1), and the threshold value of every node is zero.

We then give an unknown pattern x to the network at time t = 0:

    u_i(0) = x_i,   0 <= i <= N-1

and iterate until complete convergence:

    u_i(t+1) = f_h( sum over j of w_ij u_j(t) ),   0 <= i <= N-1

where u_i(t) is the output of node i at time t and f_h is the hard-limiting step function of Figure 4, which outputs +1 for positive input and -1 for negative input. The weight coefficients between neurons are determined once, from selected sample patterns of all classes, using formula (1).
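The weight formula and update rule above can be sketched in a few lines of Python (the paper used MATLAB; the tiny 8-element patterns below are illustrative only, not the fingerprint matrices of the paper):

```python
def train(patterns):
    """Formula (1): w_ij = sum over patterns of x_i * x_j, with w_ii = 0."""
    n = len(patterns[0])
    w = [[0] * n for _ in range(n)]
    for x in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += x[i] * x[j]
    return w

def recall(w, x, max_steps=20):
    """Iterate u_i(t+1) = f_h(sum_j w_ij u_j(t)) until the state is fixed."""
    u = list(x)
    for _ in range(max_steps):
        nxt = [1 if sum(w[i][j] * u[j] for j in range(len(u))) >= 0 else -1
               for i in range(len(u))]
        if nxt == u:          # converged to a fixed point
            return nxt
        u = nxt
    return u

# Store two patterns at once (training is a single step), then present a
# noisy copy of the first pattern with one element flipped.
stored = [[1, 1, 1, 1, -1, -1, -1, -1], [1, -1, 1, -1, 1, -1, 1, -1]]
w = train(stored)
noisy = [-1, 1, 1, 1, -1, -1, -1, -1]
print(recall(w, noisy) == stored[0])  # True
```

As the text notes, all patterns are encoded in the weights in one step; recall then consists of clamping the unknown pattern and iterating to a fixed point.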
This stage is the network training stage, in which the network is made to store each model pattern. In the identification stage, the unknown pattern is imposed on the output of the network, and the network is then released to evolve in discrete time steps until it reaches stability and its output is fixed; at that point the network has converged to its final answer. If a corrupted pattern is given to the network, the network produces the corresponding correct pattern, so the network acts as a content-addressable memory. Patterns are not supplied to a Hopfield network one by one; they are all encoded at once by calculating the weight coefficients as above, so network training is done in a single step. The function of the Hopfield network can be summarized as follows: we commission the network (assign the weights); we supply the unknown pattern; we iterate until final convergence.

Processing noisy fingerprint images with the Hopfield network
To process fingerprint images with the Hopfield network, the model set was first given to the network, and noisy images were then supplied to it. In this research, noise was added to the images in the form of lines or points. After an image is made noisy it is binarized, the matrix of the noisy image is compared with the noiseless one, and the matrix elements changed by the noise are identified. The number of noisy elements in the matrices studied as unknown patterns ranges from 0 to 60; Table 1 gives the number of images at each noise level. The matrix of an unknown pattern can reach different results after convergence. Table 2 shows noisy images together with the original noiseless matrix, the noisy matrix, and the final matrix after convergence. The best result obtained from the Hopfield network is the one in the first row of that table, in which the final image is noiseless like the original. As the table shows, not all noisy matrix elements necessarily change back correctly, and sometimes elements that the noise left unchanged change after convergence.
Generally, the results of the Hopfield operations can be divided into four groups:

First answer group: after convergence the final matrix is identical to the original matrix and is noiseless; all noisy elements have returned to their original values. For matrices with zero noisy elements, the first answer group means the studied matrix has not changed at all under the Hopfield operations.
Second answer group: in the final matrix, some noisy elements have been corrected after convergence while other noisy elements have not changed.
Third answer group: in the final matrix, some elements that the noise left unchanged have changed after convergence.
Fourth answer group: the noisy elements have not changed in the final matrix after convergence.

In some cases the result of the Hopfield network is a combination of two answer groups; the combinations that occur are 1 and 3, 2 and 3, and 3 and 4.

Figure 4: Heaviside step function
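The paper assigns these groups by inspecting the matrices; as an illustrative sketch only, the assignment can be automated by comparing the original, noisy and converged matrices element by element (the function name and flat-list representation are assumptions):

```python
def answer_group(original, noisy, final):
    """Classify a converged result into the paper's answer groups.

    Group 1: every noisy element restored to its original value.
    Group 2: some noisy elements restored, some not.
    Group 3: elements the noise left untouched changed after convergence.
    Group 4: no noisy element changed back.
    Combinations (1 and 3, 2 and 3, 3 and 4) are returned as sorted lists.
    """
    noisy_idx = [i for i, (o, n) in enumerate(zip(original, noisy)) if o != n]
    clean_idx = [i for i in range(len(original)) if i not in noisy_idx]
    corrected = [i for i in noisy_idx if final[i] == original[i]]
    spoiled = any(final[i] != original[i] for i in clean_idx)
    groups = set()
    if noisy_idx and len(corrected) == len(noisy_idx):
        groups.add(1)
    elif 0 < len(corrected) < len(noisy_idx):
        groups.add(2)
    elif noisy_idx and not corrected:
        groups.add(4)
    if spoiled:
        groups.add(3)
    return sorted(groups)

original = [1, 1, -1, -1]
noisy    = [1, -1, -1, -1]                             # element 1 flipped
print(answer_group(original, noisy, [1, 1, -1, -1]))   # [1]
print(answer_group(original, noisy, [-1, 1, -1, -1]))  # [1, 3]
```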
Answer group 1 and 3: the noisy elements have changed back correctly, but some elements of the original matrix have changed by mistake.
Answer group 2 and 3: some noisy elements have changed back correctly and some have not, and some elements of the original matrix have changed by mistake.
Answer group 3 and 4: the noisy elements have not changed after convergence, and some elements of the original matrix have changed by mistake.

As mentioned above, 40 fingerprint images were stored in the system. Noisy versions of these 40 images were produced by randomly adding different amounts of noise, giving 100 noisy images in total. These 100 image matrices, with 0 to 60 noisy elements each, were processed with the Hopfield network. The matrices resulting from these operations were then used in the recognition process based on the Hamming distance, and the same 100 images were also put through the recognition process directly, in order to determine the effect of the Hopfield network on recognition.

Hamming Distance
The Hamming distance is obtained by taking the difference between each element of one vector and the corresponding element of another vector and summing the absolute values of the differences. It is designed for vectors of zeros and ones, and given the numbers obtained in our calculations and matrices it is the best option for the recognition process:

    H = sum over i of | x_i - y_i |

With this formula, the stored vector with the minimum sum of absolute differences is found, i.e. the closest vector to the input vector.

Fingerprint recognition process according to the Hamming distance
In the present research, the images are put through the Hamming-distance recognition process both with and without the Hopfield network, which makes the effect of the Hopfield network on recognition explicit. As mentioned in the previous section, 100 images were considered, whose matrices contain 0 to 60 noisy elements. The group of fingerprint images processed with the Hopfield network before recognition had been moved closer to the original noiseless images by the Hopfield operations. As mentioned above, 40 sound images were stored in the system, and noisy versions of these 40 were generated randomly with different noise levels.
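The Hamming-distance formula and the nearest-match search it supports can be sketched as follows (a minimal Python illustration with made-up 4-element vectors, not the paper's image matrices):

```python
def hamming(x, y):
    """H = sum_i |x_i - y_i|: the sum of absolute element-wise differences."""
    return sum(abs(a - b) for a, b in zip(x, y))

def closest(stored, query):
    """Index of the stored vector with minimum Hamming distance to the query."""
    return min(range(len(stored)), key=lambda k: hamming(stored[k], query))

stored = [[0, 1, 1, 0], [1, 1, 0, 0], [0, 0, 0, 1]]
print(closest(stored, [0, 1, 1, 1]))  # 0 (distance 1 to the first vector)
```

On 0/1 vectors this reduces to counting differing positions, which is the classical Hamming distance the text refers to.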
Since several noisy versions of each image are available, this group contains 100 images. In the recognition process, each of the 100 images in turn is taken as the input and compared, using the Hamming distance, with the 40 images saved in the system: the input matrix is compared with each of the 40 saved matrices, and the closest matrix to the input image is announced. The announced results can be classified into four classes of answer:
o If exactly one saved matrix is announced as closest and it is the matrix of the input image, the result of the recognition process is correct.
o If two matrices are equally close to the input matrix and one of them is the matrix of the original image, the result is announced as two images similar to the input image.
o If three matrices are equally close to the input matrix and one of them is the matrix of the original image, the result is announced as three images similar to the input image.
o If the announced matrices do not include the matrix corresponding to the input image, the announced result of the recognition process is incorrect.

The 100 images were therefore put through the Hamming-distance recognition process in two runs, with and without the Hopfield network, and the obtained results are shown in Table 4. According to Table 4, with the Hopfield network there are more correct results and fewer incorrect results. When the noise applied to an image is in the low range, a correct comparison can be made directly on the image itself; as the noise increases, it becomes unlikely that the noiseless original image will be recovered without the Hopfield network.

Study of the Hamming results with regard to the noise in the images
In this part of the article, the results of the recognition process with and without the Hopfield network are compared as a function of the noise in the images; the obtained results are shown in Table 5.
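A hedged sketch of the four answer classes listed above: count how many stored matrices are tied at the minimum Hamming distance and check whether the true one is among them. The tie-handling details are an assumption, since the paper does not spell them out:

```python
def hamming(x, y):
    """Sum of absolute element-wise differences between two vectors."""
    return sum(abs(a - b) for a, b in zip(x, y))

def recognition_result(stored, query, true_index):
    """Classify one recognition attempt into the paper's four answer classes."""
    dists = [hamming(s, query) for s in stored]
    best = min(dists)
    winners = [k for k, d in enumerate(dists) if d == best]
    if true_index not in winners:
        return "incorrect"          # true image not among the closest matches
    if len(winners) == 1:
        return "correct"            # unique closest match is the true image
    if len(winners) == 2:
        return "two similar"
    return "three similar"

stored = [[0, 1, 1, 0], [1, 1, 0, 0], [0, 0, 0, 1]]
print(recognition_result(stored, [0, 1, 1, 1], true_index=0))  # correct
```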
In each cell of the table, the upper number is the result with the Hopfield network and the lower number the result without it. As Table 5 shows, the recognition results with the Hopfield network are on average more often correct. The notable exception concerns the images with no noise or with a single noisy element: for these, more correct results were obtained without the Hopfield network. For images with an average noise level, recognition with the Hopfield network was more often correct and less often incorrect. For images with 50 and 60 noisy elements per matrix there was no correct recognition either with or without the Hopfield network, but there was recognition among several similar images, and the incorrect recognitions were fewer with the Hopfield network than without it. Diagrams 3 to 9 show the percentages of correct results, recognition of two similar images, recognition of three similar images, and incorrect results as a function of the noise in the images.
Conclusion
In this article, the effect of the Hopfield network on improvement of the fingerprint recognition process was examined. The Hopfield network converted images with average noise into images without noise, or with the least noise. When the Hopfield network is run before the recognition process, the recognition error is reduced. The Hopfield network can act as a content-addressable memory and is able to convert noisy images into noiseless images, or images with the least noise. When the Hopfield operations are performed on images with average noise and the resulting images are used in the recognition process, the error rate is reduced; the Hopfield network is therefore able to improve the fingerprint recognition process.

References
[1] Alborzi, M. 2010. Introduction to Neural Networks. Second edition. Tehran, Sharif University of Technology Publication Institute.
[2] Daliri, A., Osareh, A. 2007. Biometric security system. Bioelectronics Seminar, Biomedical Engineering Faculty, Amirkabir University of Technology.
[3] Abd Allah, M. 2005. Artificial neural networks based fingerprint authentication with clusters algorithm. Informatica, 29.
[4] Baldi, P., Chauvin, Y. Neural networks for fingerprint recognition. Massachusetts Institute of Technology, Neural Computation 5.
[5] Choudhary, P. 2009. Human recognition using multiple fingerprints. B.Tech. Project Report, Department of Computer Science and Engineering, Indian Institute of Technology, Kanpur, India.
[6] Hopfield, J. J. 1982. Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences of the USA, vol. 79, no. 8.
[7] Jain, A. K., Hong, L., Pankanti, S., Bolle, R. 1997. An identity authentication system using fingerprints. Proc. IEEE, vol. 85, no. 9.
[8] Weber, D. M. A cost effective fingerprint verification algorithm for commercial applications. IEEE.
[9] Yang, J., Govindaraju, V. 2005. A minutia-based partial fingerprint recognition system. Pattern Recognition, 38.
[10] Yi, S., Jie-Ge, L., Sung, Y. Improvement on performance of modified Hopfield neural network for image restoration. IEEE Transactions on Image Processing, vol. 4, no. 5.

Table 1: Noisy fingerprint images and the noise rate (number of noisy elements) in each matrix
Table 2: Results of the studied matrices with the Hopfield network

Table 3: Analysis of Hopfield network results by noise level
    16%  64%  37%  44%   matrices which reached the first answer group
    11%  14%  37%  56%   matrices which reached the first and third answer group
    12%  3%   44%  21%   matrices which reached the fourth answer group
    9%   81%  6%   22%   matrices which reached the second and third answer group
    6%   1%   6%         matrices which reached the third and fourth answer group
    1%   5%              matrices which reached the fourth answer group
Table 4: Study of the effect of the Hopfield network by comparing the results obtained from the Hamming distance
                                                    Without Hopfield Network | With Hopfield Network
    Total number of images, Hamming distance                  100            |          100
    Number of results where Hamming recognition is correct
    Number of results with two similar images (Hamming)
    Number of results with three similar images (Hamming)
    Number of results where Hamming recognition is incorrect

Table 5: Study of the noise and the effect of the Hopfield network on the recognition process according to the Hamming distance. Columns: number of noisy elements in the image matrix; rows: correct recognition, recognition of two similar images, recognition of three similar images, incorrect recognition. In each cell the upper number is the result with the Hopfield network and the lower number the result without it.
Learning Vector Quantization (LVQ) Introduction to Neural Computation : Guest Lecture 2 John A. Bullinaria, 2007 1. The SOM Architecture and Algorithm 2. What is Vector Quantization? 3. The Encoder-Decoder
More informationCellular Automata Evolution for Pattern Recognition
Cellular Automata Evolution for Pattern Recognition Pradipta Maji Center for Soft Computing Research Indian Statistical Institute, Kolkata, 700 108, INDIA Under the supervision of Prof. P Pal Chaudhuri
More informationIntelligent Handwritten Digit Recognition using Artificial Neural Network
RESEARCH ARTICLE OPEN ACCESS Intelligent Handwritten Digit Recognition using Artificial Neural Networ Saeed AL-Mansoori Applications Development and Analysis Center (ADAC), Mohammed Bin Rashid Space Center
More informationCOMS 4771 Introduction to Machine Learning. Nakul Verma
COMS 4771 Introduction to Machine Learning Nakul Verma Announcements HW1 due next lecture Project details are available decide on the group and topic by Thursday Last time Generative vs. Discriminative
More informationEigenface-based facial recognition
Eigenface-based facial recognition Dimitri PISSARENKO December 1, 2002 1 General This document is based upon Turk and Pentland (1991b), Turk and Pentland (1991a) and Smith (2002). 2 How does it work? The
More informationLecture 4: Feed Forward Neural Networks
Lecture 4: Feed Forward Neural Networks Dr. Roman V Belavkin Middlesex University BIS4435 Biological neurons and the brain A Model of A Single Neuron Neurons as data-driven models Neural Networks Training
More information7 Rate-Based Recurrent Networks of Threshold Neurons: Basis for Associative Memory
Physics 178/278 - David Kleinfeld - Fall 2005; Revised for Winter 2017 7 Rate-Based Recurrent etworks of Threshold eurons: Basis for Associative Memory 7.1 A recurrent network with threshold elements The
More informationIntroduction To Artificial Neural Networks
Introduction To Artificial Neural Networks Machine Learning Supervised circle square circle square Unsupervised group these into two categories Supervised Machine Learning Supervised Machine Learning Supervised
More informationSmall sample size generalization
9th Scandinavian Conference on Image Analysis, June 6-9, 1995, Uppsala, Sweden, Preprint Small sample size generalization Robert P.W. Duin Pattern Recognition Group, Faculty of Applied Physics Delft University
More informationNeural Networks with Applications to Vision and Language. Feedforward Networks. Marco Kuhlmann
Neural Networks with Applications to Vision and Language Feedforward Networks Marco Kuhlmann Feedforward networks Linear separability x 2 x 2 0 1 0 1 0 0 x 1 1 0 x 1 linearly separable not linearly separable
More informationContent-Addressable Memory Associative Memory Lernmatrix Association Heteroassociation Learning Retrieval Reliability of the answer
Associative Memory Content-Addressable Memory Associative Memory Lernmatrix Association Heteroassociation Learning Retrieval Reliability of the answer Storage Analysis Sparse Coding Implementation on a
More informationNeural Networks for Machine Learning. Lecture 2a An overview of the main types of neural network architecture
Neural Networks for Machine Learning Lecture 2a An overview of the main types of neural network architecture Geoffrey Hinton with Nitish Srivastava Kevin Swersky Feed-forward neural networks These are
More informationWeight Initialization Methods for Multilayer Feedforward. 1
Weight Initialization Methods for Multilayer Feedforward. 1 Mercedes Fernández-Redondo - Carlos Hernández-Espinosa. Universidad Jaume I, Campus de Riu Sec, Edificio TI, Departamento de Informática, 12080
More informationArtificial Neural Networks Examination, June 2004
Artificial Neural Networks Examination, June 2004 Instructions There are SIXTY questions (worth up to 60 marks). The exam mark (maximum 60) will be added to the mark obtained in the laborations (maximum
More informationArtificial Intelligence
Artificial Intelligence Jeff Clune Assistant Professor Evolving Artificial Intelligence Laboratory Announcements Be making progress on your projects! Three Types of Learning Unsupervised Supervised Reinforcement
More informationARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92
ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92 BIOLOGICAL INSPIRATIONS Some numbers The human brain contains about 10 billion nerve cells (neurons) Each neuron is connected to the others through 10000
More informationCourse 395: Machine Learning - Lectures
Course 395: Machine Learning - Lectures Lecture 1-2: Concept Learning (M. Pantic) Lecture 3-4: Decision Trees & CBC Intro (M. Pantic & S. Petridis) Lecture 5-6: Evaluating Hypotheses (S. Petridis) Lecture
More informationBiometrics: Introduction and Examples. Raymond Veldhuis
Biometrics: Introduction and Examples Raymond Veldhuis 1 Overview Biometric recognition Face recognition Challenges Transparent face recognition Large-scale identification Watch list Anonymous biometrics
More informationAssociative Neural Networks using Matlab
Associative Neural Networks using Matlab Example 1: Write a matlab program to find the weight matrix of an auto associative net to store the vector (1 1-1 -1). Test the response of the network by presenting
More informationShort Term Memory Quantifications in Input-Driven Linear Dynamical Systems
Short Term Memory Quantifications in Input-Driven Linear Dynamical Systems Peter Tiňo and Ali Rodan School of Computer Science, The University of Birmingham Birmingham B15 2TT, United Kingdom E-mail: {P.Tino,
More informationPart 8: Neural Networks
METU Informatics Institute Min720 Pattern Classification ith Bio-Medical Applications Part 8: Neural Netors - INTRODUCTION: BIOLOGICAL VS. ARTIFICIAL Biological Neural Netors A Neuron: - A nerve cell as
More informationCOMP-4360 Machine Learning Neural Networks
COMP-4360 Machine Learning Neural Networks Jacky Baltes Autonomous Agents Lab University of Manitoba Winnipeg, Canada R3T 2N2 Email: jacky@cs.umanitoba.ca WWW: http://www.cs.umanitoba.ca/~jacky http://aalab.cs.umanitoba.ca
More informationSTUDY ON METHODS FOR COMPUTER-AIDED TOOTH SHADE DETERMINATION
INTERNATIONAL JOURNAL OF INFORMATION AND SYSTEMS SCIENCES Volume 5, Number 3-4, Pages 351 358 c 2009 Institute for Scientific Computing and Information STUDY ON METHODS FOR COMPUTER-AIDED TOOTH SHADE DETERMINATION
More informationOn Computational Limitations of Neural Network Architectures
On Computational Limitations of Neural Network Architectures Achim Hoffmann + 1 In short A powerful method for analyzing the computational abilities of neural networks based on algorithmic information
More informationAn artificial neural networks (ANNs) model is a functional abstraction of the
CHAPER 3 3. Introduction An artificial neural networs (ANNs) model is a functional abstraction of the biological neural structures of the central nervous system. hey are composed of many simple and highly
More informationMultilayer Neural Networks. (sometimes called Multilayer Perceptrons or MLPs)
Multilayer Neural Networks (sometimes called Multilayer Perceptrons or MLPs) Linear separability Hyperplane In 2D: w x + w 2 x 2 + w 0 = 0 Feature x 2 = w w 2 x w 0 w 2 Feature 2 A perceptron can separate
More informationBACKPROPAGATION. Neural network training optimization problem. Deriving backpropagation
BACKPROPAGATION Neural network training optimization problem min J(w) w The application of gradient descent to this problem is called backpropagation. Backpropagation is gradient descent applied to J(w)
More informationLearning and Memory in Neural Networks
Learning and Memory in Neural Networks Guy Billings, Neuroinformatics Doctoral Training Centre, The School of Informatics, The University of Edinburgh, UK. Neural networks consist of computational units
More informationMultilayer Neural Networks. (sometimes called Multilayer Perceptrons or MLPs)
Multilayer Neural Networks (sometimes called Multilayer Perceptrons or MLPs) Linear separability Hyperplane In 2D: w 1 x 1 + w 2 x 2 + w 0 = 0 Feature 1 x 2 = w 1 w 2 x 1 w 0 w 2 Feature 2 A perceptron
More informationArtificial Neural Networks
Artificial Neural Networks 鮑興國 Ph.D. National Taiwan University of Science and Technology Outline Perceptrons Gradient descent Multi-layer networks Backpropagation Hidden layer representations Examples
More informationINTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY
[Gaurav, 2(1): Jan., 2013] ISSN: 2277-9655 IJESRT INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY Face Identification & Detection Using Eigenfaces Sachin.S.Gurav *1, K.R.Desai 2 *1
More informationThe Perceptron. Volker Tresp Summer 2014
The Perceptron Volker Tresp Summer 2014 1 Introduction One of the first serious learning machines Most important elements in learning tasks Collection and preprocessing of training data Definition of a
More informationApplication of Associative Matrices to Recognize DNA Sequences in Bioinformatics
Application of Associative Matrices to Recognize DNA Sequences in Bioinformatics 1. Introduction. Jorge L. Ortiz Department of Electrical and Computer Engineering College of Engineering University of Puerto
More informationTHE Back-Prop (Backpropagation) algorithm of Paul
COGNITIVE COMPUTATION The Back-Prop and No-Prop Training Algorithms Bernard Widrow, Youngsik Kim, Yiheng Liao, Dookun Park, and Aaron Greenblatt Abstract Back-Prop and No-Prop, two training algorithms
More informationThe Perceptron. Volker Tresp Summer 2016
The Perceptron Volker Tresp Summer 2016 1 Elements in Learning Tasks Collection, cleaning and preprocessing of training data Definition of a class of learning models. Often defined by the free model parameters
More informationThis is a repository copy of Improving the associative rule chaining architecture.
This is a repository copy of Improving the associative rule chaining architecture. White Rose Research Online URL for this paper: http://eprints.whiterose.ac.uk/75674/ Version: Accepted Version Book Section:
More informationNeural Networks and Fuzzy Logic Rajendra Dept.of CSE ASCET
Unit-. Definition Neural network is a massively parallel distributed processing system, made of highly inter-connected neural computing elements that have the ability to learn and thereby acquire knowledge
More informationNeural Networks. Nicholas Ruozzi University of Texas at Dallas
Neural Networks Nicholas Ruozzi University of Texas at Dallas Handwritten Digit Recognition Given a collection of handwritten digits and their corresponding labels, we d like to be able to correctly classify
More informationSEQUENTIAL SUBSPACE FINDING: A NEW ALGORITHM FOR LEARNING LOW-DIMENSIONAL LINEAR SUBSPACES.
SEQUENTIAL SUBSPACE FINDING: A NEW ALGORITHM FOR LEARNING LOW-DIMENSIONAL LINEAR SUBSPACES Mostafa Sadeghi a, Mohsen Joneidi a, Massoud Babaie-Zadeh a, and Christian Jutten b a Electrical Engineering Department,
More informationA Modified Incremental Principal Component Analysis for On-Line Learning of Feature Space and Classifier
A Modified Incremental Principal Component Analysis for On-Line Learning of Feature Space and Classifier Seiichi Ozawa 1, Shaoning Pang 2, and Nikola Kasabov 2 1 Graduate School of Science and Technology,
More informationExtending the Associative Rule Chaining Architecture for Multiple Arity Rules
Extending the Associative Rule Chaining Architecture for Multiple Arity Rules Nathan Burles, James Austin, and Simon O Keefe Advanced Computer Architectures Group Department of Computer Science University
More information