Application of Hopfield network in improvement of fingerprint recognition process. Mahmoud Alborzi 1, Abbas Toloie-Eshlaghy 1 and Dena Bazazian 2

5797 Available online at www.elixirjournal.org Computer Science and Engineering Elixir Comp. Sci. & Engg. 41 (2011) 5797-5802

Application of Hopfield network in improvement of fingerprint recognition process

Mahmoud Alborzi 1, Abbas Toloie-Eshlaghy 1 and Dena Bazazian 2
1 Industrial management department, Science and Research Branch, Islamic Azad University, Tehran, Iran
2 Information technology management department, Science and Research Branch, Islamic Azad University, Tehran, Iran.

ARTICLE INFO
Article history: Received: 25 September 2011; Received in revised form: 18 November 2011; Accepted: 29 November 2011.
Keywords: Fingerprint image, Noise, Hopfield Network, Hamming Distance.

ABSTRACT
Because it can act as a content-addressable memory, a Hopfield network can convert noisy fingerprint images into noiseless ones, or into images with minimal noise, after being trained on a set of reference patterns. In this research, fingerprint recognition is performed using the Hamming distance, and the effect of applying a Hopfield network on the recognition process is examined. When the Hamming distance comparison is performed after Hopfield network processing, the recognition error rate is reduced. © 2011 Elixir All rights reserved.

Introduction
The Hopfield network is the neural network used in the present research; it can have useful applications in fingerprint recognition. Fingerprints used for identification may contain noise: for example, the fingertip may be injured. In this research we study to what extent a Hopfield network can improve noisy or defective fingerprint images and thus be applied in the recognition process. Fingerprint images are binarized after being read in, so that they can be processed by the Hopfield network. The Hamming distance is regarded as the best matching criterion because it applies directly to binary data: it finds the closest stored image by measuring the element-wise differences between the input image and each stored image.
Fingerprint images
Each person's fingerprint is a unique biological trait and can be used effectively for identification. An identification system based on a biological trait requires the trait to be distinctive, collectable and stable; the fingerprint has these properties, which is why it is accepted as valid evidence by judicial authorities (Daliri et al., 2007). Fingerprint images show the tissue pattern of the fingertip, whose unevenness is due to the ridges and wrinkles of the skin (Jain et al., 1997). By analyzing the wrinkle pattern of the fingertip, a recognition system can be built on this biological trait.

Preprocessing operations on fingerprint images
In the present research, 40 fingerprint images have been used as the model set for Hopfield network learning. All fingertip images are converted to black-and-white format and resized so that all images are of equal size. The original fingertip images have large dimensions, which makes computation difficult; for this reason the images are processed at a smaller size. The background of each fingertip image is removed and the contrast is increased so that the ridges become more evident. The stages of the preprocessing operations on one of the images are shown in Figure 1. After preprocessing the model set, the Hopfield operations are performed.

Figure 1: Preprocessing operations on a fingerprint image

Recalling the images
All operations on the images have been carried out in MATLAB. In this software, the corresponding matrix is obtained after an image is read in. The matrix elements are gray-level values from 0 to 255, one for each image pixel. In the next stage the images are binarized, and because the studied method is the Hopfield network, whose unit states must be +1 and -1, all zeros are then converted to -1 while the +1s remain unchanged. Figure 2 shows the image matrices: in the first stage all matrix elements lie between 0 and 255, then between 0 and 1, and finally between -1 and +1.
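The original work performed this recall and binarization in MATLAB. As an illustrative sketch only (written here in Python with NumPy, which is our substitution; the threshold value of 128 is also our assumption, since the paper does not state one), the gray-level matrix can be thresholded to 0/1 and then mapped to the +1/-1 states the Hopfield network expects:

```python
import numpy as np

def to_bipolar(gray, threshold=128):
    """Map a gray-level matrix (0..255) to +1/-1 Hopfield states.
    Dark pixels (ridges) -> +1, light pixels (background) -> -1.
    The threshold of 128 and the polarity are assumptions; the
    paper only says the image is binarized and zeros become -1."""
    gray = np.asarray(gray)
    binary = (gray < threshold).astype(int)  # stage 2: 0/1 image
    return 2 * binary - 1                    # stage 3: 0 -> -1, 1 -> +1

# Tiny worked example on a 2x3 "image"
gray = np.array([[0, 200, 30],
                 [255, 10, 128]])
print(to_bipolar(gray))
```

This mirrors the three stages shown in Figure 2: gray levels, then a 0/1 matrix, then a -1/+1 matrix.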
Figure 2: Fingerprint matrices

E-mail addresses: toloie@gmail.com, toloie@srbiau.ac.ir
© 2011 Elixir All rights reserved

5798
Hopfield network
The Hopfield network is a kind of recurrent artificial neural network invented by John Hopfield. It can play the role of a content-addressable memory built from two-state components. The network is composed of a number of nodes, each connected to all the other nodes; as Figure 3 shows, it is a fully connected network.

Figure 3: Different Hopfield network components

The components of a Hopfield network are two-state units: each can be in one of two defined states, and its state is determined by whether the summed input to the unit is below or above its threshold value. In a Hopfield network all neurons act alike; there is no division into input and output neurons. The network is recurrent: the output of each neuron is connected to the inputs of all the other neurons, and updating is repeated until the output of the network converges to a fixed point.

Hopfield network algorithm
This section presents the algorithm that determines the function of the Hopfield network. The algorithm is taken from the book Introduction to Neural Networks (translated by Alborzi, 2010).

Stages of the algorithm: the weight coefficients are determined with the following formula:

w_ij = Σ_{s=0}^{M-1} x_i^s x_j^s for i ≠ j, and w_ii = 0, 0 ≤ i, j ≤ N-1    (1)

where w_ij is the weight coefficient from node i to node j, and x_i^s is the i-th element of sample pattern s, taking the values +1 and -1. The number of patterns is M, indexed from 0 to M-1, and the threshold value of all nodes is zero.

We then present an unknown pattern x to the network at time t = 0:

μ_i(0) = x_i, 0 ≤ i ≤ N-1

and iterate until convergence:

μ_i(t+1) = f_h( Σ_j w_ij μ_j(t) ), 0 ≤ i ≤ N-1

where μ_i(t) is the output of node i at time t and f_h is the Heaviside step function (Figure 4), which outputs one for positive input and zero for negative input. The weight coefficients between the neurons are determined from selected sample patterns of all classes with formula (1).
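The algorithm above can be sketched compactly in code. This is our own Python/NumPy rendering (the paper worked in MATLAB), and where the paper describes a 0/1 Heaviside output we use the sign function, which is the equivalent thresholding in the +1/-1 coding the paper adopts:

```python
import numpy as np

def train_hopfield(patterns):
    """Formula (1): w_ij = sum_s x_i^s x_j^s, with w_ii = 0.
    `patterns` is an (M, N) array of +1/-1 vectors; as the text
    notes, training is done in this single step."""
    X = np.asarray(patterns)
    W = X.T @ X                    # sum of outer products over the M patterns
    np.fill_diagonal(W, 0)         # no self-connections
    return W

def recall(W, x, max_steps=100):
    """Impose the unknown pattern on the outputs, then update in
    discrete time steps until the state stops changing (a fixed
    point). Thresholding at zero keeps states in {+1, -1}."""
    mu = np.asarray(x).copy()
    for _ in range(max_steps):
        nxt = np.where(W @ mu >= 0, 1, -1)   # mu(t+1) = f(W mu(t))
        if np.array_equal(nxt, mu):          # converged to a fixed point
            break
        mu = nxt
    return mu
```

With a single stored pattern, a copy with one flipped element is pulled back to the stored pattern, illustrating the content-addressable behaviour described in the next section.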
This stage is the training stage of the network, in which each pattern is stored. In the identification stage, the unknown pattern is imposed on the outputs of the network, and the network is then released to change state in discrete time steps until it reaches stability and its output is fixed; at that point the network has converged to its final answer. If a corrupted pattern is given to the network, the network reproduces the corresponding correct pattern, so the network acts as a content-addressable memory. In a Hopfield network the patterns are not supplied to the network one by one; they are all stored at once by computing the weight coefficients as above, so network training is done in a single step. The operation of the Hopfield network can be summarized as follows: we commission the network (compute the weights), we supply the unknown pattern, and we iterate until final convergence.

Processing noisy images with the Hopfield network
To process images with the Hopfield network, a set of fingerprint images is first stored in the network, and noisy images are then supplied to it. In this research, noise was added to the images in the form of lines or points. After an image is made noisy it is binarized, its matrix is compared with the noiseless matrix, and the pixels changed by the noise are counted. The number of noisy pixels in the matrices studied as unknown patterns ranges from 0 to 6; Table 1 gives the number of images at each noise level. The matrix of an unknown pattern can reach different results after convergence. In the following table, four noisy images are given together with their original noiseless matrices, their noisy matrices, and the final matrices after convergence. The best result obtained from the Hopfield network is the one in the first row of that table: in this case the final image is noiseless, like the original. As the table also shows, not all noisy pixels of a matrix necessarily change back correctly, and sometimes pixels that were unaffected by the noise change after convergence.
In general, the results obtained from the Hopfield operations can be divided into four groups:

First answer group: the final matrix equals the original matrix; the image is noiseless after convergence, all noisy pixels having been restored to their original values. Matrices with zero noisy pixels belong to this group, since the Hopfield operations did not change them.
Second answer group: in the final matrix, some noisy pixels have been corrected after convergence and some have remained unchanged.
Third answer group: in the final matrix, some pixels that were unaffected by the noise have changed after convergence.
Fourth answer group: the noisy pixels have not changed in the final matrix after convergence.

In some cases the result of the Hopfield network is a combination of two answer groups; the combinations observed are 1 and 3, 2 and 3, and 3 and 4.

Figure 4: Heaviside step function
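The four answer groups can be stated precisely by comparing, pixel by pixel, the original, noisy, and converged matrices. The helper below is our own hypothetical formalization of the groups as described (the paper gives no code, and edge-case conventions are our assumptions):

```python
import numpy as np

def answer_groups(original, noisy, final):
    """Return the set of answer groups (1-4) a result belongs to.
    Group 1: final matrix equals the original (all noise removed).
    Group 2: some noisy pixels corrected, some left unchanged.
    Group 3: some pixels untouched by the noise were changed.
    Group 4: no noisy pixel was changed by convergence."""
    original, noisy, final = map(np.asarray, (original, noisy, final))
    noised = original != noisy               # pixels the noise flipped
    clean = ~noised                          # pixels the noise left alone
    fixed = noised & (final == original)     # noisy pixels corrected
    stuck = noised & (final == noisy)        # noisy pixels unchanged
    groups = set()
    if np.array_equal(final, original):
        groups.add(1)
    if fixed.any() and stuck.any():
        groups.add(2)
    if (clean & (final != original)).any():
        groups.add(3)
    if noised.any() and not fixed.any():
        groups.add(4)
    return groups
```

A combined result such as "3 and 4" then simply corresponds to the function returning the set {3, 4}.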

5799
Answer group 1 and 3: the noisy pixels have changed correctly, but some pixels of the original matrix have changed by mistake.
Answer group 2 and 3: some noisy pixels have changed correctly and some have not; in addition, some pixels of the original matrix have changed by mistake.
Answer group 3 and 4: the noisy pixels have not changed after convergence, but some original pixels have changed by mistake.

As mentioned above, 40 fingerprint images have been stored in the system. These 40 images were corrupted with different random noises; since several noisy versions exist for each image, the total number of noisy images is 100. These 100 image matrices, with 0 to 6 noisy pixels each, were processed with the Hopfield network. The matrices resulting from these operations are then used in the recognition process based on the Hamming distance, and the same 100 images are also run through the recognition process without Hopfield processing, in order to determine the effect of the Hopfield network on recognition.

Hamming distance
The Hamming distance is obtained by taking the difference between each element of a vector and the corresponding element of another vector and summing the absolute values of the differences. It is suited to vectors of zeros and ones, and given the numbers obtained in our calculations and matrices, it is the best option for the recognition process:

H = Σ_i | x_i - y_i |

Minimizing this sum of absolute differences over the stored vectors yields the stored vector closest to the input vector.

Fingerprint recognition process based on the Hamming distance
In the present research, the images are passed through the Hamming-distance recognition process both with and without Hopfield processing, so that the effect of the Hopfield network on recognition can be measured. As mentioned in the previous section, 100 images are considered, whose matrices contain 0 to 6 noisy pixels. The group of images processed with the Hopfield network moves closer to the original noiseless images after the Hopfield operations. As before, the 40 clean fingerprint images are stored in the system, and the 100 noisy images are derived from them by adding different random noises.
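This matching step can be sketched as follows (again in Python; the function names are ours): compute H for the input against every stored matrix and report all stored images attaining the minimum distance, so that ties naturally produce the "two similar" and "three similar" cases discussed in the results:

```python
import numpy as np

def hamming(x, y):
    """H = sum_i |x_i - y_i| between two vectors (0/1 coding)."""
    return int(np.abs(np.asarray(x) - np.asarray(y)).sum())

def closest_stored(stored, query):
    """Return the indices of all stored vectors at minimum Hamming
    distance from the query; more than one index means several
    equally similar stored images (a tie)."""
    dists = [hamming(s, query) for s in stored]
    best = min(dists)
    return [i for i, d in enumerate(dists) if d == best]
```

Note that on +1/-1 vectors the same formula counts 2 per mismatching element rather than 1, which does not change which stored vector is closest.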
Since several noisy versions of the images are available, the number of images in this group is 100. In the recognition process, each of the 100 images in turn is taken as input and compared, by Hamming distance, with the 40 images saved in the system; the stored matrix closest to the input image is reported. The results of the recognition process can be classified into four classes of answer:
o If exactly one stored matrix is reported as closest to the input matrix, and it is the matrix the input was derived from, the result of the recognition process is correct.
o If two stored matrices are reported as equally close to the input matrix, and one of them is the matrix of the original image, the result is reported as two similar images.
o If three stored matrices are reported as equally close to the input matrix, and one of them is the matrix of the original image, the result is reported as three similar images.
o If none of the reported matrices corresponds to the input image, the result of the recognition process is incorrect.

The 100 images are therefore run through the Hamming-distance recognition process in two conditions, with and without the Hopfield network, and the obtained results are shown in Table 4. According to Table 4, there are more correct results and fewer incorrect results with the Hopfield network. When the noise applied to the image pixels is strong, a correct comparison based on the pixels cannot be relied on; as the number of noisy pixels increases, it becomes less likely that the Hopfield network recovers an image identical to the original noiseless image.

Hamming results as a function of image noise
In this part of the article, the results of the recognition process with and without the Hopfield network are compared as a function of the amount of noise in the images. The obtained results are shown in Table 5.
The upper number in each cell of the table is the result with the Hopfield network and the lower number is the result without it. As Table 5 shows, the recognition results with the Hopfield network are on average more often correct. An important point in this table concerns images without noise or with a single noisy pixel: for these, more correct results are obtained without the Hopfield network. For images with a medium noise rate, recognition with the Hopfield network is more often correct and less often incorrect. For images with 5 or 6 noisy pixels per matrix, there is no correct recognition either with or without the Hopfield network, but there is recognition of several similar images, and the incorrect recognitions are fewer with the Hopfield network than without it. Diagrams 3 to 9 show the percentages of correct results, recognition of two similar images, recognition of three similar images, and incorrect results as a function of the number of noisy pixels in the images.

5800
Conclusion
This article examined the effect of the Hopfield network on the improvement of the fingerprint recognition process. The Hopfield network can convert images with a medium amount of noise into noiseless images or images with minimal noise. If the Hopfield network is applied before the recognition process, the effect of errors is reduced. The Hopfield network acts as a content-addressable memory and is able to convert noisy images into noiseless ones or ones with minimal noise. When the Hopfield operations are performed on images with a medium noise rate and the resulting images are then used in the recognition process, the error rate is reduced. The Hopfield network is therefore able to improve the fingerprint recognition process.

References
[1] Alborzi, M. 2010. Introduction to neural networks. Second edition. Tehran, Sharif University of Technology Publication Institute.
[2] Daliri, A., Osarerh, A. 2007. Biometric security system. Bioelectronics Seminar, Biomedical Engineering Faculty, Amir Kabir University of Technology.
[3] Abd Allah, M. 2005. Artificial neural networks based fingerprint authentication with clusters algorithm. Informatica 29, pp. 303-307.
[4] Baldi, P., Chauvin, Y. 1993. Neural networks for fingerprint recognition. Neural Computation 5, pp. 402-418.
[5] Choudhary, P. 2009. Human recognition using multiple fingerprints. B.Tech. Project Report, Department of Computer Science and Engineering, Indian Institute of Technology, Kanpur, India.
[6] Hopfield, J.J. 1982. Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences of the USA, vol. 79, no. 8, pp. 2554-2558.
[7] Jain, A.K., Hong, L., Pankanti, S. and Bolle, R. 1997. An identity-authentication system using fingerprints. Proc. IEEE, vol. 85, no. 9, pp. 1365-1388.
[8] Weber, D.M. 1992. A cost effective fingerprint verification algorithm for commercial applications. IEEE.
[9] Yang, J., Govindaraju, V. 2005. A minutia-based partial fingerprint recognition system. Pattern Recognition 38, pp. 1672-1684.
[10] Yi, S., Jie-Ge, L., Sung, Y.
1995. Improvement on performance of modified Hopfield neural network for image restoration. IEEE Transactions on Image Processing, vol. 4, no. 5, pp. 688-693.

Table 1: Number of noisy images at each noise rate

Noise rate (noisy pixels per matrix)   6    5    4    3    2    1    0
Number of image matrices               10   16   10   18   14   16   16

5801
Table 2: Results of the studied matrices with the Hopfield network

Table 3: Analysis of Hopfield network results by number of noisy pixels in the image matrices (columns, left to right: 6, 5, 4, 3, 2 and 1 noisy pixels)

Matrices that reached the first answer group: 16%, 64%, 37%, 44%
Matrices that reached the first and third answer groups: 11%, 14%, 37%, 56%
Matrices that reached the fourth answer group: 12%, 30%, 44%, 21%
Matrices that reached the second and third answer groups: 90%, 81%, 60%, 22%
Matrices that reached the third and fourth answer groups: 6%, 10%, 6%
Matrices that reached the fourth answer group: 10%, 5%

5802
Table 4: Study of the effect of the Hopfield network by comparing the results obtained from the Hamming distance

Result of Hamming-distance recognition       Without Hopfield network   With Hopfield network
Total number of images                       100                        100
Correct recognition                          31                         57
Recognition of two similar images            18                         16
Recognition of three similar images          10                         11
Incorrect recognition                        41                         16

Table 5: Study of noise and the effect of the Hopfield network on the recognition process according to the Hamming distance. Columns: 0 to 6 noisy pixels per image matrix; rows: correct recognition, recognition of two similar images, recognition of three similar images, and incorrect recognition; in each cell, the upper number is the result with the Hopfield network and the lower number the result without it. [Individual cell values not recoverable from the scan.]