A New Algorithm for Training Multi-layered Morphological Networks

Ricardo Barrón, Humberto Sossa, and Benjamín Cruz
Centro de Investigación en Computación-IPN
Av. Juan de Dios Bátiz esquina con Miguel Othón de Mendizábal, Mexico City, 07738, Mexico
rbarron@cic.ipn.mx, hsossa@cic.ipn.mx, benjamincruz@sagitario.cic.ipn.mx

Abstract. In this work we present an algorithm for training an associative memory based on the so-called multi-layered morphological perceptron with maximal support neighborhoods. We compare the proposal against the original algorithm through experiments with real images and show the superiority of the new one. We also give formal conditions for correct classification, and we show that the proposal can be applied to gray-level images, not only to binary images.

Keywords: Associative memories, morphological neural networks, maximal support neighborhoods.

1 Introduction

Neural networks have proven to be an excellent alternative for problems where it is difficult to find an algorithmic solution. Inspired by the functioning of the human nervous system, many researchers have proposed different neural processing models. Probably the best-known model is the back-propagation neural network [1], [2], [3].

The study of the internal structure of neural cells has revealed that all cells share the same simple structure, independently of their size and shape. Information travels through the signals that neurons send to other neurons through their dendrites. It is believed that the cell body adds up the received signals; when enough input has accumulated, a discharge is produced. Initially this discharge occurs in the cell body; it then propagates along the axon to the synapses, which send a new signal to the other neurons.

An artificial neural network can be seen as a non-linear mapping between two pattern spaces: the input pattern set and the output pattern set. Normally the internal parameters of this mapping, known as synaptic weights, are determined by means of a training process. In the 1950s Rosenblatt ([2], [3]) introduced the well-known perceptron, the classical model on which most later developments are based. In the 1990s, however, Ritter et al. ([4], [5], [6]) and Sussner [12] presented a new kind of neural network model, the so-called morphological neural network.

L. Rueda, D. Mery, and J. Kittler (Eds.): CIARP 2007, LNCS 4756, pp. 546-555, 2007. Springer-Verlag Berlin Heidelberg 2007

Here the classical operations of multiplication and addition are replaced by addition and max (or min), respectively. One difference between this model and the classical ones is the computational cost of obtaining the value of the $i$-th neuron at time $t+1$, which is lower in this model.

In recent years it has been found that, apparently, the processing of information occurs not only at the cell body but also at the dendrites [7]. This observation could explain the great efficiency of our nervous system, since processing then happens practically all along the communication channel. This material, together with the morphological paradigm, is the departure point of this paper.

2 Associative Memory Based on the Morphological Perceptron

In [7] it is shown how a morphological perceptron can classify any compact set in the pattern domain, and how this can be used to build an associative memory able to recall patterns affected by mixed noise. The idea is to build a three-layer associative memory (one input layer, one hidden layer, and one output layer), as can be appreciated in Figure 1.

Fig. 1. Associative memory of three layers based on the morphological perceptron

The first layer works as a register of the elements of the input pattern. The second (hidden) layer is composed of morphological perceptrons [7], one for each class, while the output layer is formed by perceptrons based on the max operator with a linear gating function, one perceptron for each component of the output patterns. The perceptrons of the hidden layer are morphological perceptrons with a single input dendrite; the region each of them classifies is thus a hyper-rectangle in $R^n$, whose goal is to capture the corresponding pattern inside its on-region. For this model to work, each hidden perceptron outputs zero in its on-region and $-\infty$ in its complement.

The output of the $j$-th perceptron of the output layer is given as

$$z_j = \max_{k=1,\dots,K} \left( y_j^k + \varphi_k(x) \right) \qquad (1)$$

where $\varphi_k(x)$ is the output of the $k$-th perceptron of the hidden layer and $y_j^k$ is the $j$-th component of the $k$-th output pattern. Thus, when taking the max, only the hidden perceptron that qualifies the input pattern contributes, and the components $y_j^k$ of the corresponding output pattern are delivered as the output of the associative memory.
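As a minimal sketch of how the recall step of Eq. (1) can be realized (the paper gives no code, so the function names, the representation of the on-regions as per-class lows/highs arrays, and the use of $-\infty$ outside the on-region are our assumptions):

```python
import numpy as np

def hidden_layer(x, lows, highs):
    """Hidden morphological layer, one perceptron per key pattern:
    outputs 0 when x lies inside the k-th hyper-rectangle (its on-region)
    and -inf otherwise, so that Eq. (1) acts as a selector."""
    inside = np.all((x >= lows) & (x <= highs), axis=1)   # shape (K,)
    return np.where(inside, 0.0, -np.inf)

def output_layer(phi, Y):
    """Eq. (1): z_j = max_k (y_j^k + phi_k(x)).
    Y has shape (K, p); row k is the k-th output pattern."""
    return np.max(Y + phi[:, None], axis=0)               # shape (p,)
```

If exactly one hidden perceptron fires (its $\varphi_k = 0$) the max reduces to the corresponding output pattern $y^k$; if none fires, all components stay at $-\infty$, signaling an unqualified input.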

To train this associative memory we simply have to define the support of each key pattern; the on-regions of the corresponding hidden perceptrons represent these supports. Following [7], during training all supports of the memory are built as hyper-cubes of radius $\alpha$ (side $2\alpha$), where $\alpha$ is obtained as

$$\alpha = \frac{1}{2} \min_{i \neq j} d\!\left(x^i, x^j\right) \qquad (2)$$

where $d(x, y)$ is defined as

$$d(x, y) = \max_{l=1,\dots,n} \left| x_l - y_l \right|$$

and $\{x^k : k = 1,\dots,m\}$ is the set of key patterns.

To avoid collisions at the moment of classification it is necessary that the supports be pairwise disjoint. This is not demonstrated in [7]; a brief proof is given next.

Proposition 1. Let M be a multi-layered associative memory. If each key pattern $x^k$ has the support

$$S^k = \left\{ x : d\!\left(x, x^k\right) < \alpha \right\} \qquad (3)$$

with $\alpha$ as defined in equation (2), then $S^i \cap S^j = \emptyset$ for $i \neq j$.

Proof. Suppose there are $i \neq j$ such that $S^i \cap S^j \neq \emptyset$. This means there is an $x$ with $d(x, x^i) < \alpha$ and $d(x, x^j) < \alpha$. By the triangle inequality,

$$d\!\left(x^i, x^j\right) \le d\!\left(x^i, x\right) + d\!\left(x, x^j\right) < \alpha + \alpha = 2\alpha,$$

that is, $\alpha > \frac{1}{2} d(x^i, x^j)$, which contradicts equation (2). Thus the proposition holds.

One drawback of this way of constructing the supports is that if two key patterns happen to be too close, the radius of every neighborhood shrinks drastically. In [7] another method is proposed that enlarges the support neighborhoods by means of kernels; according to [7], this increases the range of permissible noise. However, it raises the computational cost and, moreover, imposes restrictions on the patterns that are very difficult to satisfy.
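Equations (2) and (3) translate directly into code. The following sketch (our names; the key patterns are assumed stored as the rows of a NumPy array) computes $\alpha$ and the resulting hyper-cube supports:

```python
from itertools import combinations

import numpy as np

def chebyshev(x, y):
    """d(x, y) = max_l |x_l - y_l|, the distance of Eq. (2)."""
    return np.max(np.abs(x - y))

def alpha_radius(keys):
    """Eq. (2): half the smallest pairwise Chebyshev distance
    among the key patterns (rows of `keys`)."""
    return 0.5 * min(chebyshev(a, b) for a, b in combinations(keys, 2))

def supports(keys):
    """Eq. (3): the support of x^k is the hyper-cube of Chebyshev
    radius alpha centered at x^k, returned as (lows, highs) arrays."""
    a = alpha_radius(keys)
    return keys - a, keys + a
```

The (lows, highs) pair plugs straight into the hidden_layer sketch above, and Proposition 1 guarantees that the interiors of these cubes are pairwise disjoint.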

3 Proposed Training Algorithm

In the context of this work, let us suppose that a pattern is represented by $n$ object features; then along each coordinate axis we can compute the variation per feature by ordering all the components. By computing the average variability per pattern it is possible to define a threshold. For practical purposes, this threshold decides whether two patterns can be considered the same from the point of view of one of their components. In this way we avoid having very tiny supports along the coordinate axes, and the drawback of the algorithm described in the last section is overcome.

Fig. 2. Flowchart to get the average variation threshold

The algorithm for finding the average variation threshold per axis is shown in Fig. 2. Here $X$ is an $n \times m$ matrix: its $j$-th column is the $j$-th pattern, while its $i$-th row holds the $m$ values of the $i$-th feature over all patterns. $U_i$ is the average variation threshold for the $i$-th axis.

The key point in training the multi-layered morphological perceptron is the construction of the supports of the patterns. In general these supports must be pairwise disjoint, so that no two patterns are assigned to the same output. The proposed algorithm fulfils this thanks to the following result [13].

Proposition 2. Let $\Omega^i$ and $\Omega^j$ be two arbitrary supports corresponding to different patterns in $R^n$, built according to the average variation threshold. Then either $\Omega^i = \Omega^j$ or $\Omega^i \cap \Omega^j = \emptyset$.

Proof. Let $x^i$ and $x^j$ be the patterns corresponding to the supports $\Omega^i$ and $\Omega^j$, respectively. There are two cases:

a. $\left| x^i_l - x^j_l \right| < U_l$ for every $l = 1,\dots,n$, where $U_l$ is the average variation threshold of the $l$-th axis. In this case $\Omega^i = \Omega^j$.

b. There is an $l \in \{1,\dots,n\}$ such that $\left| x^i_l - x^j_l \right| > U_l$. Then along the $l$-th coordinate the supports do not coincide and are disjoint on that axis; hence they are disjoint in $R^n$.
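The flowchart of Fig. 2 does not survive in this transcription, so the exact aggregation it prescribes is uncertain; the sketch below implements one natural reading of the text: per axis, sort the $m$ feature values and average the gaps between consecutive sorted values. Note that under this reading the example of Section 4 gives $U = (0.6, 0.6)^T$ rather than the paper's Eq. (5), so the original flowchart evidently averages differently; the code is meant only to fix ideas.

```python
import numpy as np

def average_variation_threshold(X):
    """One plausible reading of Fig. 2 (an assumption, not the paper's
    exact recipe): for each axis i, sort the m values X[i, :] and average
    the gaps between consecutive sorted values. X is n x m, with column j
    holding pattern j. Returns U, a length-n vector of per-axis thresholds."""
    Xs = np.sort(X, axis=1)      # sort each feature row independently
    gaps = np.diff(Xs, axis=1)   # the m-1 consecutive gaps per axis
    return gaps.mean(axis=1)     # average gap = threshold U_i
```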

Evidently, when the patterns are too close, the neighborhoods present occlusions; in these cases $\Omega^i = \Omega^j$ for $i \neq j$. One variant of this algorithm is therefore to take $\Omega^i \neq \Omega^j$ but center each neighborhood at its respective key pattern. The advantage of this enhancement is that the neighborhoods are enlarged, allowing more noise to be added to the patterns.

4 Numerical Examples

Example 1. Consider the following set of key patterns in $R^2$:

$$x^1 = \begin{pmatrix} 3 \\ 4 \end{pmatrix},\quad x^2 = \begin{pmatrix} 1 \\ 4 \end{pmatrix},\quad x^3 = \begin{pmatrix} 2 \\ 2 \end{pmatrix},\quad x^4 = \begin{pmatrix} 2 \\ 3 \end{pmatrix},\quad x^5 = \begin{pmatrix} 4 \\ 5 \end{pmatrix},\quad x^6 = \begin{pmatrix} 3 \\ 2 \end{pmatrix} \qquad (4)$$

a) Solution obtained with the algorithm proposed in [7]: applying equation (2) gives $\alpha = 0.5$. Figure 3 shows the neighborhoods obtained with this value of $\alpha$.

Fig. 3. Neighborhoods obtained when the algorithm proposed in [7] is used

b) Solution obtained with the algorithm proposed in this paper (first variant): applying the flowchart of Fig. 2 to get the maximal support neighborhoods, we obtain the threshold value

$$U = \begin{pmatrix} 1.166 \\ 1.5 \end{pmatrix} \qquad (5)$$

The neighborhoods obtained with this threshold are shown in Figure 4, and the differences can be immediately appreciated: in this second case the admissible noise range for each pattern is bigger.
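Continuing the sketches above, the value $\alpha = 0.5$ of case a) is easy to verify: the smallest pairwise Chebyshev distance among the six key patterns is 1 (attained, for instance, by $x^1$ and $x^4$), and half of it is 0.5.

```python
import numpy as np

# Key patterns of Eq. (4), one per row.
keys = np.array([[3, 4], [1, 4], [2, 2], [2, 3], [4, 5], [3, 2]], dtype=float)

print(alpha_radius(keys))   # -> 0.5, matching Section 4a
```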

Fig. 4. Neighborhoods obtained when using the first variant of the proposed algorithm

c) Solution obtained with the algorithm proposed in this paper (second variant): we apply the same procedure as in the first variant, but in this case the neighborhoods are centered at the corresponding key patterns. The threshold is the same one given by equation (5).

Fig. 5. Neighborhoods obtained when using the second variant of the proposed algorithm
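Putting the pieces together, here is a hedged end-to-end sketch of the second variant in the auto-associative case (each neighborhood centered at its key pattern with per-axis half-width $U$; the variable names and the noisy test point are ours):

```python
U = average_variation_threshold(keys.T)   # per-axis thresholds, shape (n,)
lows, highs = keys - U, keys + U          # second variant: cubes centered at the keys

x_noisy = np.array([3.3, 4.2])            # x^1 corrupted by a little mixed noise
phi = hidden_layer(x_noisy, lows, highs)  # 0 for the qualifying class, -inf elsewhere
print(output_layer(phi, keys))            # auto-associative recall -> [3. 4.]
```

When neighborhoods occlude, as the paper notes for this variant, more than one $\varphi_k$ can be zero and the max in Eq. (1) then mixes components of different output patterns; this is the price paid for the larger noise tolerance.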

The neighborhoods obtained are shown in Fig. 5. Because the neighborhoods are now centered at their respective key patterns, the tolerance to noise is bigger. Note also the occlusions between classes, which do not occur in the first variant.

In the following section we show that the proposal described in this paper can be used to recall not only binary patterns but also gray-level patterns such as images.

5 Experiments with Real Patterns

For this experiment we used the images shown in Fig. 6. These are gray-level images of 262 × 326 pixels. They were perturbed with noise from 5% to 15%, in steps of 5%.

Fig. 6. Original images used in the experiments

The input key patterns were formed by describing each image by means of the well-known Hu invariants ([10], [11]). Figures 7 to 9 graphically show the results obtained with the algorithm proposed in [7]; the results of the algorithm proposed in this paper are shown in Figs. 10 to 12. From these figures we can observe that, even when adding small quantities of noise to the patterns, the original algorithm of [7] fails to recall practically all of the patterns, while our proposal, although with modest percentages, correctly recalls several of the desired patterns.

Fig. 7. Images altered with 5% of noise, and the images recalled using [7]
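The paper does not show its feature-extraction code. As a plausible sketch (our choice of library and of the usual log scaling, not necessarily the authors'), OpenCV yields the seven Hu moment invariants of a gray-level image directly:

```python
import cv2
import numpy as np

def hu_key_pattern(path):
    """Build a 7-dimensional key pattern from a gray-level image using
    the Hu moment invariants ([10], [11]). The log scaling is a common
    practice to tame the invariants' wildly different magnitudes; the
    paper does not say whether it was applied."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    hu = cv2.HuMoments(cv2.moments(img)).flatten()       # shape (7,)
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)   # log-scaled features
```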

Fig. 8. Images altered with 10% of noise, and the images recalled using [7]

Fig. 9. Images altered with 15% of noise, and the images recalled using [7]

Fig. 10. Images altered with 5% of noise, and the images recalled using the proposal

Fig. 11. Images altered with 10% of noise, and the images recalled using the proposal

Fig. 12. Images altered with 15% of noise, and the images recalled using the proposal

6 Conclusions

In this paper we have presented an algorithm for training the multi-layered morphological perceptron that builds support neighborhoods around the key patterns of the training set which are more efficient, from the point of view of pattern recall, in both the auto-associative and the hetero-associative modes of operation of the memory. Through several experiments with real patterns we have shown that the proposal can be used to recall gray-level images, not only binary ones, and that it outperforms the original algorithm.

Acknowledgements. This work was economically supported by CIC-IPN, COFAA-IPN, CONACYT under grant 46805, and SIP under grants 20071438 and 20071084.

References

[1] Bishop, C.: Neural Networks for Pattern Recognition. Oxford University Press, Oxford, England (1995)
[2] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Review 65, 386-408 (1958)
[3] Rosenblatt, F.: Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Spartan Books, Washington D.C. (1962)
[4] Ritter, G.X., et al.: An introduction to morphological neural networks. In: Proceedings of the 13th International Conference on Pattern Recognition, pp. 709-717 (1996)
[5] Ritter, G.X., et al.: Morphological associative memories. IEEE Transactions on Neural Networks 9, 281-293 (1998)
[6] Ritter, G.X.: Morphological perceptrons. In: ISAS'97, Intelligent Systems and Semiotics, Gaithersburg, Maryland (1997)
[7] Ritter, G.X.: A new auto-associative memory based on lattice algebra. In: Proc. 9th CIARP 2004, Havana, Cuba, pp. 148-155 (2004)
[8] Kishan, M., et al.: Elements of Artificial Neural Networks. The MIT Press, Cambridge, Massachusetts (1997)
[9] Pessoa, L.F.C., et al.: Morphological/rank neural networks and their adaptive optimal image processing. In: IEEE International Conference on Acoustics, Speech, and Signal Processing 6, 3399-3402 (1996)
[10] Hu, M.K.: Pattern recognition by moment invariants. Proceedings of the IRE 49, 1428 (1961)
[11] Hu, M.K.: Visual pattern recognition by moment invariants. IRE Transactions on Information Theory 8, 179-187 (1962)
[12] Sussner, P.: Morphological perceptron learning. In: Proceedings of the 1998 International Symposium on Intelligent Systems and Semiotics, pp. 477-482, Gaithersburg, Maryland (1998)
[13] Barrón, R., et al.: New improved algorithm for the training of a morphological associative memory. Research in Computing Science, Special Issue: Neural Networks and Associative Memories 21, 49-59 (2006)