Classification of Cat Ganglion Retinal Cells and Implications for Shape-Function Relationship

Regina Célia Coelho 1,3, Cesare Valenti 2, Júlia Sawaki Tanaka 1,4 and Luciano da Fontoura Costa 1

1 Cybernetic Vision Research Group, IFSC, University of São Paulo, Caixa Postal 369, São Carlos, SP, Brazil, {reginac, julia, luciano}@if.sc.usp.br
2 Dipartimento di Matematica ed Applicazioni, University of Palermo, Via Archirafi 34, 90123, Palermo, Italy, cvalenti@math.unipa.it
3 State University of Maringá, Campus Universitário, Av. Colombo, 5790, Maringá, PR, Brazil, reginac@din.uem.br
4 IQ UNESP, São Paulo State University, Caixa Postal 355, Araraquara, SP, Brazil, julia@iq.unesp.br

Abstract

This article presents a quantitative approach to ganglion cell classification by considering combinations of several geometrical features, including fractal dimension, symmetry, diameter, eccentricity and convex hull. Special attention is given to moment- and symmetry-based features. Several combinations of such features are fed to two clustering methods (Ward's hierarchical scheme and K-Means) and the resulting classifications are compared. The results indicate the superiority of some features, also suggesting possible biological implications.

1. Introduction

Although the morphological aspects of neurons and neural structures are potentially important, they have received relatively little attention from researchers in neuroscience. At the same time, an increasing number of works has shown that neural shape is directly related to the respective function [1-4], in the sense that the behavior exhibited by neural cells can be understood in terms of the biochemical processes inside and outside the cells and their respective morphology. The morphology exhibited by neural cells is essential in the sense that it constrains the biochemical processes in the cells, defining the potential of interaction between each specific neuron and the rest of the neural system.
Neural shape can be characterized quantitatively by obtaining morphometric measurements that give a reasonable representation of the physical constraints imparted by the cells [5]. Thus, the determination of a reasonable set of features expressing the behavior of different neuron classes can help to segregate cells into meaningful classes, and a cell classification scheme based on quantitative features can be used for this purpose [5,6]. This set of features should not be related only to the processes of the neurons, but also to the fields of influence affecting these processes, because these fields are essential in order to better understand the spatial coverage and environment interaction exhibited by the cells. The cells primarily used in morphological investigations are retinal ganglion cells, because their nearly planar dendritic arborization can be conveniently reduced to two dimensions. Most such studies have been carried out in cats because the central area of the retina of these animals is similar to the human fovea. Important evidence of the relationship between shape and function has been verified for the specific case of such ganglion cells. Wässle [4] verified that their receptive fields are related to the convex hull area of the dendritic arborization. Related findings were verified by several researchers [7-11], who related three types of ganglion cells to the function they play (i.e. Y, X and W) and verified that these functional types correspond to the morphological α, β and γ cell types, respectively. The current article concentrates on α and β cells.

The present work considers a group of cat retinal ganglion cells and investigates several morphological measures that can be particularly helpful to characterize each group of cells. The objective is to find the combination of the presented features that classifies these neurons into each group with the smallest possible error. The considered features are related to the shape and influence fields of neural cells, including fractal dimension, symmetry, diameter, eccentricity and convex hull. An analysis of the correlations between the considered features is also included.

2. Neural shape characterization

The features considered in the present approach are described below and listed in Table 1. These features are related to measures normally used to classify these cells biologically.

Table 1. Morphological features used to characterize cells.

Abbrev.   Morphological feature
AM        Axial Moment
MV        Mean Value
SD        Standard Deviation
EC        Eccentricity
BC        Fractal Dimension by Box-Counting Method
Mi        Fractal Dimension by Minkowski Sausage Method
CH        Convex Hull Area
Di        Diameter
IH 1-11   Influence Histograms
IA 1-20   Influence Areas

The diameter (Di) of each cell was obtained from the respective convex hull (CH). It was used to investigate the influence of the spatial extent of the cells in the classifications. Figure 1 illustrates the convex hull of a neural cell and the respective diameter d (the hull area has also been considered).

Influence Area

The influence area (IA) of a cell is relative to the type of interaction being investigated [5,12].
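The influence area of a binary cell image can be sketched as the area of its Minkowski sausage, i.e. the set of pixels within a distance Dist of the cell. The following minimal illustration is not the authors' code (the function name and the toy one-pixel "cell" are ours); it uses scipy's Euclidean distance transform under the assumption that a cell is given as a 2-D boolean image.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def influence_area(cell, dist):
    """Area of the Minkowski sausage of a binary cell image.

    cell: 2-D boolean array (True on the neuron); dist: maximum distance Dist.
    """
    # Distance from every pixel to the nearest cell pixel (0 on the cell itself).
    d = distance_transform_edt(~cell)
    # The sausage is every pixel whose distance to the cell is <= dist.
    return int(np.count_nonzero(d <= dist))

# Toy example: a single-pixel "cell"; its sausage is a discrete disc.
cell = np.zeros((21, 21), dtype=bool)
cell[10, 10] = True
areas = [influence_area(cell, r) for r in (3, 6, 9)]
# The areas grow with Dist; on a real neuron image they tend to saturate
# once the dilated branches merge, as the text describes.
```

For a real cell image the same call is applied for Dist = 1 to 20 to obtain the IA features.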
In the present case, we concentrate on the spatial coverage of dendritic arborizations, considering the area defined up to a maximum distance, Dist, from the cell, in the sense of the smallest distance to any of its points [12]. This area can be represented by a Minkowski sausage, the influence area being the area of this sausage. Observe that the sausages give an indication of the influence area of the neuron with respect to one specific distance Dist. It can be verified, by considering successive area expansions, that the spatial coverage tends to saturate as the distance increases, as illustrated in Figure 2 for Dist = 3, 6 and 9, respectively.

Figure 1. The convex hull area obtained for a neural cell, and the respective diameter (d), which corresponds to the maximum distance between any pair of points belonging to the hull.

Figure 2. Minkowski sausages of the artificial neuron presented in (a) with respect to Dist equal to 3 (b), 6 (c) and 9 (d). Values chosen for visualization purposes.

Influence Histograms

Influence histograms (IH), originally introduced in Costa et al. [12], are capable of expressing, with respect to the total area affected by a specific trophic influence emanating from a neuron, the extension of the regions under similar influence from the trophic field. The first important point to keep in mind is that this measure is relative to the specific profile chosen to represent the influence field under analysis. While it is possible to use profiles (or

point-spread functions) of influence varying with the position along the cell, we shall be restricted, for simplicity's sake, to homogeneous profiles. The choice of a specific profile should take into account the nature of the influence factor under analysis. For instance, investigations about the spatial distribution of electric fields would imply exponentially decaying profiles. The present work will assume Gaussian distributions, which are particularly relevant when analysing trophic effects such as those induced by the diffusion of chemical factors around the cells. Let f(x,y,z) be a function defining the distribution of an influence factor (e.g. biochemical or ionic densities in and outside cells, or electric fields) induced by a point source, and let f(x,y,z) be a multivariate and symmetric Gaussian distribution. Assuming homogeneity, the total influence of the field around a neural cell can be easily calculated by convolving f(x,y,z) with the binary representation of the shape of the neural cell, i.e. the function I(x,y,z). It should be observed that, when the support of f(x,y,z) is broad, it becomes interesting to compute such convolutions in terms of Hadamard products in the Fourier domain. The resulting image, represented by T(x,y,z) = f(x,y,z) * I(x,y,z), provides a faithful estimation of the spatial coverage of the trophic field defined around the cell. The process of deriving the representation T(x,y,z) is illustrated in Figure 3.

Fractal Dimension

The fractal dimension (FD) is a measure of complexity, which has been used in the classification of these cells since Boycott and Wässle [7] showed that the ganglion cells in the cat retina present different complexities. The methods considered in this paper are Box-Counting (BC) and the Minkowski Sausage (Mi) [12,13], with some alterations in order to enhance their accuracy.
These alterations took into account that natural objects present limited fractality and that the images are of a discrete nature (spatially quantized) [13]. Therefore, only the portion of the log-log curve where the fractality is largest has been considered for interpolation and FD estimation. The fact that these curves present only partial fractality is generally not considered in the literature; that is to say, the FD is usually calculated by linear regression over the whole log-log curve. In the present case, the FD by Box-Counting considered the mean number of boxes over several inclinations of the grid: the grid was rotated by different angles and the mean box count over all inclinations was taken [13].

Axial Moments

Symmetry operators have been included in vision systems to perform different visual tasks. They have been applied to represent and describe object parts [14], to perform image segmentation [15] and to locate points of interest in a scene [16]. The definition of our symmetry operator starts from the computation of a set of first-order axial moments (AM) around the center of gravity of each cell [17]. It is easy to prove that all AMs have the same value in the case of an isotropic density distribution, hence they can be used as a measure of circular symmetry. For the purposes of this work we have set the number of axes to 32, according to previous experimental results. It is interesting to note that all AMs can be quickly computed in parallel as a sum of convolution filters. This approach can also be generalized to 3-D object recognition [18]. An example of the AM feature is shown in Figure 4.

Figure 3. The binary representations of two neural cells exhibiting distinct complexity (a) and (b). The distributions of the trophic field around these cells, obtained by convolving a two-dimensional Gaussian with (a) and (b), respectively, are illustrated in (c) and (d). The respective influence histograms are shown in (e).
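The basic box-counting estimate underlying the BC feature can be sketched as follows. This is a minimal version without the grid-rotation and partial-range refinements described above; the function names are ours, not the authors'.

```python
import numpy as np

def box_count(img, size):
    """Number of size x size boxes containing at least one foreground pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, size):
        for j in range(0, w, size):
            if img[i:i + size, j:j + size].any():
                count += 1
    return count

def box_counting_dimension(img, sizes=(1, 2, 4, 8, 16)):
    """FD estimate: minus the slope of log N(s) against log s."""
    counts = [box_count(img, s) for s in sizes]
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Sanity check: a filled square has dimension 2.
square = np.ones((64, 64), dtype=bool)
fd = box_counting_dimension(square)
```

The refinement in the text amounts to restricting the `np.polyfit` call to the sub-range of scales where the log-log curve is most nearly linear, and averaging `box_count` over several grid rotations.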

3. Neural Cell Classification

Figure 4. The axial moment values of an α (a) and a β (b) cell (these neural cell figures, originally published by Dann et al. [19], are reproduced here with permission).

Three shape parameters are derived from the AMs: their mean value (MV), their standard deviation (SD), and their eccentricity (EC = AM_min / AM_max), where AM_min and AM_max are the minimum and maximum values of the AMs. The MV and SD features are, to a first approximation, normally distributed. We assumed this to be true also for the EC feature, as a ratio of two normally distributed quantities. Table 2 reports the μ and σ parameters, while Figure 5 shows the probability distributions.

Table 2. Mean (μ) and standard deviation (σ) of MV, SD, and EC.

Feature   μ_α   μ_β   σ_α   σ_β
MV
SD
EC

Figure 5. Probability distributions for MV, SD, and EC.

Since we are interested in investigating how the morphological cell classes are defined, unsupervised classification has been adopted. Two classification algorithms have been used: hierarchical grouping (Ward's method) and K-Means clustering. The first method tends to produce clusters that are easily distinguished from one another and that tend to be tightly packed. It reduces the number of clusters one at a time, starting from one cluster per object and ending with one cluster comprising all the objects; at each reduction, the method merges two clusters. K-Means clustering involves an iterative scheme that operates over a fixed number of clusters, such that each class has a center which is the mean position of all the samples in that class, and each sample is in the class whose center is closest to it [20].

4. Methodology

In order to validate the potential of the adopted measures for the characterization and classification of neural cells, 50 images of cat retinal ganglion cells [7-9,19,21-24] of types α and β, all at a distance from the fovea smaller than 3 degrees, have been considered.
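The axial-moment features described above (AM, MV, SD and EC) can be sketched as follows. The exact operator follows Di Gesù and Valenti [17]; here we assume one plausible reading, in which the first-order moment about an axis is the sum of absolute pixel distances to that axis, so the code below is an illustration rather than the authors' implementation.

```python
import numpy as np

def axial_moments(cell, n_axes=32):
    """First-order axial moments about n_axes axes through the centroid.

    Assumption (ours): the moment about an axis at angle theta is the sum of
    absolute perpendicular distances of all cell pixels to that axis.
    """
    ys, xs = np.nonzero(cell)
    xs = xs - xs.mean()          # center on the center of gravity
    ys = ys - ys.mean()
    thetas = np.arange(n_axes) * np.pi / n_axes
    return np.array([np.abs(-xs * np.sin(t) + ys * np.cos(t)).sum()
                     for t in thetas])

def am_features(cell):
    """Mean value (MV), standard deviation (SD) and eccentricity (EC)."""
    am = axial_moments(cell)
    mv, sd = am.mean(), am.std()
    ec = am.min() / am.max()     # EC = AM_min / AM_max, as in the text
    return mv, sd, ec

# For an isotropic shape (a disc) all moments coincide, so EC is close to 1.
yy, xx = np.mgrid[-20:21, -20:21]
disc = xx**2 + yy**2 <= 15**2
mv, sd, ec = am_features(disc)
```

An elongated β-like arborization would give EC well below 1, which is what makes EC usable as a circular-symmetry measure.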
A normalized version of the cells, in which each cell was scaled to the same diameter, was also produced; it was used only to generate the axial moment, mean value, standard deviation and eccentricity features, so that these features become invariant to scale. Influence histograms, assuming a Gaussian profile with a standard deviation of 4 and containing 11 uniformly distributed bins, were generated for each image. Influence areas with Dist varying from 1 to 20 were also considered for each neuron. In order to investigate the influence of the size of the cells, the diameter of each cell was obtained from its convex hull. To measure the complexity of each image, the fractal dimension was calculated using both the Box-Counting and Minkowski Sausage methods. Note that, in order to improve the clustering results, every feature was normalized through a statistical transformation so as to present null average and unitary standard deviation. The calculation of central and axial moments introduced new parameters with information complementary to that obtained from the other shape indicators. Moreover, these measures satisfy the properties of scaling, rotation and translation invariance, which allowed us to improve the discrimination between the two types of cells. The classifications were performed using the Statistica software and involved combinations of the features using the K-Means and Ward's methods.
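The normalization and clustering steps can be sketched as follows. The paper used the Statistica package, so this scipy-based version is only illustrative, and the feature matrix below is random placeholder data standing in for the 50-cell feature table (23 α, 27 β), not the paper's measurements.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)
# Placeholder data: 50 cells x 4 features, two well-separated synthetic groups
# standing in for the alpha and beta cells.
X = np.vstack([rng.normal(0, 1, (23, 4)), rng.normal(5, 1, (27, 4))])

# Statistical normalization: null average and unitary standard deviation.
Xn = (X - X.mean(axis=0)) / X.std(axis=0)

# Ward's hierarchical scheme: repeatedly merge the pair of clusters whose
# fusion least increases the within-cluster variance, then cut at 2 clusters.
labels_ward = fcluster(linkage(Xn, method="ward"), t=2, criterion="maxclust")

# K-Means with k = 2 (labels are arbitrary 0/1 cluster indices).
_, labels_km = kmeans2(Xn, 2, minit="++", seed=0)
```

Comparing the two label vectors against the α/β assignments from the literature is what produces the per-class error percentages reported in Table 4.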

5. Results and Discussion

The linear interrelationships between the neural features can be analyzed through the correlation matrix shown in Table 3. This table shows only three of the 11 bins of the influence histogram and five of the 20 bins of the influence area. It should be recalled that a value of 1 indicates a perfect correlation between two features, which only happens, in this case, for a feature with itself. A negative value indicates an anti-correlation. Marked correlations (absolute values larger than 0.5) are considered strong.

Classifications have been performed using all possible combinations of 2, 3, 4, 5 and 9 features, as well as using one and all features. Both classification methods have been applied to all such combinations. Although IA presented a strong correlation with many other features, it turned out to be the worst feature: alone or combined with other features, it always tends to undermine the classification. It is therefore excluded from the following comments.

The features were grouped into three major groups: 1) symmetry features (AM, MV, SD, and EC); 2) complexity features (BC and Mi); and 3) size features (CH, Di, and IH), which consider the size of the neuron. Except for the group (Di, IH), which presented satisfactory results (10% error with both classification methods), any combination using only features of the same group led to poor results. The group (AM, Di) proved suitable for the classification (6% error with both methods); it was the only group of two or three features presenting such a small error. Another interesting result was obtained by using all the features except IA: the error for both methods was 6%. Most of the best classifications were obtained using groupings of 4 or 5 features. Good results were obtained when we took at least two symmetry features, at least one complexity feature, and IH. IH combined with other features also proved to be a good feature.
Most of the worst results were obtained for groups of 1, 2 or 3 features. In these cases, the groups contained only symmetry features, symmetry and complexity features, or a single symmetry feature, and the error was larger than 20% for both methods. The number of cells assigned to each class in the best classification results is shown in Table 4. Although the numbers seemingly indicate success for all cells in specific cases, in all these cases there were two wrongly classified cells, one in each class. It should also be observed that the relatively high misclassification rates obtained are very likely explained by the fact that the class assignments used as the comparison standard were produced subjectively by diverse authors, using different criteria.

6. Conclusions

A quantitative approach to neural cell classification, concentrating on cat ganglion cells, has been reported, considering several geometrical features whose combinations have been investigated with respect to two clustering algorithms. The obtained results indicate the superiority of some specific features, as well as a few incompatibilities with the classifications originally assigned by human operators. The axial moment and the diameter, in particular, proved particularly effective. Such results indicate that the properties quantified by these geometrical features are likely to have special relevance to the behavior of the respective classes of cells. Moreover, the features and methodologies introduced here are general and can easily be extended to different kinds of object recognition problems.

Acknowledgments

Luciano da F. Costa is indebted to FAPESP and CNPq for financial support.

References

[1] Purves D., Body and Brain: A Trophic Theory of Neural Connections, Harvard University Press.
[2] Linden R., Dendritic Competition in the Developing Retina: Ganglion Cell Density Gradients and Laterally Displaced Dendrites, Visual Neuroscience, Vol. 10.
[3] Kossel A., S. Löwel, and J. Bolz, Relationships between Dendritic Fields and Functional Architecture in Striate Cortex of Normal and Visually Deprived Cats, The Journal of Neuroscience, Vol. 15.
[4] Wässle H., Sampling of Visual Space by Retinal Ganglion Cells, in: Pettigrew J.D., Sanderson K.J., and Levick W.R. (eds.), Visual Neuroscience, Cambridge University Press.
[5] Costa L.F. and T.J. Velte, Automatic Characterization and Classification of Ganglion Cells from the Salamander Retina, The Journal of Comparative Neurology, Vol. 404.
[6] Costa L.F. and R.M. Cesar Jr., Shape Analysis and Classification: Theory and Practice, CRC Press.
[7] Boycott B.B. and H. Wässle, The Morphological Types of Ganglion Cells of the Domestic Cat's Retina, Journal of Physiology, Vol. 240.
[8] Fukuda Y., C.F. Hsiao, M. Watanabe and H. Ito, Morphological Correlates of Physiologically Identified Y-, X- and W-Cells in Cat Retina, Journal of Neurophysiology, Vol. 52, No. 6.
[9] Saito H.A., Morphology of Physiologically Identified X-, Y-, and W-Type Retinal Ganglion Cells of the Cat, The Journal of Comparative Neurology, Vol. 221.

[10] Stone J. and Y. Fukuda, Properties of Cat Retinal Ganglion Cells: A Comparison of W-Cells with X- and Y-Cells, Journal of Neurophysiology, Vol. 37.
[11] Stone J. and R. Clarke, Correlation between Soma Size and Dendritic Morphology in Cat Retinal Cells: Evidence of Further Variation in the γ-Cell Class, The Journal of Comparative Neurology, Vol. 192.
[12] Costa L.F., R.M. Cesar Jr., R.C. Coelho, and J.S. Tanaka, Analysis and Synthesis of Morphologically Realistic Neural Networks, in: Poznanski R. (ed.), Modelling in the Neurosciences: From Ionic Channels to Neural Networks, Harwood Academic Publishers.
[13] Coelho R.C. and L.F. Costa, On the Application of the Bouligand-Minkowski Fractal Dimension for Shape Characterization, Applied Signal Processing, Vol. 3.
[14] Kelly M.F. and M.D. Levine, From Symmetry to Representation, Technical Report TR-CIM-94-12, Center for Intelligent Machines, McGill University, Montreal, Canada.
[15] Gauch J.M. and S.M. Pizer, The Intensity Axis of Symmetry and its Application to Image Segmentation, IEEE Trans. PAMI, Vol. 15, No. 8.
[16] Reisfeld D., H. Wolfson, and Y. Yeshurun, Context Free Attentional Operators: The Generalized Symmetry Transform, Int. Journal of Computer Vision, Vol. 14.
[17] Di Gesù V. and C. Valenti, Symmetry Operators in Computer Vision, Vistas in Astronomy, Vol. 40, No. 4.
[18] Chella A., V. Di Gesù, I. Infantino, D. Intravaia and C. Valenti, Cooperating Strategy for Objects Recognition, in: Shape, Contour and Grouping in Computer Vision, Lecture Notes in Computer Science, Springer Verlag.
[19] Dann J.F., E.H. Buhl, and L. Peichl, Postnatal Dendritic Maturation of Alpha and Beta Ganglion Cells in Cat Retina, The Journal of Neuroscience, Vol. 8, No. 5.
[20] Anderberg M.R., Cluster Analysis for Applications, Academic Press, New York.
[21] Kolb H., R. Nelson, and A. Mariani, Amacrine Cells, Bipolar Cells and Ganglion Cells of the Cat Retina: A Golgi Study, Vision Research, Vol. 21.
[22] Leventhal A.G. and J.D. Schall, Structural Basis of Orientation Sensitivity of Cat Retinal Ganglion Cells, The Journal of Comparative Neurology, Vol. 220.
[23] Wässle H., L. Peichl and B.B. Boycott, Morphology and Topography of On- and Off-Alpha Cells in the Cat Retina, Proceedings of the Royal Society of London B, Vol. 212.
[24] Watanabe M., H. Sawai, and Y. Fukuda, Number, Distribution, and Morphology of Retinal Ganglion Cells with Axons Regenerated into Peripheral Nerve Graft in Adult Cats, Journal of Neuroscience, Vol. 13, No. 5.

Table 3. Correlation matrix among the features considered. N = 50 (casewise deletion of missing data). Features: AM, MV, SD, EC, BC, Mi, CH, Di, IH 1, IH 6, IH 11, IA 1, IA 5, IA 10, IA 15 and IA 20.

Table 4. Cells assigned to each class with respect to the best classification results. N = 50 (23 α cells, 27 β cells). In all these cases, the error percentage was 4%.

Feature sets                                               K-Means (α/β)    Ward (α/β)
(AM,MV,BC,Mi,IH)                                           83% / 100%       100% / 100%
(SD,EC,BC,Mi,IH)                                           100% / 93%       100% / 100%
(AM,SD,EC,BC,IH), (MV,EC,BC,Di,IH)                         100% / 100%      91% / 100%
(AM,BC,CH,Di), (AM,MV,BC,CH,Di), (AM,MV,EC,CH,Di)          100% / 93%       78% / 100%
(AM,MV,EC,CH,Di)                                           100% / 93%       100% / 82%
(AM,Di), (MV,SD,BC,Mi,IH)                                  100% / 89%       100% / 89%
(AM,MV,CH,Di)                                              87% / 100%       100% / 89%
(AM,MV,Mi,IH), (MV,SD,EC,Mi,IH)                            96% / 100%       87% / 100%
(AM,MV,SD,EC,BC,Mi,CH,Di,IH)                               87% / 100%       87% / 100%


Introduction Principles of Signaling and Organization p. 3 Signaling in Simple Neuronal Circuits p. 4 Organization of the Retina p. Introduction Principles of Signaling and Organization p. 3 Signaling in Simple Neuronal Circuits p. 4 Organization of the Retina p. 5 Signaling in Nerve Cells p. 9 Cellular and Molecular Biology of Neurons

More information

SGD and Deep Learning

SGD and Deep Learning SGD and Deep Learning Subgradients Lets make the gradient cheating more formal. Recall that the gradient is the slope of the tangent. f(w 1 )+rf(w 1 ) (w w 1 ) Non differentiable case? w 1 Subgradients

More information

A Simple Implementation of the Stochastic Discrimination for Pattern Recognition

A Simple Implementation of the Stochastic Discrimination for Pattern Recognition A Simple Implementation of the Stochastic Discrimination for Pattern Recognition Dechang Chen 1 and Xiuzhen Cheng 2 1 University of Wisconsin Green Bay, Green Bay, WI 54311, USA chend@uwgb.edu 2 University

More information

Convolutional neural networks

Convolutional neural networks 11-1: Convolutional neural networks Prof. J.C. Kao, UCLA Convolutional neural networks Motivation Biological inspiration Convolution operation Convolutional layer Padding and stride CNN architecture 11-2:

More information

Functional Radial Basis Function Networks (FRBFN)

Functional Radial Basis Function Networks (FRBFN) Bruges (Belgium), 28-3 April 24, d-side publi., ISBN 2-9337-4-8, pp. 313-318 Functional Radial Basis Function Networks (FRBFN) N. Delannay 1, F. Rossi 2, B. Conan-Guez 2, M. Verleysen 1 1 Université catholique

More information

PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 2: PROBABILITY DISTRIBUTIONS

PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 2: PROBABILITY DISTRIBUTIONS PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 2: PROBABILITY DISTRIBUTIONS Parametric Distributions Basic building blocks: Need to determine given Representation: or? Recall Curve Fitting Binary Variables

More information

Data Mining Part 5. Prediction

Data Mining Part 5. Prediction Data Mining Part 5. Prediction 5.5. Spring 2010 Instructor: Dr. Masoud Yaghini Outline How the Brain Works Artificial Neural Networks Simple Computing Elements Feed-Forward Networks Perceptrons (Single-layer,

More information

Robust cartogram visualization of outliers in manifold leaning

Robust cartogram visualization of outliers in manifold leaning Robust cartogram visualization of outliers in manifold leaning Alessandra Tosi and Alfredo Vellido atosi@lsi.upc.edu - avellido@lsi.upc.edu LSI Department - UPC, Barcelona 1 Introduction Goals 2 NLDR methods:

More information

Distributed Clustering and Local Regression for Knowledge Discovery in Multiple Spatial Databases

Distributed Clustering and Local Regression for Knowledge Discovery in Multiple Spatial Databases Distributed Clustering and Local Regression for Knowledge Discovery in Multiple Spatial Databases Aleksandar Lazarevic, Dragoljub Pokrajac, Zoran Obradovic School of Electrical Engineering and Computer

More information

An Inverse Vibration Problem Solved by an Artificial Neural Network

An Inverse Vibration Problem Solved by an Artificial Neural Network TEMA Tend. Mat. Apl. Comput., 6, No. 1 (05), 163-175. c Uma Publicação da Sociedade Brasileira de Matemática Aplicada e Computacional. An Inverse Vibration Problem Solved by an Artificial Neural Network

More information

STUDENT PAPER. Santiago Santana University of Illinois, Urbana-Champaign Blue Waters Education Program 736 S. Lombard Oak Park IL, 60304

STUDENT PAPER. Santiago Santana University of Illinois, Urbana-Champaign Blue Waters Education Program 736 S. Lombard Oak Park IL, 60304 STUDENT PAPER Differences between Stochastic and Deterministic Modeling in Real World Systems using the Action Potential of Nerves. Santiago Santana University of Illinois, Urbana-Champaign Blue Waters

More information

Variational Principal Components

Variational Principal Components Variational Principal Components Christopher M. Bishop Microsoft Research 7 J. J. Thomson Avenue, Cambridge, CB3 0FB, U.K. cmbishop@microsoft.com http://research.microsoft.com/ cmbishop In Proceedings

More information

Cell division takes place next to the RPE. Neuroblastic cells have the capacity to differentiate into any of the cell types found in the mature retina

Cell division takes place next to the RPE. Neuroblastic cells have the capacity to differentiate into any of the cell types found in the mature retina RPE is a monolayer of hexagonal shaped neural epithelial cells that have the same embryological origin as the neural retina. They mature before the neural retina and play a key role in metabolic support

More information

Large Scale Environment Partitioning in Mobile Robotics Recognition Tasks

Large Scale Environment Partitioning in Mobile Robotics Recognition Tasks Large Scale Environment in Mobile Robotics Recognition Tasks Boyan Bonev, Miguel Cazorla {boyan,miguel}@dccia.ua.es Robot Vision Group Department of Computer Science and Artificial Intelligence University

More information

Danjon noticed that the length (cusp to cusp) of the new crescent. moon was less than 180 degrees and suggested that the cause of the

Danjon noticed that the length (cusp to cusp) of the new crescent. moon was less than 180 degrees and suggested that the cause of the From The Observatory, Vol. 125, No. 1187, pp. 227-232, 2005 August EXPLAINING AND CALCULATING THE LENGTH OF THE NEW CRESCENT MOON By A. H. Sultan Physics Department, Sana a University, Yemen Danjon noticed

More information

EE04 804(B) Soft Computing Ver. 1.2 Class 2. Neural Networks - I Feb 23, Sasidharan Sreedharan

EE04 804(B) Soft Computing Ver. 1.2 Class 2. Neural Networks - I Feb 23, Sasidharan Sreedharan EE04 804(B) Soft Computing Ver. 1.2 Class 2. Neural Networks - I Feb 23, 2012 Sasidharan Sreedharan www.sasidharan.webs.com 3/1/2012 1 Syllabus Artificial Intelligence Systems- Neural Networks, fuzzy logic,

More information

Theoretical Computer Science

Theoretical Computer Science Theoretical Computer Science 406 008) 3 4 Contents lists available at ScienceDirect Theoretical Computer Science journal homepage: www.elsevier.com/locate/tcs Discrete sets with minimal moment of inertia

More information

Artificial Neural Networks

Artificial Neural Networks Introduction ANN in Action Final Observations Application: Poverty Detection Artificial Neural Networks Alvaro J. Riascos Villegas University of los Andes and Quantil July 6 2018 Artificial Neural Networks

More information

Nonlinear reverse-correlation with synthesized naturalistic noise

Nonlinear reverse-correlation with synthesized naturalistic noise Cognitive Science Online, Vol1, pp1 7, 2003 http://cogsci-onlineucsdedu Nonlinear reverse-correlation with synthesized naturalistic noise Hsin-Hao Yu Department of Cognitive Science University of California

More information

Viewpoint invariant face recognition using independent component analysis and attractor networks

Viewpoint invariant face recognition using independent component analysis and attractor networks Viewpoint invariant face recognition using independent component analysis and attractor networks Marian Stewart Bartlett University of California San Diego The Salk Institute La Jolla, CA 92037 marni@salk.edu

More information

Ângelo Cardoso 27 May, Symbolic and Sub-Symbolic Learning Course Instituto Superior Técnico

Ângelo Cardoso 27 May, Symbolic and Sub-Symbolic Learning Course Instituto Superior Técnico BIOLOGICALLY INSPIRED COMPUTER MODELS FOR VISUAL RECOGNITION Ângelo Cardoso 27 May, 2010 Symbolic and Sub-Symbolic Learning Course Instituto Superior Técnico Index Human Vision Retinal Ganglion Cells Simple

More information

Limulus. The Neural Code. Response of Visual Neurons 9/21/2011

Limulus. The Neural Code. Response of Visual Neurons 9/21/2011 Crab cam (Barlow et al., 2001) self inhibition recurrent inhibition lateral inhibition - L16. Neural processing in Linear Systems: Temporal and Spatial Filtering C. D. Hopkins Sept. 21, 2011 The Neural

More information

Deep Feedforward Networks. Sargur N. Srihari

Deep Feedforward Networks. Sargur N. Srihari Deep Feedforward Networks Sargur N. srihari@cedar.buffalo.edu 1 Topics Overview 1. Example: Learning XOR 2. Gradient-Based Learning 3. Hidden Units 4. Architecture Design 5. Backpropagation and Other Differentiation

More information

Lecture 7 Artificial neural networks: Supervised learning

Lecture 7 Artificial neural networks: Supervised learning Lecture 7 Artificial neural networks: Supervised learning Introduction, or how the brain works The neuron as a simple computing element The perceptron Multilayer neural networks Accelerated learning in

More information

Correlated Percolation, Fractal Structures, and Scale-Invariant Distribution of Clusters in Natural Images

Correlated Percolation, Fractal Structures, and Scale-Invariant Distribution of Clusters in Natural Images 1016 IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, VOL. 38, NO. 5, MAY 2016 Correlated Percolation, Fractal Structures, and Scale-Invariant Distribution of Clusters in Natural Images

More information

Convex envelopes, cardinality constrained optimization and LASSO. An application in supervised learning: support vector machines (SVMs)

Convex envelopes, cardinality constrained optimization and LASSO. An application in supervised learning: support vector machines (SVMs) ORF 523 Lecture 8 Princeton University Instructor: A.A. Ahmadi Scribe: G. Hall Any typos should be emailed to a a a@princeton.edu. 1 Outline Convexity-preserving operations Convex envelopes, cardinality

More information

Natural Image Statistics

Natural Image Statistics Natural Image Statistics A probabilistic approach to modelling early visual processing in the cortex Dept of Computer Science Early visual processing LGN V1 retina From the eye to the primary visual cortex

More information

Optimization Methods for Machine Learning (OMML)

Optimization Methods for Machine Learning (OMML) Optimization Methods for Machine Learning (OMML) 2nd lecture (2 slots) Prof. L. Palagi 16/10/2014 1 What is (not) Data Mining? By Namwar Rizvi - Ad Hoc Query: ad Hoc queries just examines the current data

More information

Enhanced Fourier Shape Descriptor Using Zero-Padding

Enhanced Fourier Shape Descriptor Using Zero-Padding Enhanced ourier Shape Descriptor Using Zero-Padding Iivari Kunttu, Leena Lepistö, and Ari Visa Tampere University of Technology, Institute of Signal Processing, P.O. Box 553, I-330 Tampere inland {Iivari.Kunttu,

More information

Space-Variant Computer Vision: A Graph Theoretic Approach

Space-Variant Computer Vision: A Graph Theoretic Approach p.1/65 Space-Variant Computer Vision: A Graph Theoretic Approach Leo Grady Cognitive and Neural Systems Boston University p.2/65 Outline of talk Space-variant vision - Why and how of graph theory Anisotropic

More information

Lecture 6: Edge Detection. CAP 5415: Computer Vision Fall 2008

Lecture 6: Edge Detection. CAP 5415: Computer Vision Fall 2008 Lecture 6: Edge Detection CAP 5415: Computer Vision Fall 2008 Announcements PS 2 is available Please read it by Thursday During Thursday lecture, I will be going over it in some detail Monday - Computer

More information

Multivariate statistical methods and data mining in particle physics

Multivariate statistical methods and data mining in particle physics Multivariate statistical methods and data mining in particle physics RHUL Physics www.pp.rhul.ac.uk/~cowan Academic Training Lectures CERN 16 19 June, 2008 1 Outline Statement of the problem Some general

More information

Simple Neural Nets For Pattern Classification

Simple Neural Nets For Pattern Classification CHAPTER 2 Simple Neural Nets For Pattern Classification Neural Networks General Discussion One of the simplest tasks that neural nets can be trained to perform is pattern classification. In pattern classification

More information

(Feed-Forward) Neural Networks Dr. Hajira Jabeen, Prof. Jens Lehmann

(Feed-Forward) Neural Networks Dr. Hajira Jabeen, Prof. Jens Lehmann (Feed-Forward) Neural Networks 2016-12-06 Dr. Hajira Jabeen, Prof. Jens Lehmann Outline In the previous lectures we have learned about tensors and factorization methods. RESCAL is a bilinear model for

More information

A Novel Activity Detection Method

A Novel Activity Detection Method A Novel Activity Detection Method Gismy George P.G. Student, Department of ECE, Ilahia College of,muvattupuzha, Kerala, India ABSTRACT: This paper presents an approach for activity state recognition of

More information

The Perceptron. Volker Tresp Summer 2016

The Perceptron. Volker Tresp Summer 2016 The Perceptron Volker Tresp Summer 2016 1 Elements in Learning Tasks Collection, cleaning and preprocessing of training data Definition of a class of learning models. Often defined by the free model parameters

More information

Vote. Vote on timing for night section: Option 1 (what we have now) Option 2. Lecture, 6:10-7:50 25 minute dinner break Tutorial, 8:15-9

Vote. Vote on timing for night section: Option 1 (what we have now) Option 2. Lecture, 6:10-7:50 25 minute dinner break Tutorial, 8:15-9 Vote Vote on timing for night section: Option 1 (what we have now) Lecture, 6:10-7:50 25 minute dinner break Tutorial, 8:15-9 Option 2 Lecture, 6:10-7 10 minute break Lecture, 7:10-8 10 minute break Tutorial,

More information

Learning and Memory in Neural Networks

Learning and Memory in Neural Networks Learning and Memory in Neural Networks Guy Billings, Neuroinformatics Doctoral Training Centre, The School of Informatics, The University of Edinburgh, UK. Neural networks consist of computational units

More information

Elements of Neurogeometry

Elements of Neurogeometry Lecture Notes in Morphogenesis Series Editor: Alessandro Sarti Jean Petitot Elements of Neurogeometry Functional Architectures of Vision http://www.springer.com/978-3-319-65589-5 Contents 1 Preface...

More information

Generalized Laplacian as Focus Measure

Generalized Laplacian as Focus Measure Generalized Laplacian as Focus Measure Muhammad Riaz 1, Seungjin Park, Muhammad Bilal Ahmad 1, Waqas Rasheed 1, and Jongan Park 1 1 School of Information & Communications Engineering, Chosun University,

More information

Deep Learning for Gravitational Wave Analysis Results with LIGO Data

Deep Learning for Gravitational Wave Analysis Results with LIGO Data Link to these slides: http://tiny.cc/nips arxiv:1711.03121 Deep Learning for Gravitational Wave Analysis Results with LIGO Data Daniel George & E. A. Huerta NCSA Gravity Group - http://gravity.ncsa.illinois.edu/

More information

Convolutional Associative Memory: FIR Filter Model of Synapse

Convolutional Associative Memory: FIR Filter Model of Synapse Convolutional Associative Memory: FIR Filter Model of Synapse Rama Murthy Garimella 1, Sai Dileep Munugoti 2, Anil Rayala 1 1 International Institute of Information technology, Hyderabad, India. rammurthy@iiit.ac.in,

More information

Feature extraction: Corners and blobs

Feature extraction: Corners and blobs Feature extraction: Corners and blobs Review: Linear filtering and edge detection Name two different kinds of image noise Name a non-linear smoothing filter What advantages does median filtering have over

More information

Hierarchy. Will Penny. 24th March Hierarchy. Will Penny. Linear Models. Convergence. Nonlinear Models. References

Hierarchy. Will Penny. 24th March Hierarchy. Will Penny. Linear Models. Convergence. Nonlinear Models. References 24th March 2011 Update Hierarchical Model Rao and Ballard (1999) presented a hierarchical model of visual cortex to show how classical and extra-classical Receptive Field (RF) effects could be explained

More information

A note on the Hausdorff distance between Atanassov s intuitionistic fuzzy sets

A note on the Hausdorff distance between Atanassov s intuitionistic fuzzy sets NIFS Vol. 15 (2009), No. 1, 1 12 A note on the Hausdorff distance between Atanassov s intuitionistic fuzzy sets Eulalia Szmidt and Janusz Kacprzyk Systems Research Institute, Polish Academy of Sciences

More information

RECONSTRUCTION AND PATTERN RECOGNITION VIA THE CITTI- PETITOT-SARTI MODEL

RECONSTRUCTION AND PATTERN RECOGNITION VIA THE CITTI- PETITOT-SARTI MODEL RECONSTRUCTION AND PATTERN RECOGNITION VIA THE CITTI- PETITOT-SARTI MODEL DARIO PRANDI, J.P. GAUTHIER, U. BOSCAIN LSIS, UNIVERSITÉ DE TOULON & ÉCOLE POLYTECHNIQUE, PARIS SÉMINAIRE STATISTIQUE ET IMAGERIE

More information

Derived Distance: Beyond a model, towards a theory

Derived Distance: Beyond a model, towards a theory Derived Distance: Beyond a model, towards a theory 9.520 April 23 2008 Jake Bouvrie work with Steve Smale, Tomaso Poggio, Andrea Caponnetto and Lorenzo Rosasco Reference: Smale, S., T. Poggio, A. Caponnetto,

More information

Parameter selection for region-growing image segmentation algorithms using spatial autocorrelation

Parameter selection for region-growing image segmentation algorithms using spatial autocorrelation International Journal of Remote Sensing Vol. 27, No. 14, 20 July 2006, 3035 3040 Parameter selection for region-growing image segmentation algorithms using spatial autocorrelation G. M. ESPINDOLA, G. CAMARA*,

More information

Emergence of Phase- and Shift-Invariant Features by Decomposition of Natural Images into Independent Feature Subspaces

Emergence of Phase- and Shift-Invariant Features by Decomposition of Natural Images into Independent Feature Subspaces LETTER Communicated by Bartlett Mel Emergence of Phase- and Shift-Invariant Features by Decomposition of Natural Images into Independent Feature Subspaces Aapo Hyvärinen Patrik Hoyer Helsinki University

More information

On The Equivalence of Hierarchical Temporal Memory and Neural Nets

On The Equivalence of Hierarchical Temporal Memory and Neural Nets On The Equivalence of Hierarchical Temporal Memory and Neural Nets Bedeho Mesghina Wolde Mender December 7, 2009 Abstract In this paper we present a rigorous definition of classification in a common family

More information

OBJECT DETECTION AND RECOGNITION IN DIGITAL IMAGES

OBJECT DETECTION AND RECOGNITION IN DIGITAL IMAGES OBJECT DETECTION AND RECOGNITION IN DIGITAL IMAGES THEORY AND PRACTICE Bogustaw Cyganek AGH University of Science and Technology, Poland WILEY A John Wiley &. Sons, Ltd., Publication Contents Preface Acknowledgements

More information

Application of Artificial Neural Networks in Evaluation and Identification of Electrical Loss in Transformers According to the Energy Consumption

Application of Artificial Neural Networks in Evaluation and Identification of Electrical Loss in Transformers According to the Energy Consumption Application of Artificial Neural Networks in Evaluation and Identification of Electrical Loss in Transformers According to the Energy Consumption ANDRÉ NUNES DE SOUZA, JOSÉ ALFREDO C. ULSON, IVAN NUNES

More information

Classifying Galaxy Morphology using Machine Learning

Classifying Galaxy Morphology using Machine Learning Julian Kates-Harbeck, Introduction: Classifying Galaxy Morphology using Machine Learning The goal of this project is to classify galaxy morphologies. Generally, galaxy morphologies fall into one of two

More information

Gatsby Theoretical Neuroscience Lectures: Non-Gaussian statistics and natural images Parts III-IV

Gatsby Theoretical Neuroscience Lectures: Non-Gaussian statistics and natural images Parts III-IV Gatsby Theoretical Neuroscience Lectures: Non-Gaussian statistics and natural images Parts III-IV Aapo Hyvärinen Gatsby Unit University College London Part III: Estimation of unnormalized models Often,

More information

International Journal "Information Theories & Applications" Vol.14 /

International Journal Information Theories & Applications Vol.14 / International Journal "Information Theories & Applications" Vol.4 / 2007 87 or 2) Nˆ t N. That criterion and parameters F, M, N assign method of constructing sample decision function. In order to estimate

More information

Automated Segmentation of Low Light Level Imagery using Poisson MAP- MRF Labelling

Automated Segmentation of Low Light Level Imagery using Poisson MAP- MRF Labelling Automated Segmentation of Low Light Level Imagery using Poisson MAP- MRF Labelling Abstract An automated unsupervised technique, based upon a Bayesian framework, for the segmentation of low light level

More information

Neural Networks for Machine Learning. Lecture 2a An overview of the main types of neural network architecture

Neural Networks for Machine Learning. Lecture 2a An overview of the main types of neural network architecture Neural Networks for Machine Learning Lecture 2a An overview of the main types of neural network architecture Geoffrey Hinton with Nitish Srivastava Kevin Swersky Feed-forward neural networks These are

More information

Basic Concepts of. Feature Selection

Basic Concepts of. Feature Selection Basic Concepts of Pattern Recognition and Feature Selection Xiaojun Qi -- REU Site Program in CVMA (2011 Summer) 1 Outline Pattern Recognition Pattern vs. Features Pattern Classes Classification Feature

More information

Machine Learning for Signal Processing Bayes Classification and Regression

Machine Learning for Signal Processing Bayes Classification and Regression Machine Learning for Signal Processing Bayes Classification and Regression Instructor: Bhiksha Raj 11755/18797 1 Recap: KNN A very effective and simple way of performing classification Simple model: For

More information

Small sample size generalization

Small sample size generalization 9th Scandinavian Conference on Image Analysis, June 6-9, 1995, Uppsala, Sweden, Preprint Small sample size generalization Robert P.W. Duin Pattern Recognition Group, Faculty of Applied Physics Delft University

More information

Feature Extraction and Image Processing

Feature Extraction and Image Processing Feature Extraction and Image Processing Second edition Mark S. Nixon Alberto S. Aguado :*авш JBK IIP AMSTERDAM BOSTON HEIDELBERG LONDON NEW YORK OXFORD PARIS SAN DIEGO SAN FRANCISCO SINGAPORE SYDNEY TOKYO

More information

Map automatic scale reduction by means of mathematical morphology

Map automatic scale reduction by means of mathematical morphology Map automatic scale reduction by means of mathematical morphology Luiz Alberto Vieira Dias 1 Júlio Cesar Lima d Alge 2 Fernando Augusto Mitsuo Ii 2 Sílvia Shizue Ii 2 National Institute for Space Research

More information

Chapter 9: The Perceptron

Chapter 9: The Perceptron Chapter 9: The Perceptron 9.1 INTRODUCTION At this point in the book, we have completed all of the exercises that we are going to do with the James program. These exercises have shown that distributed

More information

Using Variable Threshold to Increase Capacity in a Feedback Neural Network

Using Variable Threshold to Increase Capacity in a Feedback Neural Network Using Variable Threshold to Increase Capacity in a Feedback Neural Network Praveen Kuruvada Abstract: The article presents new results on the use of variable thresholds to increase the capacity of a feedback

More information

A New Efficient Method for Producing Global Affine Invariants

A New Efficient Method for Producing Global Affine Invariants A New Efficient Method for Producing Global Affine Invariants Esa Rahtu, Mikko Salo 2, and Janne Heikkilä Machine Vision Group, Department of Electrical and Information Engineering, P.O. Box 45, 94 University

More information

Tutorial on Methods for Interpreting and Understanding Deep Neural Networks. Part 3: Applications & Discussion

Tutorial on Methods for Interpreting and Understanding Deep Neural Networks. Part 3: Applications & Discussion Tutorial on Methods for Interpreting and Understanding Deep Neural Networks W. Samek, G. Montavon, K.-R. Müller Part 3: Applications & Discussion ICASSP 2017 Tutorial W. Samek, G. Montavon & K.-R. Müller

More information

Neural Networks Introduction

Neural Networks Introduction Neural Networks Introduction H.A Talebi Farzaneh Abdollahi Department of Electrical Engineering Amirkabir University of Technology Winter 2011 H. A. Talebi, Farzaneh Abdollahi Neural Networks 1/22 Biological

More information

A Neural Qualitative Approach for Automatic Territorial Zoning

A Neural Qualitative Approach for Automatic Territorial Zoning A Neural Qualitative Approach for Automatic Territorial Zoning R. J. S. Maciel 1, M. A. Santos da Silva *2, L. N. Matos 3 and M. H. G. Dompieri 2 1 Brazilian Agricultural Research Corporation, Postal Code

More information

Estimation of information-theoretic quantities

Estimation of information-theoretic quantities Estimation of information-theoretic quantities Liam Paninski Gatsby Computational Neuroscience Unit University College London http://www.gatsby.ucl.ac.uk/ liam liam@gatsby.ucl.ac.uk November 16, 2004 Some

More information