Neuro-based Olfactory Model for Artificial Organoleptic Tests

Zu Soh, Toshio Tsuji, Noboru Takiguchi, and Hisao Ohtake
Graduate School of Engineering, Hiroshima Univ., Hiroshima, Japan
Graduate School of Advanced Sciences of Matter, Hiroshima Univ., Hiroshima, Japan
Graduate School of Engineering, Osaka Univ., Osaka, Japan

Abstract

Recently, the demand for odor processing apparatus in the fragrance and food industries has increased. In this paper, we construct a neural network model of the olfactory system as a basis for artificial organoleptic tests that combines the advantages of both human sensory evaluation and machine olfaction. The simulation results indicate that the model can predict odor coding on the glomeruli by appropriately adjusting the parameters involved. Further, the model can simulate the feature extraction ability known as Attention.

1 Introduction

As considerable evidence has been presented to show that odors affect memory and emotions [1], the importance of odors has begun to be recognized beyond their role as components of flavor. For this reason, the demand for odor processing apparatus in the fragrance and food industries is increasing [2]. Two methods of odor assessment developed so far are the sensory evaluation method and machine olfaction [2]. Generally, the sensory evaluation method is employed because it is based on the characteristics of human perception, although individual differences due to factors such as personal preference or physical condition can affect the evaluation results. Machine olfaction, in contrast, is an objective assessment method; nevertheless, it tends to ignore the nature of odor perception. Accordingly, a novel odor assessment method that combines the advantages of both human sensory evaluation and machine olfaction, making it suitable for artificial organoleptic tests, is required.
To develop such a method, it is first necessary to predict how odorant information is coded in the brain, in order to obtain the perception characteristics of animals. The mechanisms of feature extraction from the neuro-coded odor information must then also be predicted. Although the ideal scenario would be to analyze the human olfactory system, current biological knowledge regarding the coding of odorant information in the human brain remains limited. Given this restriction, the present study focuses on the olfactory system of mice.

An odor is a combination of more than 400,000 kinds of odorant molecules. Mice have approximately 1,000 kinds of odorant receptors, each of which is responsible for detecting a specific group of odorant molecules [3]. The outputs of the receptor neurons evoke an odor-specific activity pattern on the glomeruli [4, 5]. As this activity pattern represents fundamental information for odor recognition, it is considered closely linked to the characteristics of perception. We previously constructed a neural network model [6] using biological data on glomerular activity patterns [5] as inputs, and attempted to simulate the perception characteristics of mice. The model also employed a feature extraction mechanism for Attention [7], by which mice focus on some of the important molecules in odors. The results of the computer simulation were compared with those of behavioral experiments [7], and it was confirmed that the model could predict the discrimination ability of mice. However, the scope of our model was limited to odorants with known glomerular activity patterns.

In this paper, we report on an extension of the model that enables the prediction of glomerular activity patterns from odorant properties. For this purpose, a 3-layered feed-forward neural network was added and trained using a known biological data set. The predicted activity patterns were then used as the input of the model for further Attention processing.
The simulation results were then examined through comparison with the odor discrimination rates obtained from behavioral experiments on mice.

ISAROB

Figure 1: The olfactory system of mice (odorants, glomeruli, mitral cells, and granule cells in the olfactory bulb; anterior and posterior piriform cortex).

2 Biological insight

2.1 Olfactory system of mice

Fig. 1 shows the basic structure of the olfactory system in mice, which consists of three parts: the receptor neurons, the olfactory bulb, and the piriform cortex. Receptor neurons distributed on the surface of the nasal chamber each express a single receptor gene from among 1,000 different ones, and bind to specific odorants [3]. When odorant molecules bind to a receptor, its neuron is activated and sends signals to the olfactory bulb. The axons from receptors that express the same gene terminate at the same point on the surface of the olfactory bulb [4]. The terminals of these axons form a small, round cluster called a glomerulus. A 2D map of the glomerular distribution can be associated with receptor genes as well as odorants, and is thus called an odor map [4].

Besides the glomeruli, mitral cells and granule cells are the principal neurons in the olfactory bulb. Signals from the glomeruli are input to the mitral cells, which are interconnected via excitatory synapses. The granule cells receive inputs from the mitral cells and send inhibitory signals back to them. In general, the olfactory bulb is considered to perform feature extraction [8]. The mitral cells transmit signals to the pyramidal cells in the piriform cortex, which in turn transmit signals back to the granule cells in the olfactory bulb and thereby indirectly inhibit the mitral cells. The piriform cortex is divided into the anterior piriform cortex (APC) and the posterior piriform cortex (PPC); the division of their functions remains somewhat unclear. Generally, the piriform cortex is believed to be responsible for the identification of odors [9].

2.2 Attention mechanism in the olfactory system

Okuhara et al. [7] conducted a series of odor discrimination experiments on mice.
Figure 2: Results of the odor discrimination experiment.

First, the mice were trained to select a rewarded odor, such as [A, B, C], composed of three types of odorant (the odorant symbols are rendered here as the generic labels A, B, and C). They were then required to discriminate among other odors that contained elements of odor [A, B, C], such as [A] or [A, B]. Fig. 2 outlines the results of the odor discrimination experiment performed with 10 mice, and indicates that most of them had difficulty in discriminating between [A, B] and [A, B, C]. This implies that they focused on the combination of the odorants [A] and [B] when learning the odor [A, B, C]. This mechanism is called Attention, and it contributes significantly to the odor perception characteristics of mice as described above.

3 Model of the olfactory system of mice

3.1 Structure of the proposed model

Fig. 3 shows the structure of the proposed neural network model, which consists of three parts: odor reception, the olfactory bulb, and the piriform cortex. The odor reception model is a feed-forward neural network of 3 layers: the preprocessing layer (l = 1), the odorant layer (l = 2), and the receptor layer (l = 3). The neuron populations of these layers (denoted ^lN for layer l) are ^1N = 80, ^2N = 500, and ^3N = 1,805, respectively. The olfactory bulb model consists of the Glomerular layer (l = 4), the Mitral layer (l = 5), and the Granule layer (l = 6). The neuron populations in the olfactory bulb are ^4N = ^5N = ^6N = 1,805. These populations were determined based on the actual number of glomeruli distributed on the olfactory bulb [10]. The piriform cortex model consists of an APC layer (l = 7) and a PPC layer (l = 8), corresponding to the anterior piriform cortex and the posterior piriform cortex, respectively. The neuron populations of the APC and PPC layers are ^7N = 1,000 and ^8N = 100, respectively.
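For concreteness, the layer structure just described can be collected into a small configuration table. The Python layout below is our own sketch; the layer names and population sizes are taken from the text:

```python
# Layer populations ^lN of the proposed model, as listed in the text.
# The dictionary layout itself is illustrative; the paper defines no code.
LAYERS = {
    1: ("preprocessing", 80),
    2: ("odorant", 500),
    3: ("receptor", 1805),
    4: ("glomerular", 1805),   # one neuron per glomerulus lattice cell
    5: ("mitral", 1805),
    6: ("granule", 1805),
    7: ("APC", 1000),          # anterior piriform cortex
    8: ("PPC", 100),           # posterior piriform cortex
    9: ("Y", 1),               # artificially introduced output neuron
}

def layer_size(l):
    """Return the neuron population ^lN of layer l."""
    return LAYERS[l][1]
```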

Figure 3: Structure of the proposed model (odor reception layers l = 1-3, olfactory bulb layers l = 4-6, piriform cortex layers l = 7-8, and output layer Y). The activity pattern of the glomeruli is cited from the literature [5].

The connections between the layers in the olfactory bulb model and the piriform cortex model are set up based on the structure of the olfactory system described in Section 2.1, with the exception that the interconnections within each layer are not included in the model, for simplification. In addition, a single-neuron layer Y (l = 9) is artificially introduced as the output of the model.

The model takes the odorant properties as input through the preprocessing layer. Since there are approximately 400,000 kinds of odorant (forming extremely high-dimensional information), it is impossible to input the odorant information to the model through binary coding. The properties of each odorant are therefore broken down into 16 numeric properties as listed by Johnson et al. [11] along with their corresponding activity patterns.

In order to normalize properties with different units and orders of magnitude, the preprocessing layer converts the value of each property into a number of activated neurons. This method is based on the concept of population coding [12]. The neurons in the preprocessing layer are divided into 16 groups, each of which receives a different kind of odorant property. The input to the neurons in the preprocessing layer is given by the following equation:

u_s(t) = P_i(m_O),  (s = (i-1)K + k,  k = 1, 2, ..., K),  (1)

where u_s(t) is the input to the s-th neuron in the preprocessing layer (l = 1) at time step t, m_O is the m-th odorant in odor O, P_i is the i-th numeric property of the odorant m_O, and K is the maximum number of neurons responsible for property P_i. The activities of the neurons are given by the following sigmoid function:

U_s(t) = 1 / (1 + exp{-ε_s(u_s(t) - θ_s)}).  (2)

The outputs of the sigmoid neurons in the other layers are also calculated using the above equation. The threshold θ_s and the gradient ε_s of the sigmoid function are determined according to the corresponding property by the following equations:

θ_s = k(P_i,max - P_i,min)/K,  (3)

ε_s = C_s(P_i,max - P_i,min)/K,  (4)

where P_i,max and P_i,min are the maximum and minimum values of the property P_i in the odorant data set, and C_s is a constant.

The output of the preprocessing layer (l = 1) is input to the odorant layer (l = 2) through a connective weight matrix ^21W. The input to the odorant layer is given by the following equation:

^2u_n(t) = Σ_s ^21w_ns(t) ^1U_s(t),  (5)

where ^2u_n(t) is the input to the n-th neuron in the odorant layer, and ^21w_ns(t) is the connective weight between neuron units n and s, which is an element of the connective weight matrix ^21W. The output of the odorant layer is input to the receptor layer (l = 3) through ^32W in the same manner as in equation (5).

The output of the receptor layer (l = 3) is passed to the Glomerular layer (l = 4). According to Lin et al. [13], the glomerular activity of an odorant mixture O can be represented by a binary addition of the activities evoked by its odorant components. Thus, the input to the Glomerular layer is determined by the following equation:

^4u_e(t) = max[ ^3U_r(t)|_1, ..., ^3U_r(t)|_M ],  (6)

where ^3U_r(t)|_m is the receptor-layer output evoked by the m-th of the M odorant components of odor O.
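To make the preprocessing concrete, the sketch below implements eqs. (1)-(4) for a single property group and the component-wise maximum of eq. (6). The constant C and the measurement of P relative to P_min are our own illustrative choices, not values fixed by the paper:

```python
import math

def sigmoid(u, eps, theta):
    # Equation (2): U = 1 / (1 + exp(-eps * (u - theta)))
    return 1.0 / (1.0 + math.exp(-eps * (u - theta)))

def population_code(P, P_min, P_max, K, C=10.0):
    """Population coding of one odorant property (eqs. (1)-(4), sketch).

    All K neurons of a property group receive the same value P; their
    thresholds theta_k = k*(P_max - P_min)/K are spread across the
    property range, so the NUMBER of strongly activated neurons encodes
    the magnitude of P. C stands in for the constant C_s of eq. (4).
    """
    span = (P_max - P_min) / K
    eps = C * span                      # eq. (4): eps = C*(P_max - P_min)/K
    # For simplicity the property is measured from P_min (an assumption).
    return [sigmoid(P - P_min, eps, k * span) for k in range(1, K + 1)]

def glomerular_input(component_patterns):
    # Equation (6): a mixture's glomerular input is the element-wise
    # maximum over the patterns evoked by its odorant components.
    return [max(vals) for vals in zip(*component_patterns)]
```

A larger property value recruits more neurons of the group, which is the normalization the text attributes to population coding.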

The output ^4U_e(t), which is calculated by equation (2), is an element of the activity pattern vector ^4U(t). Each element corresponds to a divided lattice cell of the activity patterns provided by Johnson et al. [5], as shown in Fig. 3. The output of the Glomerular layer is input to the Mitral layer on a one-to-one basis. The detailed structure of the olfactory bulb and piriform cortex parts is described in our previous paper [6], where the inputs and outputs of each layer are determined in the same manner as described above.

The newly introduced Y layer (l = 9) provides the output of the model. Its output is determined from the input from the Mitral layer by the following equation:

^9U(t) = exp{-^9ε(^9u(t) - ^9θ)} / (1 + exp{-^9ε(^9u(t) - ^9θ)}).  (7)

To predict the glomerular activity patterns and the perception characteristics affected by Attention, the connective weights appearing in each equation must be appropriately adjusted. The next subsection describes the learning algorithm of the model.

3.2 Algorithm of the learning phase

The learning algorithm consists of 2 steps, whose details are described in this subsection.

3.2.1 The 1st step of the learning phase

In the 1st step, the connective weights ^21W and ^32W are adjusted for accurate prediction of the activity patterns of the Glomerular layer from the input odorant properties given by the training set. The connective weights are adjusted to minimize the error energy E:

E = (1/^4N) Σ_i e_i = (1/^4N) Σ_i (^4U_i - a_i)²,  (8)

where e_i is the mean square error (MSE), ^4U_i is the output of the Glomerular layer, and a_i is the activity of the actual glomeruli in the i-th lattice cell. For the implementation of the weight adjustment, the RPROP algorithm proposed by Riedmiller et al. [14] is utilized. This algorithm allows fast error convergence with a reasonable computer memory requirement. The connective weights are iteratively adjusted until a preset maximum iteration number is reached.
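RPROP adapts an individual step size for each weight from the sign of the error gradient alone. A minimal single-weight sketch of one such variant follows; the constants (growth 1.2, shrink 0.5, and the step-size bounds) are RPROP's commonly used defaults, not values taken from the paper:

```python
def rprop_step(w, grad, prev_grad, delta,
               eta_plus=1.2, eta_minus=0.5,
               delta_min=1e-6, delta_max=50.0):
    """One RPROP update for a single weight (sketch).

    The step size `delta` grows while the gradient keeps its sign and
    shrinks when the sign flips; the weight then moves opposite the
    gradient by `delta`, independent of the gradient's magnitude.
    """
    if grad * prev_grad > 0:            # same direction: accelerate
        delta = min(delta * eta_plus, delta_max)
    elif grad * prev_grad < 0:          # sign flipped: we overshot, back off
        delta = max(delta * eta_minus, delta_min)
        grad = 0.0                      # suppress adaptation on the next step
    if grad > 0:
        w -= delta
    elif grad < 0:
        w += delta
    return w, grad, delta
```

Iterating this update with the gradient of a simple quadratic error, e.g. g = 2w for E = w², drives w toward the minimum with no hand-tuned learning rate, which matches the fast, memory-light convergence the text attributes to the algorithm.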
3.2.2 The 2nd step of the learning phase

In the 2nd step, the connective weights in the olfactory bulb and the piriform cortex are modulated based on the algorithm proposed in [6]. Since most of the computational functions of the olfactory system, especially the connection from the piriform cortex to the olfactory bulb, are not yet clearly understood, the signal transduction and connective weight modulation are hypothesized based on the odor discrimination experiment outlined in Section 2.2 [7]. We assume that the connection from the piriform cortex to the olfactory bulb plays a role in extracting the most activated regions of the glomeruli. Under these assumptions, we propose a learning algorithm that consists of the 3 steps outlined below.

In the 1st step, the connective weights ^75W and ^67W are modulated to subtract the background activity from the activated part of the Mitral layer using the following equations:

^75W_zb(t+1) = α ^75W_zb(t) + β ^5U_b(t)|_A ^7U_z(t)|_A,  (9)

^67W_gz(t+1) = α ^67W_gz(t) + β ^7U_z(t)|_A ^5U_b(t)|_back,  (10)

where α denotes the forgetting term, β is the learning rate, and |_A and |_back denote the activity evoked by odor A and the background activity, respectively. As a result, when an odor A is input through the connective weights ^75W(t) and ^67W(t), the Granule layer can inhibit the background activity of the Mitral layer.

In the 2nd step, the most activated neurons in the Glomerular layer are extracted. The connective weights between the Mitral layer and the PPC layer are assumed to form a competitive system, and the corresponding connective weights are adjusted according to Amari et al. [15].

In the 3rd step, the activity pattern of the Mitral layer resulting from the 2nd step is memorized by adjusting the connective weights ^95W as in the following equation:

^95W_b(t) = ^5U_b(t)|_A.  (11)

This adjustment enables the model to compare the features of the memorized odor with those of the input odors using the comparison algorithm described in the next subsection.
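The updates in eqs. (9) and (10) share one form: exponential forgetting of the old weight plus a Hebbian product of the two activities the weight connects. A one-weight sketch, with illustrative values for α and β:

```python
def update_feedback_weight(w, pre, post, alpha=0.9, beta=0.1):
    """One update of the form shared by eqs. (9)-(10):

        w(t+1) = alpha * w(t) + beta * pre * post

    alpha is the forgetting term and beta the learning rate from the
    text; the numeric defaults here are illustrative, not from the paper.
    """
    return alpha * w + beta * pre * post
```

Repeated presentation of the same activity pair drives the weight toward the fixed point beta * pre * post / (1 - alpha), so feedback weights settle on consistently co-active (attended) activity while uncorrelated background contributions decay.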
3.3 Comparison algorithm

In the comparison phase, an arbitrary odor B is input to the model. The input to the Y layer can then be calculated as follows:

^9u(t)|_B = Σ_b ^95W_b(t) ^5U_b(t)|_B = Σ_b ^5U_b(t)|_A ^5U_b(t)|_B.  (12)
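Under the memorization rule of eq. (11), the comparison of eqs. (12) and (7) reduces to an inner product passed through a reversed sigmoid. A minimal sketch, with illustrative values for ε and θ:

```python
import math

def dissimilarity(memorized, current, eps=1.0, theta=0.0):
    """Comparison-phase sketch (eqs. (12) and (7)).

    With ^95W set to the memorized Mitral pattern for odor A (eq. (11)),
    the Y-layer input is the inner product of the memorized and current
    patterns (eq. (12)); eq. (7) then maps a LARGE correlation to a SMALL
    output, i.e. an index of dissimilarity. eps and theta are illustrative.
    """
    u = sum(a * b for a, b in zip(memorized, current))  # eq. (12)
    z = math.exp(-eps * (u - theta))
    return z / (1.0 + z)                                # eq. (7)
```

Identical Mitral patterns give a high correlation and hence a low Y output, while unrelated patterns give an output near 0.5, matching the reading of the Y output as a dissimilarity index that predicts discriminability.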

Figure 4: Simulation results for aldehydes (pentanal, hexanal, heptanal, octanal, nonanal, and decanal); the rows show the input odorants, the activity patterns of living mice (Johnson et al.), the output of the model, and the MSEs. Odorants with a gray background are training data.

Accordingly, calculating the input to the Y layer is equivalent to calculating the correlation between the current output ^5U_b(t)|_B and the memorized output ^5U_b(t)|_A of the Mitral layer. The correlation is then converted to an index of dissimilarity by equation (7). It can be assumed that mice tend to make wrong decisions when the outputs of the Mitral layer are similar. Accordingly, the output of the model is considered to correspond to the results of the odor discrimination experiments on mice [7].

4 Simulation

This section describes the simulations performed based on the algorithms described in the previous section.

4.1 The 1st step of the simulation

First, data on 70 odorants were selected from the 365 odorants provided by Johnson et al. [11] as a training data set. Then, 3 odorants with identical structures but different carbon numbers were chosen and added to the set. Fig. 4 shows the outputs of the model after training is completed. In Fig. 4, the molecules in the uppermost row are the input odorants, followed by the actual activity patterns [5], the output of the model, and a graph of the mean square errors (MSEs). The odorants with a gray background are included in the training set, while those with no background color are untrained odorants.

This figure indicates that the model successfully predicted the activity patterns in the training data set with low prediction error (MSE). The predicted activity patterns of untrained odorants were also close to the actual activity patterns. Consequently, the model is capable of predicting the tendency of the activated parts of the glomeruli to shift continuously with the carbon number [5].

Figure 5: Comparison between the results of the behavioral experiments and the outputs of the model.

4.2 The 2nd step of the simulation

In the learning phase, an odor [A, B, C], representing an odorant mixture composed of isoamyl acetate, citral, and ethyl butyrate, is input to the proposed model. The connective weights are then adjusted according to the learning algorithm described in Section 3.2. The initial values of the connective weights are uniform random values ranging between -10⁻⁵ and 10⁻⁵. After the learning phase, 6 different odors [A], [B], [C], [A, B], [A, C], and [B, C] are input to the model. Then, the output of the neuron in the Y layer is compared with the correct rates in the odor discrimination experiments on mice. In this step, the connective weights determined in the 1st step of the simulation are held fixed.

The outputs of the Glomerular layer for each odor are shown in the bottom row of Fig. 5. The outputs of the Mitral layer after the 2nd learning phase are shown in the middle row, and the outputs of the neuron in the Y layer are plotted at the top. The discrimination rates obtained from the odor discrimination experiment on mice are also plotted beside the output of the Y layer. Comparing the activity patterns of the Glomerular layer with those of the Mitral layer shows that the activated region becomes narrower but its activity becomes stronger, which means that the most activated regions in

the Glomerular layer were extracted. Fig. 5 indicates a similar tendency between the correct rates and the output of the neuron in the Y layer: the higher the output, the higher the correct rate. From these results, we can conclude that the model is capable of accounting for the perception characteristics of mice to a certain extent through the assumed Attention mechanism.

5 Conclusion

In this paper, we proposed a neural network model of the olfactory system of mice. Utilizing this model, we tried to predict the activity patterns evoked in the glomeruli by odorants. The simulation results indicated that the model was capable of predicting the activity patterns of untrained odors with different carbon numbers, and that its outputs were consistent with those of odor discrimination experiments on mice. This ability to predict perception characteristics makes the model suitable as a basis for artificial organoleptic tests. However, odors in nature are composed of odorants at different concentrations, which is not accounted for in the proposed model. Future studies must therefore address the manner of odor coding for different concentrations. In addition, since the simulations were performed only on the odorants [A], [B], and [C] using limited experimental data, further behavioral experiments and simulations need to be performed on other odorants to verify the ability of the model.

Acknowledgments. This work was partially supported by the 21st Century COE Program of JSPS (the Japan Society for the Promotion of Science) on Hyper Human Technology toward the 21st Century Industrial Revolution.

References

[1] R.S. Herz and T. Engen, Odor memory: review and analysis, Psychonomic Bulletin and Review, Vol. 3, No. 3, 1996.
[2] R. Gutierrez-Osuna, Pattern Analysis for Machine Olfaction: A Review, IEEE Sensors Journal, Vol. 2, No. 3, 2002.
[3] L. Buck and R. Axel, A novel multigene family may encode odorant receptors: a molecular basis for odor recognition, Cell, Vol. 65, pp. 175-187, 1991.
[4] K. Mori and Y. Yoshihara, Molecular recognition and olfactory processing in the mammalian olfactory system, Progress in Neurobiology, Vol. 45, 1995.
[5] B.A. Johnson and M. Leon, Modular glomerular representations of odorants in the rat olfactory bulb and the effects of stimulus concentration, The Journal of Comparative Neurology, Vol. 422, 2000.
[6] Z. Soh, T. Tsuji, N. Takiguchi, and H. Ohtake, A neural network model of the olfactory system of mice: Simulating the tendency of attention behavior, The 13th International Symposium on Artificial Life and Robotics, 2008.
[7] K. Okuhara and T. Nakamura, Explore algorithms in olfactory system of mice, Software Biology, Vol. 3, pp. 20-25 (in Japanese).
[8] O. Hoshino, Y. Kashimori, and T. Kambara, Feature extraction from mixed odor stimuli based on spatio-temporal representation of odors in olfactory bulb, International Conference on Neural Networks, Vol. 1, 1997.
[9] D.A. Wilson, Receptive Fields in the Rat Piriform Cortex, Chemical Senses, Vol. 26, No. 5, 2001.
[10] B.A. Johnson, H. Farahbod, and Z. Xu, Local and global chemotopic organization: general features of the glomerular representations of aliphatic odorants differing in carbon number, The Journal of Comparative Neurology, Vol. 480, No. 2, 2004.
[11] M. Leon and B.A. Johnson, Glomerular Response Archive.
[12] A.P. Georgopoulos, A.B. Schwartz, and R.E. Kettner, Neuronal population coding of movement direction, Science, Vol. 233, No. 4771, 1986.
[13] D.Y. Lin, S.D. Shea, and L.C. Katz, Representation of natural stimuli in the rodent main olfactory bulb, Neuron, Vol. 50, No. 6, 2006.
[14] M. Riedmiller and H. Braun, A direct adaptive method for faster backpropagation learning: The RPROP algorithm, Proceedings of the IEEE International Conference on Neural Networks, San Francisco, 1993.
[15] S. Amari and M.A. Arbib, Competition and cooperation in neural nets, Systems Neuroscience, Academic Press, pp. 167-171, 1977.


More information

"That which we call a rose by any other name would smell as sweet": How the Nose knows!

That which we call a rose by any other name would smell as sweet: How the Nose knows! "That which we call a rose by any other name would smell as sweet": How the Nose knows! Nobel Prize in Physiology or Medicine 2004 Sayanti Saha, Parvathy Ramakrishnan and Sandhya S Vis'Wes'Wariah Richard

More information

Artificial Neural Networks. Edward Gatt

Artificial Neural Networks. Edward Gatt Artificial Neural Networks Edward Gatt What are Neural Networks? Models of the brain and nervous system Highly parallel Process information much more like the brain than a serial computer Learning Very

More information

Biologically inspired signal transformations and neural classification of odours in an electronic nose

Biologically inspired signal transformations and neural classification of odours in an electronic nose Biologically inspired signal transformations and neural classification of odours in an electronic nose Bashan Naidoo School of Electrical, Electronic & Computer Engineering University of Natal, 4041, South

More information

Marr's Theory of the Hippocampus: Part I

Marr's Theory of the Hippocampus: Part I Marr's Theory of the Hippocampus: Part I Computational Models of Neural Systems Lecture 3.3 David S. Touretzky October, 2015 David Marr: 1945-1980 10/05/15 Computational Models of Neural Systems 2 Marr

More information

Artificial Neural Networks Examination, June 2004

Artificial Neural Networks Examination, June 2004 Artificial Neural Networks Examination, June 2004 Instructions There are SIXTY questions (worth up to 60 marks). The exam mark (maximum 60) will be added to the mark obtained in the laborations (maximum

More information

Artificial Intelligence

Artificial Intelligence Artificial Intelligence Jeff Clune Assistant Professor Evolving Artificial Intelligence Laboratory Announcements Be making progress on your projects! Three Types of Learning Unsupervised Supervised Reinforcement

More information

CHEMICAL SENSES Smell (Olfaction) and Taste

CHEMICAL SENSES Smell (Olfaction) and Taste CHEMICAL SENSES Smell (Olfaction) and Taste Peter Århem Department of Neuroscience SMELL 1 Olfactory receptor neurons in olfactory epithelium. Size of olfactory region 2 Number of olfactory receptor cells

More information

Chemosensory System. Spring 2013 Royce Mohan, PhD Reading: Chapter 15, Neuroscience by Purves et al; FiGh edihon (Sinauer Publishers)

Chemosensory System. Spring 2013 Royce Mohan, PhD Reading: Chapter 15, Neuroscience by Purves et al; FiGh edihon (Sinauer Publishers) Chemosensory System Spring 2013 Royce Mohan, PhD mohan@uchc.edu Reading: Chapter 15, Neuroscience by Purves et al; FiGh edihon (Sinauer Publishers) Learning ObjecHves Anatomical and funchonal organizahon

More information

NEURAL RELATIVITY PRINCIPLE

NEURAL RELATIVITY PRINCIPLE NEURAL RELATIVITY PRINCIPLE Alex Koulakov, Cold Spring Harbor Laboratory 1 3 Odorants are represented by patterns of glomerular activation butyraldehyde hexanone Odorant-Evoked SpH Signals in the Mouse

More information

Optimal Adaptation Principles In Neural Systems

Optimal Adaptation Principles In Neural Systems University of Pennsylvania ScholarlyCommons Publicly Accessible Penn Dissertations 2017 Optimal Adaptation Principles In Neural Systems Kamesh Krishnamurthy University of Pennsylvania, kameshkk@gmail.com

More information

DEVS Simulation of Spiking Neural Networks

DEVS Simulation of Spiking Neural Networks DEVS Simulation of Spiking Neural Networks Rene Mayrhofer, Michael Affenzeller, Herbert Prähofer, Gerhard Höfer, Alexander Fried Institute of Systems Science Systems Theory and Information Technology Johannes

More information

Instituto Tecnológico y de Estudios Superiores de Occidente Departamento de Electrónica, Sistemas e Informática. Introductory Notes on Neural Networks

Instituto Tecnológico y de Estudios Superiores de Occidente Departamento de Electrónica, Sistemas e Informática. Introductory Notes on Neural Networks Introductory Notes on Neural Networs Dr. José Ernesto Rayas Sánche April Introductory Notes on Neural Networs Dr. José Ernesto Rayas Sánche BIOLOGICAL NEURAL NETWORKS The brain can be seen as a highly

More information

Synchronized Oscillatory Discharges of Mitral/Tufted Cells With Different Molecular Receptive Ranges in the Rabbit Olfactory Bulb

Synchronized Oscillatory Discharges of Mitral/Tufted Cells With Different Molecular Receptive Ranges in the Rabbit Olfactory Bulb Synchronized Oscillatory Discharges of Mitral/Tufted Cells With Different Molecular Receptive Ranges in the Rabbit Olfactory Bulb HIDEKI KASHIWADANI, 1,2 YASNORY F. SASAKI, 1 NAOSHIGE UCHIDA, 1 AND KENSAKU

More information

Chapter 37 Active Reading Guide Neurons, Synapses, and Signaling

Chapter 37 Active Reading Guide Neurons, Synapses, and Signaling Name: AP Biology Mr. Croft Section 1 1. What is a neuron? Chapter 37 Active Reading Guide Neurons, Synapses, and Signaling 2. Neurons can be placed into three groups, based on their location and function.

More information

CHAPTER 3. Pattern Association. Neural Networks

CHAPTER 3. Pattern Association. Neural Networks CHAPTER 3 Pattern Association Neural Networks Pattern Association learning is the process of forming associations between related patterns. The patterns we associate together may be of the same type or

More information

Nerve Signal Conduction. Resting Potential Action Potential Conduction of Action Potentials

Nerve Signal Conduction. Resting Potential Action Potential Conduction of Action Potentials Nerve Signal Conduction Resting Potential Action Potential Conduction of Action Potentials Resting Potential Resting neurons are always prepared to send a nerve signal. Neuron possesses potential energy

More information

Artificial Neural Networks

Artificial Neural Networks Artificial Neural Networks 鮑興國 Ph.D. National Taiwan University of Science and Technology Outline Perceptrons Gradient descent Multi-layer networks Backpropagation Hidden layer representations Examples

More information

Data Mining Part 5. Prediction

Data Mining Part 5. Prediction Data Mining Part 5. Prediction 5.5. Spring 2010 Instructor: Dr. Masoud Yaghini Outline How the Brain Works Artificial Neural Networks Simple Computing Elements Feed-Forward Networks Perceptrons (Single-layer,

More information

ARTIFICIAL INTELLIGENCE. Artificial Neural Networks

ARTIFICIAL INTELLIGENCE. Artificial Neural Networks INFOB2KI 2017-2018 Utrecht University The Netherlands ARTIFICIAL INTELLIGENCE Artificial Neural Networks Lecturer: Silja Renooij These slides are part of the INFOB2KI Course Notes available from www.cs.uu.nl/docs/vakken/b2ki/schema.html

More information

Nervous Tissue. Neurons Electrochemical Gradient Propagation & Transduction Neurotransmitters Temporal & Spatial Summation

Nervous Tissue. Neurons Electrochemical Gradient Propagation & Transduction Neurotransmitters Temporal & Spatial Summation Nervous Tissue Neurons Electrochemical Gradient Propagation & Transduction Neurotransmitters Temporal & Spatial Summation What is the function of nervous tissue? Maintain homeostasis & respond to stimuli

More information

Introduction to Neural Networks

Introduction to Neural Networks Introduction to Neural Networks What are (Artificial) Neural Networks? Models of the brain and nervous system Highly parallel Process information much more like the brain than a serial computer Learning

More information

A Three-dimensional Physiologically Realistic Model of the Retina

A Three-dimensional Physiologically Realistic Model of the Retina A Three-dimensional Physiologically Realistic Model of the Retina Michael Tadross, Cameron Whitehouse, Melissa Hornstein, Vicky Eng and Evangelia Micheli-Tzanakou Department of Biomedical Engineering 617

More information

Multilayer Perceptron

Multilayer Perceptron Outline Hong Chang Institute of Computing Technology, Chinese Academy of Sciences Machine Learning Methods (Fall 2012) Outline Outline I 1 Introduction 2 Single Perceptron 3 Boolean Function Learning 4

More information

Deep Learning. What Is Deep Learning? The Rise of Deep Learning. Long History (in Hind Sight)

Deep Learning. What Is Deep Learning? The Rise of Deep Learning. Long History (in Hind Sight) CSCE 636 Neural Networks Instructor: Yoonsuck Choe Deep Learning What Is Deep Learning? Learning higher level abstractions/representations from data. Motivation: how the brain represents sensory information

More information

Introduction Biologically Motivated Crude Model Backpropagation

Introduction Biologically Motivated Crude Model Backpropagation Introduction Biologically Motivated Crude Model Backpropagation 1 McCulloch-Pitts Neurons In 1943 Warren S. McCulloch, a neuroscientist, and Walter Pitts, a logician, published A logical calculus of the

More information

Neural Networks. Henrik I. Christensen. Computer Science and Engineering University of California, San Diego

Neural Networks. Henrik I. Christensen. Computer Science and Engineering University of California, San Diego Neural Networks Henrik I. Christensen Computer Science and Engineering University of California, San Diego http://www.hichristensen.net Henrik I. Christensen (UCSD) Neural Networks 1 / 39 Introduction

More information

Course 395: Machine Learning - Lectures

Course 395: Machine Learning - Lectures Course 395: Machine Learning - Lectures Lecture 1-2: Concept Learning (M. Pantic) Lecture 3-4: Decision Trees & CBC Intro (M. Pantic & S. Petridis) Lecture 5-6: Evaluating Hypotheses (S. Petridis) Lecture

More information

Vertebrate Physiology 437 EXAM I 26 September 2002 NAME

Vertebrate Physiology 437 EXAM I 26 September 2002 NAME 437 EXAM1.DOC Vertebrate Physiology 437 EXAM I 26 September 2002 NAME 0. When you gaze at the stars, why do you have to look slightly away from the really faint ones in order to be able to see them? (1

More information

+ + ( + ) = Linear recurrent networks. Simpler, much more amenable to analytic treatment E.g. by choosing

+ + ( + ) = Linear recurrent networks. Simpler, much more amenable to analytic treatment E.g. by choosing Linear recurrent networks Simpler, much more amenable to analytic treatment E.g. by choosing + ( + ) = Firing rates can be negative Approximates dynamics around fixed point Approximation often reasonable

More information

Neurons, Synapses, and Signaling

Neurons, Synapses, and Signaling LECTURE PRESENTATIONS For CAMPBELL BIOLOGY, NINTH EDITION Jane B. Reece, Lisa A. Urry, Michael L. Cain, Steven A. Wasserman, Peter V. Minorsky, Robert B. Jackson Chapter 48 Neurons, Synapses, and Signaling

More information

Pattern recognition for chemical sensor arrays with neuromorphic models of the olfactory system

Pattern recognition for chemical sensor arrays with neuromorphic models of the olfactory system Pattern recognition for chemical sensor arrays with neuromorphic models of the olfactory system Pattern Recognition and Intelligent Sensor Machines Lab Department of Computer Science Texas A&M University

More information

A Novel Activity Detection Method

A Novel Activity Detection Method A Novel Activity Detection Method Gismy George P.G. Student, Department of ECE, Ilahia College of,muvattupuzha, Kerala, India ABSTRACT: This paper presents an approach for activity state recognition of

More information

CSE 352 (AI) LECTURE NOTES Professor Anita Wasilewska. NEURAL NETWORKS Learning

CSE 352 (AI) LECTURE NOTES Professor Anita Wasilewska. NEURAL NETWORKS Learning CSE 352 (AI) LECTURE NOTES Professor Anita Wasilewska NEURAL NETWORKS Learning Neural Networks Classifier Short Presentation INPUT: classification data, i.e. it contains an classification (class) attribute.

More information

Introduction to Machine Learning

Introduction to Machine Learning Introduction to Machine Learning Neural Networks Varun Chandola x x 5 Input Outline Contents February 2, 207 Extending Perceptrons 2 Multi Layered Perceptrons 2 2. Generalizing to Multiple Labels.................

More information

to appear in Frank Eeckman, ed., Proceedings of the Computational Neurosciences Conference CNS*93, Kluwer Academic Press.

to appear in Frank Eeckman, ed., Proceedings of the Computational Neurosciences Conference CNS*93, Kluwer Academic Press. to appear in Frank Eeckman, ed., Proceedings of the Computational Neurosciences Conference CNS*93, Kluwer Academic Press. The Sinusoidal Array: A Theory of Representation for Spatial Vectors A. David Redish,

More information

MOST current approaches for processing data from chemical

MOST current approaches for processing data from chemical IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 17, NO. 4, JULY 2006 1015 Processing of Chemical Sensor Arrays With a Biologically Inspired Model of Olfactory Coding Baranidharan Raman, Member, IEEE, Ping A.

More information

Plan. Perceptron Linear discriminant. Associative memories Hopfield networks Chaotic networks. Multilayer perceptron Backpropagation

Plan. Perceptron Linear discriminant. Associative memories Hopfield networks Chaotic networks. Multilayer perceptron Backpropagation Neural Networks Plan Perceptron Linear discriminant Associative memories Hopfield networks Chaotic networks Multilayer perceptron Backpropagation Perceptron Historically, the first neural net Inspired

More information

Chapter 48 Neurons, Synapses, and Signaling

Chapter 48 Neurons, Synapses, and Signaling Chapter 48 Neurons, Synapses, and Signaling Concept 48.1 Neuron organization and structure reflect function in information transfer Neurons are nerve cells that transfer information within the body Neurons

More information

Lecture 7 Artificial neural networks: Supervised learning

Lecture 7 Artificial neural networks: Supervised learning Lecture 7 Artificial neural networks: Supervised learning Introduction, or how the brain works The neuron as a simple computing element The perceptron Multilayer neural networks Accelerated learning in

More information

Chapter 9. Nerve Signals and Homeostasis

Chapter 9. Nerve Signals and Homeostasis Chapter 9 Nerve Signals and Homeostasis A neuron is a specialized nerve cell that is the functional unit of the nervous system. Neural signaling communication by neurons is the process by which an animal

More information

Dynamic Modeling of Brain Activity

Dynamic Modeling of Brain Activity 0a Dynamic Modeling of Brain Activity EIN IIN PC Thomas R. Knösche, Leipzig Generative Models for M/EEG 4a Generative Models for M/EEG states x (e.g. dipole strengths) parameters parameters (source positions,

More information

Linear discriminant functions

Linear discriminant functions Andrea Passerini passerini@disi.unitn.it Machine Learning Discriminative learning Discriminative vs generative Generative learning assumes knowledge of the distribution governing the data Discriminative

More information

Lecture 4: Feed Forward Neural Networks

Lecture 4: Feed Forward Neural Networks Lecture 4: Feed Forward Neural Networks Dr. Roman V Belavkin Middlesex University BIS4435 Biological neurons and the brain A Model of A Single Neuron Neurons as data-driven models Neural Networks Training

More information

Self-organized Criticality and Synchronization in a Pulse-coupled Integrate-and-Fire Neuron Model Based on Small World Networks

Self-organized Criticality and Synchronization in a Pulse-coupled Integrate-and-Fire Neuron Model Based on Small World Networks Commun. Theor. Phys. (Beijing, China) 43 (2005) pp. 466 470 c International Academic Publishers Vol. 43, No. 3, March 15, 2005 Self-organized Criticality and Synchronization in a Pulse-coupled Integrate-and-Fire

More information

High-dimensional geometry of cortical population activity. Marius Pachitariu University College London

High-dimensional geometry of cortical population activity. Marius Pachitariu University College London High-dimensional geometry of cortical population activity Marius Pachitariu University College London Part I: introduction to the brave new world of large-scale neuroscience Part II: large-scale data preprocessing

More information

Neuron. Detector Model. Understanding Neural Components in Detector Model. Detector vs. Computer. Detector. Neuron. output. axon

Neuron. Detector Model. Understanding Neural Components in Detector Model. Detector vs. Computer. Detector. Neuron. output. axon Neuron Detector Model 1 The detector model. 2 Biological properties of the neuron. 3 The computational unit. Each neuron is detecting some set of conditions (e.g., smoke detector). Representation is what

More information

Nervous Tissue. Neurons Neural communication Nervous Systems

Nervous Tissue. Neurons Neural communication Nervous Systems Nervous Tissue Neurons Neural communication Nervous Systems What is the function of nervous tissue? Maintain homeostasis & respond to stimuli Sense & transmit information rapidly, to specific cells and

More information

Artificial Neural Networks Examination, March 2004

Artificial Neural Networks Examination, March 2004 Artificial Neural Networks Examination, March 2004 Instructions There are SIXTY questions (worth up to 60 marks). The exam mark (maximum 60) will be added to the mark obtained in the laborations (maximum

More information

Artificial Neural Networks. Historical description

Artificial Neural Networks. Historical description Artificial Neural Networks Historical description Victor G. Lopez 1 / 23 Artificial Neural Networks (ANN) An artificial neural network is a computational model that attempts to emulate the functions of

More information

Artificial Neural Networks

Artificial Neural Networks Artificial Neural Networks CPSC 533 Winter 2 Christian Jacob Neural Networks in the Context of AI Systems Neural Networks as Mediators between Symbolic AI and Statistical Methods 2 5.-NeuralNets-2.nb Neural

More information

#33 - Genomics 11/09/07

#33 - Genomics 11/09/07 BCB 444/544 Required Reading (before lecture) Lecture 33 Mon Nov 5 - Lecture 31 Phylogenetics Parsimony and ML Chp 11 - pp 142 169 Genomics Wed Nov 7 - Lecture 32 Machine Learning Fri Nov 9 - Lecture 33

More information

PV021: Neural networks. Tomáš Brázdil

PV021: Neural networks. Tomáš Brázdil 1 PV021: Neural networks Tomáš Brázdil 2 Course organization Course materials: Main: The lecture Neural Networks and Deep Learning by Michael Nielsen http://neuralnetworksanddeeplearning.com/ (Extremely

More information

η 5 ξ [k+2] η 4 η 3 η 1 ξ [k]

η 5 ξ [k+2] η 4 η 3 η 1 ξ [k] On-line and Hierarchical Design Methods of Dynamics Based Information Processing System Masafumi OKADA, Daisuke NAKAMURA and Yoshihiko NAKAMURA Dept. of Mechano-Informatics, University of okyo 7-- Hongo

More information

Chapter 1 Introduction

Chapter 1 Introduction Chapter 1 Introduction 1.1 Introduction to Chapter This chapter starts by describing the problems addressed by the project. The aims and objectives of the research are outlined and novel ideas discovered

More information

Iterative face image feature extraction with Generalized Hebbian Algorithm and a Sanger-like BCM rule

Iterative face image feature extraction with Generalized Hebbian Algorithm and a Sanger-like BCM rule Iterative face image feature extraction with Generalized Hebbian Algorithm and a Sanger-like BCM rule Clayton Aldern (Clayton_Aldern@brown.edu) Tyler Benster (Tyler_Benster@brown.edu) Carl Olsson (Carl_Olsson@brown.edu)

More information

Fundamentals of Computational Neuroscience 2e

Fundamentals of Computational Neuroscience 2e Fundamentals of Computational Neuroscience 2e January 1, 2010 Chapter 10: The cognitive brain Hierarchical maps and attentive vision A. Ventral visual pathway B. Layered cortical maps Receptive field size

More information

Artificial Neural Networks D B M G. Data Base and Data Mining Group of Politecnico di Torino. Elena Baralis. Politecnico di Torino

Artificial Neural Networks D B M G. Data Base and Data Mining Group of Politecnico di Torino. Elena Baralis. Politecnico di Torino Artificial Neural Networks Data Base and Data Mining Group of Politecnico di Torino Elena Baralis Politecnico di Torino Artificial Neural Networks Inspired to the structure of the human brain Neurons as

More information

Clicker Question. Discussion Question

Clicker Question. Discussion Question Connectomics Networks and Graph Theory A network is a set of nodes connected by edges The field of graph theory has developed a number of measures to analyze networks Mean shortest path length: the average

More information

Deep Learning. What Is Deep Learning? The Rise of Deep Learning. Long History (in Hind Sight)

Deep Learning. What Is Deep Learning? The Rise of Deep Learning. Long History (in Hind Sight) CSCE 636 Neural Networks Instructor: Yoonsuck Choe Deep Learning What Is Deep Learning? Learning higher level abstractions/representations from data. Motivation: how the brain represents sensory information

More information

BIOLOGY. 1. Overview of Neurons 11/3/2014. Neurons, Synapses, and Signaling. Communication in Neurons

BIOLOGY. 1. Overview of Neurons 11/3/2014. Neurons, Synapses, and Signaling. Communication in Neurons CAMPBELL BIOLOGY TENTH EDITION 48 Reece Urry Cain Wasserman Minorsky Jackson Neurons, Synapses, and Signaling Lecture Presentation by Nicole Tunbridge and Kathleen Fitzpatrick 1. Overview of Neurons Communication

More information

22c145-Fall 01: Neural Networks. Neural Networks. Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1

22c145-Fall 01: Neural Networks. Neural Networks. Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1 Neural Networks Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1 Brains as Computational Devices Brains advantages with respect to digital computers: Massively parallel Fault-tolerant Reliable

More information

Artificial neural networks

Artificial neural networks Artificial neural networks Chapter 8, Section 7 Artificial Intelligence, spring 203, Peter Ljunglöf; based on AIMA Slides c Stuart Russel and Peter Norvig, 2004 Chapter 8, Section 7 Outline Brains Neural

More information

Introduction to Artificial Neural Networks

Introduction to Artificial Neural Networks Facultés Universitaires Notre-Dame de la Paix 27 March 2007 Outline 1 Introduction 2 Fundamentals Biological neuron Artificial neuron Artificial Neural Network Outline 3 Single-layer ANN Perceptron Adaline

More information

Machine Learning. Neural Networks

Machine Learning. Neural Networks Machine Learning Neural Networks Bryan Pardo, Northwestern University, Machine Learning EECS 349 Fall 2007 Biological Analogy Bryan Pardo, Northwestern University, Machine Learning EECS 349 Fall 2007 THE

More information

CS:4420 Artificial Intelligence

CS:4420 Artificial Intelligence CS:4420 Artificial Intelligence Spring 2018 Neural Networks Cesare Tinelli The University of Iowa Copyright 2004 18, Cesare Tinelli and Stuart Russell a a These notes were originally developed by Stuart

More information

Revision: Neural Network

Revision: Neural Network Revision: Neural Network Exercise 1 Tell whether each of the following statements is true or false by checking the appropriate box. Statement True False a) A perceptron is guaranteed to perfectly learn

More information

Introduction Neural Networks - Architecture Network Training Small Example - ZIP Codes Summary. Neural Networks - I. Henrik I Christensen

Introduction Neural Networks - Architecture Network Training Small Example - ZIP Codes Summary. Neural Networks - I. Henrik I Christensen Neural Networks - I Henrik I Christensen Robotics & Intelligent Machines @ GT Georgia Institute of Technology, Atlanta, GA 30332-0280 hic@cc.gatech.edu Henrik I Christensen (RIM@GT) Neural Networks 1 /

More information

Cellular Neurobiology BIPN140

Cellular Neurobiology BIPN140 Cellular Neurobiology BIPN140 2nd Midterm Exam Will be Ready for Pickup Next Monday By the elevator on the 3 rd Floor of Pacific Hall (waiver) Exam Depot Window at the north entrance to Pacific Hall (no

More information