Artificial Neural Networks: Introduction and Perceptron Learning

CPSC 565, Winter 2003
Christian Jacob, Department of Computer Science, University of Calgary, Canada
Emergent Computing: The Brain Paradigm

Example: Visual Cortex of a Cat [images]

ANN Image Processing Example [image]
Brains vs. Digital Computers

- Computers require hundreds of cycles to simulate the firing of a single neuron. (What does the "firing" pattern of a neuron look like?)
- Computers are good at symbol processing. Are "life" and "mind" reducible to "symbol processing"?
- Brains perform extremely well at highly parallel pattern recognition tasks:
  - face recognition,
  - language processing,
  - language understanding (!),
  - creativity, inventing, use of tools, ...
  - self-reflection, self-awareness, ...

Computers versus Human Brains

The human brain is:
- grown by cell differentiation and iterated cell division (instead of constructed from pre-fabricated building blocks),
- built from rather simple "processing elements",
- highly interconnected (and the connectivity is adaptive!),
- adaptive and hierarchical in its architecture,
- highly parallel and distributed in its information processing,
- redundant in its information storage and processing,
- both pre-programmed (to some degree) and "programmable" in its functionality:
- its "algorithms" are designed through learning, not programming.
Computers versus Human Brains: Hardware/Software and Processing

                         Computer                          Human Brain
Computational units      1 CPU, 10^5 gates                 ~10^11 neurons
Storage units            10^9 bits RAM, 10^10 bits disk    10^11 neurons, 10^14 synapses
Cycle time               10^-9 sec                         10^-3 sec
Neuron updates per sec   10^5                              10^14

Networks of Neurons: Dendrites, Synapses, Cell Body, Axon

Dendrites, synapses, cell body, and axon are the four elements that are usually adopted from the biological model in order to build artificial neural networks. Artificial neurons for computing have
- input channels,
- a cell body, and
- an output channel.
Synapses are simulated by contact points between the cell body and the input or output connections; a weight is associated with each of these points.
Figure 1. A typical motor neuron

Transmission of Information

A fundamental problem of any information processing system is how information is transmitted through the system. Neurons transmit information using electrical signals. However, in biological structures this cannot be done by simple electronic transport as in metallic cables. Evolution arrived at another solution, involving ions and semi-permeable membranes.

Charged Cells

Our body consists mainly of water, 55% of which is contained within the cells and 45% of which forms their environment. The cells preserve their identity and biological components by enclosing the protoplasm in a membrane. Membranes are made of a double layer of molecules that form a diffusion barrier. Some salts present in our body dissolve in the intra- and extracellular fluid and dissociate into negative and positive ions. The ions that play an important role for neurons and their information processing are
- sodium ions (Na+),
- chloride ions (Cl-),
- potassium ions (K+), and
- calcium ions (Ca2+).
The membranes of the cells exhibit different degrees of permeability for each of these ions.
The permeability is determined by the number and size of pores in the membrane, the so-called ionic channels. The specific permeability of the membrane leads to different distributions of ions in the interior and the exterior of the cells.

Action Potential

In particular, differences in membrane permeability leave the interior of neurons negatively charged with respect to the extracellular fluid. An action potential is produced by an initial depolarization of the cell membrane.

Figure 2. Typical form of the action potential

The potential increases from -70 mV to +40 mV. After some time, the potential becomes negative again, but it overshoots below the resting level. Gradually, the cell recovers and the cell membrane returns to the initial potential.
Transmission of an Action Potential

Figure 3. Transmission of an action potential

Information Processing at the Synapses

Neurons transmit information using action potentials. The processing of this information at the interfaces between neurons, the synapses, involves a combination of electrical and chemical processes.

Directed Transmission of Information

Synapses determine a direction for the transmission of information.
Signals flow from one cell to another in a well-defined manner.

Figure 4. Chemical signaling at the synapse

When an electric impulse arrives at the synapse, the synaptic vesicles fuse with the cell membrane. The transmitters flow into the synaptic gap, and some attach to the ionic channels. This opens the ionic channels so that more ions can flow from the exterior to the interior of the cell, altering the cell's potential. If the potential in the interior of the cell is increased, this helps prepare an action potential; the synapse then causes an excitation of the cell.

Storage of Information and Learning

NMDA receptors (NMDA = N-methyl-D-aspartate) help explain some forms of learning in neurons (among many others). NMDA receptors are ionic channels permeable to different kinds of ions (sodium, calcium, or potassium).

Figure 5. Unblocking of an NMDA receptor

These channels are blocked by a magnesium ion, so that the permeability for sodium and potassium is low.
Once the cell has reached a certain excitation level, the ionic channels release the magnesium ions and become unblocked. The permeability for Ca2+ ions increases immediately, which starts a chain of reactions resulting in a durable change of the threshold level of the cell.

Artificial Neural Networks: Introductory Concepts

Definition of an ANN

- A neural network is a system composed of (usually a large number of) simple processing elements (neurons).
- Ideally, the processing elements operate asynchronously and in parallel.
- ANNs can be used to acquire (through training/learning), store, and utilize experiential knowledge.

Mathematically, a neural network is a "mapping machine" capable of modeling a function

$F : \mathbb{R}^n \to \mathbb{R}^m$.

That is, a network maps an n-dimensional real input vector $(x_1, x_2, \ldots, x_n)$ to an m-dimensional real output vector $(y_1, y_2, \ldots, y_m)$.

ANN Architectures

Feed-forward networks:
- Neurons are arranged in layers.
- Links only run in one direction, namely from the input toward the output layer.
- Usually, a unit is linked only to units in the following layer(s).
- Units within the same layer are not linked.
- Signal (and error) propagation as well as weight updating can proceed uniformly from the input to the output layer. (A minimal forward-pass sketch follows below.)
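To make the mapping view concrete, here is a minimal sketch, not from the original slides, of a single-hidden-layer feed-forward network computing $F : \mathbb{R}^n \to \mathbb{R}^m$. The layer sizes, the random weights, and the choice of a logistic activation are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    # Logistic activation, applied element-wise.
    return 1.0 / (1.0 + np.exp(-x))

def feed_forward(x, W1, b1, W2, b2):
    """Map an n-dimensional input to an m-dimensional output
    through one hidden layer: F(x) = g(W2 g(W1 x + b1) + b2)."""
    hidden = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ hidden + b2)

# Example: n = 3 inputs, 4 hidden units, m = 2 outputs,
# with randomly initialized weights.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)
print(feed_forward(np.array([0.5, -1.0, 2.0]), W1, b1, W2, b2))
```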
Figure 6. Example of a feed-forward network with a single hidden layer

Recurrent networks:
- Links can connect any pair of neurons and can form arbitrary topologies.
- They can implement more complex neural architectures.
- Internal states with memory can be modelled.
- A stable internal state and output might never be reached.

Figure 7. McCulloch-Pitts network for a binary scaler; it translates a given binary input sequence into a corresponding output sequence.
A Generic Neuron Model

Generic Model of a Neuron Processing Unit

[Figures: a typical model of a neural processing unit; a more detailed model of a neural processing unit]

Input function: $in_i = \sum_j w_{ji} a_j$

Activation function: $g(in_i) = g\bigl(\sum_j w_{ji} a_j\bigr)$
Output function: $a_i = out(g(in_i)) = out\bigl(g\bigl(\sum_j w_{ji} a_j\bigr)\bigr)$

Activation Functions

(1) Step function:

$\mathrm{step}(x, t) = \begin{cases} 1 & \text{if } x \ge t \\ 0 & \text{if } x < t \end{cases}$

(2) Sign function:

$\mathrm{sign}(x) = \begin{cases} 1 & \text{if } x \ge 0 \\ -1 & \text{if } x < 0 \end{cases}$
(3) Sigmoid function:

$\mathrm{sigmoid}(x, a) = \dfrac{1}{1 + e^{-a x}}$

The parameter $a$ determines the slope of the sigmoid function (illustrated in the slides for the ranges $0.1 \le a \le 1$ and $1 \le a \le 10$). This allows the sigmoid function to approximate both the step and the sign function. A small code sketch of these three functions follows.
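As an illustration (not from the slides), the three activation functions translate directly into Python; the final loop mirrors the slope discussion by showing that a steep sigmoid approaches the step function.

```python
import math

def step(x, t):
    # 1 if x reaches the threshold t, else 0.
    return 1 if x >= t else 0

def sign(x):
    # 1 for non-negative x, -1 otherwise.
    return 1 if x >= 0 else -1

def sigmoid(x, a=1.0):
    # Logistic function; the slope parameter a controls steepness.
    return 1.0 / (1.0 + math.exp(-a * x))

# With a large slope, the sigmoid approaches the step function at t = 0:
for x in (-1.0, -0.1, 0.1, 1.0):
    print(x, step(x, 0), round(sigmoid(x, a=50.0), 4))
```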
Neurons in Action: Logic Gates

Neurons as Logic Gates

Individual units, representing Boolean functions, can act as logic gates, given appropriate thresholds and weights. The activation function is $\mathrm{step}(x, t)$.

(1) Which logic function? Two inputs, weights $w = 1$ each, threshold $t = 1.5$.

(2) Which logic function? Two inputs, weights $w = 1$ each, threshold $t = 0.5$.
(3) Which logic function? One input, weight $w = -1$, threshold $t = -0.5$.

Specific Neuron Models

McCulloch-Pitts Units

McCulloch-Pitts processing units are the simplest neuron models; they produce and transmit only binary information.

Figure 8. McCulloch-Pitts unit

The rule for evaluating the input of a McCulloch-Pitts unit (MP unit) is as follows:

- The MP unit receives two sorts of input:
  - input $x_1, x_2, \ldots, x_n$ through $n$ excitatory edges,
  - input $y_1, y_2, \ldots, y_m$ through $m$ inhibitory edges.
- If $m \ge 1$ and at least one of the signals $y_1, y_2, \ldots, y_m$ is 1, the unit is inhibited and the output is 0.
- Otherwise, the total excitation $x = \sum_{i=1}^{n} x_i = x_1 + x_2 + \cdots + x_n$ is computed and compared to the threshold $\theta$:

$\text{output} = \begin{cases} 1 & \text{if } x \ge \theta \\ 0 & \text{if } x < \theta \end{cases}$
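The rule above translates directly into code. This sketch (an illustration, not from the slides) implements an MP unit with unweighted excitatory edges and absolute inhibition, and uses it to realize AND ($\theta = 2$), OR ($\theta = 1$), and NOT (a single inhibitory edge with $\theta = 0$).

```python
def mp_unit(excitatory, inhibitory, theta):
    """McCulloch-Pitts unit: any active inhibitory input forces 0;
    otherwise fire iff the total excitation reaches the threshold."""
    if any(inhibitory):
        return 0
    return 1 if sum(excitatory) >= theta else 0

# AND and OR as MP units over two excitatory inputs:
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2,
              mp_unit([x1, x2], [], theta=2),   # AND
              mp_unit([x1, x2], [], theta=1))   # OR

# NOT via a single inhibitory edge and threshold 0:
print([mp_unit([], [x], theta=0) for x in (0, 1)])  # prints [1, 0]
```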
Conjunction and Disjunction

Figure 9. Generalized AND and OR gates as McCulloch-Pitts units

Negation and More Logical Functions

Figure 10. Logical functions and their realizations as McCulloch-Pitts neurons

What Do MP Units Compute?

For visualization purposes, we consider the function space of logical functions of three variables.

Figure 11. Function values of a logical function of three variables $(x_1, x_2, x_3)$
McCulloch-Pitts units divide the input space into two half-spaces. For a given input $(x_1, x_2, x_3)$ and a threshold $\theta$, the condition $x_1 + x_2 + x_3 \ge \theta$ is tested, which is true for all points on one side of the plane defined by $x_1 + x_2 + x_3 = \theta$ and false for all points on the other side.

Figure 12. Separation of the input space for the OR function

The majority function (with threshold $\theta = 2$) of three variables divides the input space in a similar manner, but the separating plane is given by the equation $x_1 + x_2 + x_3 = 2$.

Figure 13. Separating planes of the OR and majority functions

The separating planes are always parallel in the case of McCulloch-Pitts units.
The Perceptron

Today, the perceptron is one of the classic models of neural network processing elements and architectures. Its use in practical applications is limited; however, due to its simplicity (in both its structure and its learning algorithm), it provides a good model for studying the basics and problems of connectionist information processing.

The Classical Perceptron

The perceptron was probably the first computational device inspired by neural networks. It was developed in 1958 by the American psychologist Frank Rosenblatt, who used it for image processing and image classification tasks.

Figure 14. The classical perceptron architecture as proposed by Frank Rosenblatt

The Minsky-and-Papert Perceptron

Minsky and Papert distilled the essential features from Rosenblatt's model in order to study the computational capabilities of the perceptron under different assumptions. A retina is directly connected to logic elements called predicates, which can compute a single bit according to their input. These predicates can be as computationally complex as we like; for example, each predicate could perform a filter function on the pixel image.
Figure 15. Predicates and weights of a perceptron

Each predicate, however, is limited in its diameter, i.e., in the number of input pixels it sees: no predicate sees the whole retina. A threshold unit, which receives weighted inputs from the predicates, computes the final output of the perceptron.

Limitations*

A Perceptron Cell

[Diagram: a perceptron cell with inputs $x_1, \ldots, x_n$, weights $w_1, \ldots, w_n$, output $y$, and an extra input $x_{n+1} = 1$ weighted by $w_{n+1} = -\theta$]

Perceptron with a Bias

In many cases it is more convenient to deal only with perceptrons of threshold zero. This corresponds to linear separations that pass through the origin of the input space.
Any perceptron with threshold $\theta$ can be converted into an equivalent perceptron with threshold zero, which has an additional input, called the bias, weighted by $-\theta$.

Figure 17. A perceptron with a bias

Most learning algorithms can be stated more concisely by transforming thresholds into biases. The input and weight vectors must be extended:

- extended input vector: $(x_1, x_2, \ldots, x_n, 1)$
- extended weight vector: $(w_1, w_2, \ldots, w_n, w_{n+1})$ with $w_{n+1} = -\theta$.

From Inputs to Output

The perceptron calculates its output value as follows:

$y = \begin{cases} 1 & \text{if } \sum_{i=1}^{n+1} w_i x_i \ge 0 \\ 0 & \text{if } \sum_{i=1}^{n+1} w_i x_i < 0 \end{cases}$
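A minimal sketch (an illustration, not taken from the slides) of the extended-vector computation: the threshold is folded into the weight vector as a bias, and the output is the step of the dot product. The AND example weights are assumptions consistent with the logic-gate slide above.

```python
def perceptron_output(weights, inputs):
    """Perceptron with a bias: `weights` is the extended weight vector
    (w_1, ..., w_n, -theta) and `inputs` the extended input vector
    (x_1, ..., x_n, 1)."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= 0 else 0

# AND with weights (1, 1) and threshold 1.5, i.e. bias weight -1.5:
w = [1.0, 1.0, -1.5]
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, perceptron_output(w, [x1, x2, 1]))
```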
What Do Perceptrons Compute?

Geometric Interpretation

A simple perceptron is a computing unit with threshold $\theta$ which, receiving the $n$ real inputs $x_1, x_2, \ldots, x_n$ through edges with the associated weights $w_1, w_2, \ldots, w_n$, computes its output as follows:

$\text{output} = \begin{cases} 1 & \text{if } \sum_{i=1}^{n} w_i x_i \ge \theta \\ 0 & \text{otherwise.} \end{cases}$

The following figure shows this separation of the input space for weights $(w_1, w_2) = (0.9, 0.2)$.

Figure 18. Separation of input space with a perceptron testing the condition $0.9 x_1 + 0.2 x_2 \ge 1$

Linearly Separable Functions

A perceptron network is capable of computing any logical function. But if we reduce the network to a single perceptron, which functions are still computable?

The 16 Boolean functions of two variables (column $f_k$ lists the bits of $k$, indexed by $2 x_1 + x_2$):

x1 x2 | f0 f1 f2 f3 f4 f5 f6 f7 f8 f9 f10 f11 f12 f13 f14 f15
 0  0 |  0  1  0  1  0  1  0  1  0  1   0   1   0   1   0   1
 0  1 |  0  0  1  1  0  0  1  1  0  0   1   1   0   0   1   1
 1  0 |  0  0  0  0  1  1  1  1  0  0   0   0   1   1   1   1
 1  1 |  0  0  0  0  0  0  0  0  1  1   1   1   1   1   1   1
Perceptron-computable functions are those for which the points whose function value is 0 can be separated from the points whose function value is 1 by a single line.

Figure 19. Linear separations of input space corresponding to OR and AND

Two sets of points $A$ and $B$ in an $n$-dimensional space are called linearly separable if $n + 1$ real numbers $w_1, \ldots, w_{n+1}$ exist such that every point $(x_1, x_2, \ldots, x_n) \in A$ satisfies $\sum_{i=1}^{n} w_i x_i \ge w_{n+1}$ and every point $(x_1, x_2, \ldots, x_n) \in B$ satisfies $\sum_{i=1}^{n} w_i x_i < w_{n+1}$.

Duality of Input Space and Weight Space

Figure 20. Duality of input and weight space

The computation performed by a perceptron can be visualized as a linear separation of input space. When trying to find the appropriate weights for a perceptron, however, the search process is better visualized in weight space.
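Returning to the table of 16 Boolean functions above: a brute-force search over a small grid of candidate weights and thresholds (a sketch, not from the slides; the grid values and the $f_k$ numbering from the table are assumptions) confirms that exactly 14 of the 16 functions are linearly separable. XOR ($f_6$) and its negation ($f_9$) are the exceptions.

```python
from itertools import product

candidates = [-1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5]
inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]

def separable(truth):
    # Search for weights w1, w2 and threshold t realizing the function
    # via: output = 1 iff w1*x1 + w2*x2 >= t.
    for w1, w2, t in product(candidates, repeat=3):
        if all((1 if w1 * x1 + w2 * x2 >= t else 0) == y
               for (x1, x2), y in zip(inputs, truth)):
            return True
    return False

for k in range(16):
    # f_k assigns bit i of k to the input with index i = 2*x1 + x2.
    truth = [(k >> (2 * x1 + x2)) & 1 for (x1, x2) in inputs]
    print(f"f{k}: truth {truth}, separable: {separable(truth)}")
```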
The Error Function in Weight Space

Assume that a set $A$ of input vectors in $n$-dimensional space must be separated from a set $B$ of input vectors, such that the perceptron computes the binary function $f_w$ with

$f_w(x) = \begin{cases} 1 & \text{if } x \in A \\ 0 & \text{if } x \in B. \end{cases}$

The function $f_w$ depends on the set $w = (w_1, \ldots, w_{n+1})$ of weights (including the threshold). The error function value is the number of false classifications for a particular weight vector $w$:

$E(w) = \sum_{x \in A} (1 - f_w(x)) + \sum_{x \in B} f_w(x).$

Since $E(w)$ is positive or zero, we want to reach the global minimum, where $E(w) = 0$. Consequently, the aim of perceptron learning is to find a weight vector for which $E(w) = 0$. The optimization problem which the learning algorithm has to solve can be understood as a descent on the error surface.

Figure 21. Error function for the AND function (for a perceptron with two inputs $(x_1, x_2)$ and constant threshold $\theta = 1$)

Here is an example of such a path through an iteration of weight settings $w_0, w_1, w_2, w^*$.
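A small sketch (illustrative, matching the definition above) that counts misclassifications for the AND separation at a fixed threshold $\theta = 1$, as in Figure 21; the sample weight vectors are arbitrary choices, not the slide's actual path.

```python
P = [(1, 1)]                   # set A: desired function value 1
N = [(0, 0), (1, 0), (0, 1)]   # set B: desired function value 0

def f(w, x, theta=1.0):
    # Perceptron with fixed threshold theta = 1, as in Figure 21.
    return 1 if w[0] * x[0] + w[1] * x[1] >= theta else 0

def E(w):
    # Number of misclassified points for weight vector w.
    return sum(1 - f(w, x) for x in P) + sum(f(w, x) for x in N)

for w in [(-0.5, 0.5), (1.0, 0.5), (0.6, 0.6)]:
    print(w, E(w))   # the last vector reaches the global minimum E(w) = 0
```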
Figure 22. Iteration steps to the region of minimal error

The Perceptron Learning Algorithm

Optimization Problem: Definition

The optimization problem which the learning algorithm has to solve can be understood as a descent on the error surface. But we can also look at the problem as a search for an inner point of the solution region (a polytope, in the case of the perceptron).

For example, consider the separation corresponding to the AND function:

$P = \{(1, 1)\}$
$N = \{(0, 0), (1, 0), (0, 1)\}$

Here $P$ and $N$ are the two sets of points to be separated: the set $P$ must be classified in the positive and the set $N$ in the negative half-space.

Optimization Problem: Analytical Solution

Three weights $w_1$, $w_2$, and $w_3 = -\theta$ are needed to implement the desired separation with a generic perceptron. With the extended input vector ($x_3 = 1$), the following four inequalities have to be fulfilled for the AND function:

$(0, 0, 1) \cdot (w_1, w_2, w_3) < 0$
$(1, 0, 1) \cdot (w_1, w_2, w_3) < 0$
$(0, 1, 1) \cdot (w_1, w_2, w_3) < 0$
$(1, 1, 1) \cdot (w_1, w_2, w_3) > 0$

These inequalities can be written in a simpler matrix form. Multiplying the first three rows by $-1$, so that all inequalities point the same way, we obtain

$M \cdot w > 0, \quad M = \begin{pmatrix} 0 & 0 & -1 \\ -1 & 0 & -1 \\ 0 & -1 & -1 \\ 1 & 1 & 1 \end{pmatrix},$

where $M$ is the $4 \times 3$ matrix above and $w$ the weight vector (written as a column vector). This inequality describes all points in the interior of a convex polytope. The sides of the polytope are delimited by the planes defined by each of the four inequalities above. Any point in the interior of the polytope represents a solution of the learning problem.

Figure 23. Solution polytope for the AND function in weight space
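As a quick numeric check (an illustration, not from the slides), any candidate weight vector can be tested for membership in the solution polytope by evaluating $M w > 0$ componentwise; the sample vectors below are arbitrary.

```python
import numpy as np

# Rows: the four AND inequalities, rewritten so each reads (row . w) > 0.
M = np.array([[ 0,  0, -1],
              [-1,  0, -1],
              [ 0, -1, -1],
              [ 1,  1,  1]])

def in_solution_polytope(w):
    # w = (w1, w2, w3) with w3 = -theta.
    return bool(np.all(M @ np.asarray(w) > 0))

print(in_solution_polytope((1.0, 1.0, -1.5)))   # True: AND with theta = 1.5
print(in_solution_polytope((1.0, 1.0, -0.5)))   # False: this realizes OR
```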
Optimization Problem: Learning Algorithm

The following procedure describes the learning algorithm for a single perceptron cell. Given are two sets of points $P$ and $N$, which the perceptron should learn to classify.

Start:
  Generate an initial vector of weights $w_0$. Set $t = 0$, $w = w_0$.

Testing:
  Select an $x \in P \cup N$.
  If $x \in P$ and $w_t \cdot x > 0$: go to Test for End.
  If $x \in P$ and $w_t \cdot x \le 0$: go to Addition.
  If $x \in N$ and $w_t \cdot x < 0$: go to Test for End.
  If $x \in N$ and $w_t \cdot x \ge 0$: go to Subtraction.

Addition:
  $w_{t+1} = w_t + x$; $t = t + 1$; go to Testing.

Subtraction:
  $w_{t+1} = w_t - x$; $t = t + 1$; go to Testing.

Test for End:
  Are all $x \in P \cup N$ correctly classified?
  Yes: END.
  No: go to Testing.

Note: The perceptron learning procedure only works if the point sets are linearly separable.
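A runnable sketch of this procedure (an illustration: the random choice among misclassified points is a slight simplification of the select-then-branch flow above, and the AND training set with extended inputs is an assumption):

```python
import random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def perceptron_learning(P, N, w0, max_steps=1000, seed=0):
    """Perceptron learning over extended input vectors: add a
    misclassified point of P, subtract a misclassified point of N."""
    rng = random.Random(seed)
    w = list(w0)
    points = [(x, True) for x in P] + [(x, False) for x in N]
    for _ in range(max_steps):
        wrong = [(x, pos) for x, pos in points
                 if not (dot(w, x) > 0 if pos else dot(w, x) < 0)]
        if not wrong:               # Test for End: everything classified
            return w
        x, pos = rng.choice(wrong)  # Testing: pick a misclassified point
        if pos:
            w = [wi + xi for wi, xi in zip(w, x)]   # Addition
        else:
            w = [wi - xi for wi, xi in zip(w, x)]   # Subtraction
    return None  # no convergence: P and N may not be linearly separable

# AND, with extended inputs (x1, x2, 1):
P = [(1, 1, 1)]
N = [(0, 0, 1), (1, 0, 1), (0, 1, 1)]
print(perceptron_learning(P, N, w0=(0.0, 0.0, 0.0)))
```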
Example

The following example illustrates the convergence behavior of the perceptron learning algorithm.

Figure 24. Initial configuration

Figure 25. After correction with $x_1$
Figure 26. After correction with $x_3$

Figure 27. After correction with $x_1$

Adaptive "Programming" of ANNs through Learning

ANN Learning

A learning algorithm is an adaptive method by which a network of computing units self-organizes to implement a desired behavior.

Figure 28. Learning process in a parametric system: testing input/output examples, calculating network errors, and changing network parameters

In some learning algorithms, examples of the desired input-output mapping are presented to the network. A correction step is executed iteratively until the network learns to produce the desired response.
Learning Schemes

Supervised Learning

Input vectors are collected and presented to the network. The output computed by the network is observed, and the deviation from the expected answer is measured. The weights are corrected (this correction rule is the learning algorithm) according to the magnitude of the error.

- Error-correction learning: The magnitude of the error, together with the input vector, determines the magnitude of the corrections to the weights. Examples: perceptron learning, backpropagation.
- Reinforcement learning: After each presentation of an input-output example, we only know whether the network produces the desired result or not. The weights are updated based on this Boolean decision (true or false). Example: learning how to ride a bike.

Unsupervised Learning

For a given input, the exact numerical output the network should produce is unknown. Since no "teacher" is available, the network must organize itself (e.g., in order to associate clusters with units). Examples: clustering with self-organizing feature maps, Kohonen networks.

Figure 29. Three clusters and a classifier network
References

Rojas, R. (1996). Neural Networks: A Systematic Introduction. Berlin; New York: Springer-Verlag.

Kasabov, N. K. (1998). Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering. Cambridge, MA: MIT Press.

Nilsson, N. J. (1998). Artificial Intelligence: A New Synthesis. San Francisco, CA: Morgan Kaufmann.

Negnevitsky, M. (2002). Artificial Intelligence: A Guide to Intelligent Systems. Harlow, England: Addison-Wesley.