COMP304 Introduction to Neural Networks, based on slides by Christian Borgelt
Transcription
1 COMP304 Introduction to Neural Networks, based on slides by Christian Borgelt
2 Motivation: Why (Artificial) Neural Networks? (Neuro-)Biology / (Neuro-)Physiology / Psychology: Exploit similarity to real (biological) neural networks. Build models to understand nerve and brain operation by simulation. Computer Science / Engineering / Economics: Mimic certain cognitive capabilities of human beings. Solve learning/adaptation, prediction, and optimization problems. Physics / Chemistry: Use neural network models to describe physical phenomena. Special case: spin glasses (alloys of magnetic and non-magnetic metals).
3 Motivation: Why Neural Networks in AI? Physical-Symbol System Hypothesis [Newell and Simon 1976]: A physical-symbol system has the necessary and sufficient means for general intelligent action. Neural networks process simple signals, not symbols. So why study neural networks in Artificial Intelligence? Symbol-based representations work well for inference tasks, but are fairly bad for perception tasks. Symbol-based expert systems tend to get slower with growing knowledge, while human experts tend to get faster. Neural networks allow for highly parallel information processing. There are several successful applications in industry and finance.
4 Biological Background Structure of a prototypical biological neuron (figure labels: terminal button, synapse, dendrites, cell core, cell body (soma), axon, myelin sheath).
5 Biological Background (Very) simplified description of neural information processing: The axon terminal releases chemicals, called neurotransmitters. These act on the membrane of the receptor dendrite to change its polarization. (The inside is usually about 70 mV more negative than the outside.) Decrease in potential difference: excitatory synapse. Increase in potential difference: inhibitory synapse. If there is enough net excitatory input, the axon is depolarized. The resulting action potential travels along the axon. (Speed depends on the degree to which the axon is covered with myelin.) When the action potential reaches the terminal buttons, it triggers the release of neurotransmitters.
6 Threshold Logic Units
7 Threshold Logic Units A Threshold Logic Unit (TLU) is a processing unit for numbers with n inputs x_1, ..., x_n and one output y. The unit has a threshold θ and each input x_i is associated with a weight w_i. A threshold logic unit computes the function
y = 1, if Σ_{i=1}^n w_i x_i ≥ θ,
y = 0, otherwise.
(Diagram: inputs x_1, ..., x_n with weights w_1, ..., w_n feeding a unit with threshold θ and output y.)
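The computation above can be sketched in a few lines of Python (an illustrative sketch; the function name `tlu` and the example weights are mine, not from the slides):

```python
def tlu(weights, theta, inputs):
    """Threshold logic unit: output 1 iff the weighted input sum reaches theta."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= theta else 0

# For instance, w = (1, 1) with theta = 2 computes the conjunction x1 AND x2.
print(tlu([1, 1], 2, [1, 1]))  # -> 1
print(tlu([1, 1], 2, [1, 0]))  # -> 0
```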
8 Threshold Logic Units: Examples Threshold logic unit for the conjunction x_1 ∧ x_2, with weights w_1 = 3, w_2 = 2 and threshold θ = 4:
x_1 x_2 | 3x_1 + 2x_2 | y
0   0   | 0           | 0
1   0   | 3           | 0
0   1   | 2           | 0
1   1   | 5           | 1
Threshold logic unit for the implication x_2 → x_1, with weights w_1 = 2, w_2 = −2 and threshold θ = −1:
x_1 x_2 | 2x_1 − 2x_2 | y
0   0   | 0           | 1
1   0   | 2           | 1
0   1   | −2          | 0
1   1   | 0           | 1
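Both example units can be checked mechanically against their truth tables (a small sketch; the weights and thresholds are the ones stated above):

```python
def tlu(weights, theta, inputs):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

# Conjunction x1 AND x2 with w = (3, 2), theta = 4.
for x1 in (0, 1):
    for x2 in (0, 1):
        assert tlu([3, 2], 4, [x1, x2]) == (x1 and x2)

# Implication x2 -> x1 with w = (2, -2), theta = -1.
for x1 in (0, 1):
    for x2 in (0, 1):
        expected = 0 if (x2 == 1 and x1 == 0) else 1
        assert tlu([2, -2], -1, [x1, x2]) == expected

print("both units match their truth tables")
```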
9 Threshold Logic Units: Examples Threshold logic unit for (x_1 ∧ x_2) ∨ (x_1 ∧ x_3) ∨ (x_2 ∧ x_3), with weights w_1 = w_2 = w_3 = 2 and threshold θ = 4: the unit outputs y = 1 exactly when Σ_i w_i x_i ≥ 4, i.e. when at least two inputs are 1.
10 Threshold Logic Units: Geometric Interpretation Review of line representations. Straight lines are usually represented in one of the following forms:
Explicit Form: g: x_2 = b x_1 + c
Implicit Form: g: a_1 x_1 + a_2 x_2 + d = 0
Point-Direction Form: g: x = p + k r
Normal Form: g: (x − p) · n = 0
with the parameters: b: gradient (slope) of the line; c: section of the x_2 axis (intercept); p: vector of a point of the line (base vector); r: direction vector of the line; n: normal vector of the line.
11 Threshold Logic Units: Geometric Interpretation A straight line and its defining parameters (figure: the line g with normal vector n = (a_1, a_2), base vector p, direction vector r, angle φ, and distance d = p · n of the line from the origin O).
12 Threshold Logic Units: Geometric Interpretation How to determine the side on which a point x lies (figure: project x onto the normal direction, z = (x · n)/|n|, and compare the projection with the line's distance from the origin).
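The side test can be sketched numerically (an illustrative sketch; the function name `side_of_line` and the example line are mine):

```python
import math

def side_of_line(x, n, d):
    """Project x onto the unit normal of the line and compare with the line's
    distance d from the origin: positive -> the side the normal points to,
    zero -> on the line, negative -> the other side."""
    z = (x[0] * n[0] + x[1] * n[1]) / math.hypot(n[0], n[1])
    return z - d

# Example line x1 + x2 = 1: normal n = (1, 1), distance d = 1/sqrt(2).
n, d = (1, 1), 1 / math.sqrt(2)
print(side_of_line((1, 1), n, d) > 0)  # -> True (side the normal points to)
print(side_of_line((0, 0), n, d) < 0)  # -> True (origin side)
```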
13 Threshold Logic Units: Geometric Interpretation Threshold logic unit for x_1 ∧ x_2 and a threshold logic unit for x_2 → x_1 (figures: the respective separating lines in the input space).
14 Threshold Logic Units: Geometric Interpretation Visualization of 3-dimensional Boolean functions: the input vectors are interpreted as the corners (0,0,0), ..., (1,1,1) of the unit cube. Threshold logic unit for (x_1 ∧ x_2) ∨ (x_1 ∧ x_3) ∨ (x_2 ∧ x_3) (figure: the separating plane in the unit cube).
15 Threshold Logic Units: Limitations The biimplication problem x_1 ↔ x_2: there is no separating line.
x_1 x_2 | y
0   0   | 1
1   0   | 0
0   1   | 0
1   1   | 1
Formal proof by reductio ad absurdum:
since (0,0) ↦ 1: 0 ≥ θ, (1)
since (1,0) ↦ 0: w_1 < θ, (2)
since (0,1) ↦ 0: w_2 < θ, (3)
since (1,1) ↦ 1: w_1 + w_2 ≥ θ. (4)
(2) and (3): w_1 + w_2 < 2θ. With (4): 2θ > θ, i.e. θ > 0. Contradiction to (1).
16 Threshold Logic Units: Limitations Total number and number of linearly separable Boolean functions ([Widner 1960] as cited in [Zell 1994]):
inputs | Boolean functions | linearly separable functions
1      | 4                 | 4
2      | 16                | 14
3      | 256               | 104
4      | 65,536            | 1,774
5      | ≈ 4.3 · 10^9      | 94,572
6      | ≈ 1.8 · 10^19     | 5,028,134
For many inputs a threshold logic unit can compute almost no functions. Networks of threshold logic units are needed to overcome the limitations.
17 Networks of Threshold Logic Units Solving the biimplication problem with a network. Idea: logical decomposition x_1 ↔ x_2 ≡ (x_1 → x_2) ∧ (x_2 → x_1). One first-layer unit computes y_1 = x_1 → x_2, a second computes y_2 = x_2 → x_1, and an output unit computes y = y_1 ∧ y_2 = x_1 ↔ x_2.
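The decomposition above can be realized directly as a two-layer network of threshold logic units (a sketch; the particular weights are one valid choice, not necessarily the ones on the slide):

```python
def tlu(weights, theta, inputs):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

def biimplication(x1, x2):
    y1 = tlu([2, -2], -1, [x1, x2])   # first layer: computes x2 -> x1
    y2 = tlu([-2, 2], -1, [x1, x2])   # first layer: computes x1 -> x2
    return tlu([2, 2], 3, [y1, y2])   # output layer: conjunction y1 AND y2

for x1 in (0, 1):
    for x2 in (0, 1):
        assert biimplication(x1, x2) == (1 if x1 == x2 else 0)
print("network computes x1 <-> x2")
```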
18 Networks of Threshold Logic Units Solving the biimplication problem: geometric interpretation (figure: the separating lines g_1 and g_2 in the input space, and the transformed points in the (y_1, y_2) space separated by a line g_3). The first layer computes new Boolean coordinates for the points. After the coordinate transformation the problem is linearly separable.
19 Representing Arbitrary Boolean Functions Let y = f(x_1, ..., x_n) be a Boolean function of n variables.
(i) Represent f(x_1, ..., x_n) in disjunctive normal form. That is, determine D_f = K_1 ∨ ... ∨ K_m, where all K_j are conjunctions of n literals, i.e., K_j = l_{j1} ∧ ... ∧ l_{jn} with l_{ji} = x_i (positive literal) or l_{ji} = ¬x_i (negative literal).
(ii) Create a neuron for each conjunction K_j of the disjunctive normal form (having n inputs, one input for each variable), where
w_{ji} = +2, if l_{ji} = x_i, and w_{ji} = −2, if l_{ji} = ¬x_i, and θ_j = n − 1 + (1/2) Σ_{i=1}^n w_{ji}.
(iii) Create an output neuron (having m inputs, one input for each neuron that was created in step (ii)), where
w_{(n+1)k} = 2, k = 1, ..., m, and θ_{n+1} = 1.
20 Training Threshold Logic Units
21 Training Threshold Logic Units Geometric interpretation provides a way to construct threshold logic units with 2 and 3 inputs, but: Not an automatic method (human visualization needed). Not feasible for more than 3 inputs. General idea of automatic training: Start with random values for weights and threshold. Determine the error of the output for a set of training patterns. The error is a function of the weights and the threshold: e = e(w_1, ..., w_n, θ). Adapt weights and threshold so that the error gets smaller. Iterate the adaptation until the error vanishes.
22 Training Threshold Logic Units Single-input threshold logic unit for the negation ¬x:
x | y
0 | 1
1 | 0
Output error as a function of weight and threshold (figures over the (w, θ) plane: error for x = 0, error for x = 1, and the sum of errors).
23 Training Threshold Logic Units The error function cannot be used directly, because it consists of plateaus. Solution: if the computed output is wrong, take into account how far the weighted sum is from the threshold. Modified output error as a function of weight and threshold (figures: error for x = 0, error for x = 1, and the sum of errors).
24 Training Threshold Logic Units Example training procedure: online and batch training (figures: the trajectories of (w, θ) during online training and batch training, and the batch-training error surface, for the negation ¬x).
25 Training Threshold Logic Units: Delta Rule Formal Training Rule: Let x = (x_1, ..., x_n) be an input vector of a threshold logic unit, o the desired output for this input vector, and y the actual output of the threshold logic unit. If y ≠ o, then the threshold θ and the weight vector w = (w_1, ..., w_n) are adapted as follows in order to reduce the error:
θ(new) = θ(old) + Δθ with Δθ = −η(o − y),
∀i ∈ {1, ..., n}: w_i(new) = w_i(old) + Δw_i with Δw_i = η(o − y) x_i,
where η is a parameter that is called learning rate. It determines the severity of the weight changes. This procedure is called Delta Rule or Widrow–Hoff Procedure [Widrow and Hoff 1960]. Online Training: Adapt parameters after each training pattern. Batch Training: Adapt parameters only at the end of each epoch, i.e. after a traversal of all training patterns.
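A single delta-rule step can be traced numerically (an illustrative sketch; the starting values and the pattern are chosen by me):

```python
# One delta-rule step, starting from w = (0, 0), theta = 0, with eta = 1.
w, theta, eta = [0.0, 0.0], 1.0, 1.0
w, theta = [0.0, 0.0], 0.0
x, o = [0, 0], 0                      # pattern (0,0) with desired output 0

y = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0  # y = 1 (wrong)
if y != o:
    theta += -eta * (o - y)           # delta_theta = -eta(o - y) = +1
    w = [wi + eta * (o - y) * xi for wi, xi in zip(w, x)]  # delta_w = 0 here

print(theta, w)  # -> 1.0 [0.0, 0.0]
```

The threshold is raised so that the weighted sum 0 no longer reaches it, while the weights stay unchanged because the inputs are zero.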
26 Training Threshold Logic Units: Delta Rule Turning the threshold value into a weight: add an extra input x_0 = 1 with weight w_0 = −θ. Then
Σ_{i=1}^n w_i x_i ≥ θ  becomes  Σ_{i=1}^n w_i x_i − θ ≥ 0,
so the extended unit with inputs x_0, x_1, ..., x_n, weights w_0, w_1, ..., w_n and threshold 0 computes the same function (diagram: the unit with threshold θ and the equivalent unit with threshold 0 and the extra input).
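The equivalence of the two formulations can be checked directly (a sketch; `tlu_bias` and the example weights are my names/choices):

```python
from itertools import product

def tlu(weights, theta, x):
    return 1 if sum(w * xi for w, xi in zip(weights, x)) >= theta else 0

def tlu_bias(weights, theta, x):
    # fold the threshold into an extra weight w0 = -theta on a constant
    # input x0 = 1; the extended unit then uses threshold 0
    return tlu([-theta] + list(weights), 0, [1] + list(x))

# both formulations agree on every input of the conjunction unit (3, 2; 4)
for x in product((0, 1), repeat=2):
    assert tlu([3, 2], 4, x) == tlu_bias([3, 2], 4, x)
print("both formulations agree")
```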
27 Training Threshold Logic Units: Delta Rule
procedure online training (var w, var θ, L, η);
var y, e;                            (* output, sum of errors *)
begin
  repeat
    e := 0;                          (* initialize the error sum *)
    for all (x, o) ∈ L do begin      (* traverse the patterns *)
      if (w · x ≥ θ) then y := 1;    (* compute the output *)
      else y := 0;                   (* of the threshold logic unit *)
      if (y ≠ o) then begin          (* if the output is wrong *)
        θ := θ − η(o − y);           (* adapt the threshold *)
        w := w + η(o − y) x;         (* and the weights *)
        e := e + |o − y|;            (* sum the errors *)
      end;
    end;
  until (e ≤ 0);                     (* repeat the computations *)
end;                                 (* until the error vanishes *)
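The pseudocode above translates almost line for line into Python (a sketch; I add an epoch cap so the loop cannot run forever on non-separable data, and the function name is mine):

```python
import random

def online_training(patterns, n, eta=1.0, max_epochs=200):
    """Delta-rule online training of a TLU, following the procedure above."""
    w = [random.uniform(-1, 1) for _ in range(n)]
    theta = random.uniform(-1, 1)
    for _ in range(max_epochs):
        e = 0                                           # initialize the error sum
        for x, o in patterns:                           # traverse the patterns
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0
            if y != o:                                  # if the output is wrong
                theta -= eta * (o - y)                  # adapt the threshold
                w = [wi + eta * (o - y) * xi
                     for wi, xi in zip(w, x)]           # and the weights
                e += abs(o - y)                         # sum the errors
        if e == 0:                                      # the error vanished
            return w, theta
    return None                                         # no solution within the cap

random.seed(0)
conjunction = [((0, 0), 0), ((1, 0), 0), ((0, 1), 0), ((1, 1), 1)]
result = online_training(conjunction, 2)
assert result is not None
print("learned w, theta:", result)
```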
28 Training Threshold Logic Units: Not Training protocol for the negation ¬x (table: for each epoch and pattern, the input x, desired output o, weighted sum, output y, error e, and the resulting updates Δθ, Δw and new values of θ and w).
29 Training Threshold Logic Units: Conjunction Threshold logic unit with two inputs for the conjunction x_1 ∧ x_2 (diagram: inputs x_1, x_2 with weights w_1, w_2, threshold θ, and output y; figures: truth table and the separating line found by training).
30 Training Threshold Logic Units: Conjunction Training protocol for the conjunction (table: for each epoch and pattern, the inputs x_1, x_2, desired output o, weighted sum, output y, error e, and the resulting updates to θ, w_1 and w_2).
31 Training Threshold Logic Units: Biimplication Training protocol for the biimplication x_1 ↔ x_2, in the same table format.
32 Training Threshold Logic Units: Convergence Convergence Theorem: Let L = {(x_1, o_1), ..., (x_m, o_m)} be a set of training patterns, each consisting of an input vector x_i ∈ IR^n and a desired output o_i ∈ {0, 1}. Furthermore, let L_0 = {(x, o) ∈ L | o = 0} and L_1 = {(x, o) ∈ L | o = 1}. If L_0 and L_1 are linearly separable, i.e., if w ∈ IR^n and θ ∈ IR exist such that
∀(x, 0) ∈ L_0: w · x < θ and ∀(x, 1) ∈ L_1: w · x ≥ θ,
then online as well as batch training terminate. The algorithms terminate only when the error vanishes. Therefore the resulting threshold and weights must solve the problem. For problems that are not linearly separable the algorithms do not terminate.
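The two cases of the theorem can be observed empirically (a sketch; the epoch cap stands in for "does not terminate", and the function name is mine):

```python
def train(patterns, eta=1.0, max_epochs=1000):
    """Online delta-rule training with an epoch cap; returns the epoch in
    which the error vanished, or None if it never did within the cap."""
    n = len(patterns[0][0])
    w, theta = [0.0] * n, 0.0
    for epoch in range(1, max_epochs + 1):
        e = 0
        for x, o in patterns:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0
            if y != o:
                theta -= eta * (o - y)
                w = [wi + eta * (o - y) * xi for wi, xi in zip(w, x)]
                e += 1
        if e == 0:
            return epoch
    return None

conjunction = [((0, 0), 0), ((1, 0), 0), ((0, 1), 0), ((1, 1), 1)]
biimplication = [((0, 0), 1), ((1, 0), 0), ((0, 1), 0), ((1, 1), 1)]
print(train(conjunction))    # a small epoch number: linearly separable
print(train(biimplication))  # None: the error never vanishes
```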
33 Training Networks of Threshold Logic Units Single threshold logic units have strong limitations: they can only compute linearly separable functions. Networks of threshold logic units can compute arbitrary Boolean functions. Training single threshold logic units with the delta rule is fast and guaranteed to find a solution if one exists. Networks of threshold logic units cannot be trained this way, because there are no desired values for the neurons of the first layer, and the problem can usually be solved with several different functions computed by the neurons of the first layer. When this situation became clear, neural networks were seen as a research dead end.
More informationNervous System Organization
The Nervous System Chapter 44 Nervous System Organization All animals must be able to respond to environmental stimuli -Sensory receptors = Detect stimulus -Motor effectors = Respond to it -The nervous
More informationArtificial Neural Networks Examination, June 2005
Artificial Neural Networks Examination, June 2005 Instructions There are SIXTY questions. (The pass mark is 30 out of 60). For each question, please select a maximum of ONE of the given answers (either
More informationThe Perceptron Algorithm 1
CS 64: Machine Learning Spring 5 College of Computer and Information Science Northeastern University Lecture 5 March, 6 Instructor: Bilal Ahmed Scribe: Bilal Ahmed & Virgil Pavlu Introduction The Perceptron
More informationArtificial Neural Networks Examination, June 2004
Artificial Neural Networks Examination, June 2004 Instructions There are SIXTY questions (worth up to 60 marks). The exam mark (maximum 60) will be added to the mark obtained in the laborations (maximum
More informationArtificial Neural Networks. Historical description
Artificial Neural Networks Historical description Victor G. Lopez 1 / 23 Artificial Neural Networks (ANN) An artificial neural network is a computational model that attempts to emulate the functions of
More informationNOTES: CH 48 Neurons, Synapses, and Signaling
NOTES: CH 48 Neurons, Synapses, and Signaling A nervous system has three overlapping functions: 1) SENSORY INPUT: signals from sensory receptors to integration centers 2) INTEGRATION: information from
More informationCS:4420 Artificial Intelligence
CS:4420 Artificial Intelligence Spring 2018 Neural Networks Cesare Tinelli The University of Iowa Copyright 2004 18, Cesare Tinelli and Stuart Russell a a These notes were originally developed by Stuart
More informationArtificial Neural Networks. Q550: Models in Cognitive Science Lecture 5
Artificial Neural Networks Q550: Models in Cognitive Science Lecture 5 "Intelligence is 10 million rules." --Doug Lenat The human brain has about 100 billion neurons. With an estimated average of one thousand
More information(Feed-Forward) Neural Networks Dr. Hajira Jabeen, Prof. Jens Lehmann
(Feed-Forward) Neural Networks 2016-12-06 Dr. Hajira Jabeen, Prof. Jens Lehmann Outline In the previous lectures we have learned about tensors and factorization methods. RESCAL is a bilinear model for
More informationLeast Mean Squares Regression
Least Mean Squares Regression Machine Learning Spring 2018 The slides are mainly from Vivek Srikumar 1 Lecture Overview Linear classifiers What functions do linear classifiers express? Least Squares Method
More informationCSC242: Intro to AI. Lecture 21
CSC242: Intro to AI Lecture 21 Administrivia Project 4 (homeworks 18 & 19) due Mon Apr 16 11:59PM Posters Apr 24 and 26 You need an idea! You need to present it nicely on 2-wide by 4-high landscape pages
More informationArtificial Neural Networks. Edward Gatt
Artificial Neural Networks Edward Gatt What are Neural Networks? Models of the brain and nervous system Highly parallel Process information much more like the brain than a serial computer Learning Very
More informationBIOLOGY. 1. Overview of Neurons 11/3/2014. Neurons, Synapses, and Signaling. Communication in Neurons
CAMPBELL BIOLOGY TENTH EDITION 48 Reece Urry Cain Wasserman Minorsky Jackson Neurons, Synapses, and Signaling Lecture Presentation by Nicole Tunbridge and Kathleen Fitzpatrick 1. Overview of Neurons Communication
More informationLecture 6. Notes on Linear Algebra. Perceptron
Lecture 6. Notes on Linear Algebra. Perceptron COMP90051 Statistical Machine Learning Semester 2, 2017 Lecturer: Andrey Kan Copyright: University of Melbourne This lecture Notes on linear algebra Vectors
More informationNeurons and Nervous Systems
34 Neurons and Nervous Systems Concept 34.1 Nervous Systems Consist of Neurons and Glia Nervous systems have two categories of cells: Neurons, or nerve cells, are excitable they generate and transmit electrical
More informationOverview. Knowledge-Based Agents. Introduction. COMP219: Artificial Intelligence. Lecture 19: Logic for KR
COMP219: Artificial Intelligence Lecture 19: Logic for KR Last time Expert Systems and Ontologies oday Logic as a knowledge representation scheme Propositional Logic Syntax Semantics Proof theory Natural
More informationArtificial Neural Networks
Artificial Neural Networks Short introduction Bojana Dalbelo Bašić, Marko Čupić, Jan Šnajder Faculty of Electrical Engineering and Computing University of Zagreb Zagreb, June 6, 2018 Dalbelo Bašić, Čupić,
More informationNeurons, Synapses, and Signaling
CAMPBELL BIOLOGY IN FOCUS URRY CAIN WASSERMAN MINORSKY REECE 37 Neurons, Synapses, and Signaling Lecture Presentations by Kathleen Fitzpatrick and Nicole Tunbridge, Simon Fraser University SECOND EDITION
More informationNeural Networks. Xiaojin Zhu Computer Sciences Department University of Wisconsin, Madison. slide 1
Neural Networks Xiaoin Zhu erryzhu@cs.wisc.edu Computer Sciences Department University of Wisconsin, Madison slide 1 Terminator 2 (1991) JOHN: Can you learn? So you can be... you know. More human. Not
More informationUNIT I INTRODUCTION TO ARTIFICIAL NEURAL NETWORK IT 0469 NEURAL NETWORKS
UNIT I INTRODUCTION TO ARTIFICIAL NEURAL NETWORK IT 0469 NEURAL NETWORKS Elementary Neuro Physiology Neuron: A neuron nerve cell is an electricallyexcitable cell that processes and transmits information
More informationSections 18.6 and 18.7 Artificial Neural Networks
Sections 18.6 and 18.7 Artificial Neural Networks CS4811 - Artificial Intelligence Nilufer Onder Department of Computer Science Michigan Technological University Outline The brain vs. artifical neural
More informationCOMP219: Artificial Intelligence. Lecture 19: Logic for KR
COMP219: Artificial Intelligence Lecture 19: Logic for KR 1 Overview Last time Expert Systems and Ontologies Today Logic as a knowledge representation scheme Propositional Logic Syntax Semantics Proof
More information