Artificial Neural Networks. Part 2


Artificial Neuron Model

The following simplified model of real neurons is also known as a Threshold Logic Unit or McCulloch-Pitts neuron (1943): the inputs x_1, ..., x_n are multiplied by connection weights w_1, ..., w_n, and the body of the neuron applies an activation function f to their weighted sum to produce the output out.

Biological Terminology        Artificial Neural Network Terminology
Neuron                        Node / Unit / Cell / Neurode
Synapse                       Connection / Edge / Link
Synaptic Efficiency           Connection Strength / Weight
Firing Frequency              Node Output

McCulloch-Pitts neuron

The neuron output may be written as:

    out = f(w^T x),    where    w^T x = Σ_{i=1}^{n} w_i x_i = w_1 x_1 + ... + w_n x_n
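
As a concrete illustration, here is a minimal Python sketch of this computation (the names mcp_neuron, weights and inputs are my own, not from the slides):

    # A McCulloch-Pitts / threshold logic unit: activation f applied to the
    # weighted sum of the inputs.
    def mcp_neuron(weights, inputs, f):
        net = sum(w * x for w, x in zip(weights, inputs))
        return f(net)

    # Example with a unipolar step activation (1 if net >= 0, else 0).
    step = lambda net: 1 if net >= 0 else 0
    print(mcp_neuron([1.0, 1.0, -1.5], [1, 1, 1], step))  # -> 1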

McCulloch-Pitts neuron: Activation functions

Typical activation functions used are step functions (hard-limiting activation functions):

    Bipolar binary:    f(net) = sgn(net) = +1 if net >= 0,  -1 if net < 0
    Unipolar binary:   f(net) = sgn(net) =  1 if net >= 0,   0 if net < 0

The step function is very easy to implement, its output does not increase or decrease to values whose magnitude is excessively high, and its outputs may be interpreted as class identifiers.

McCulloch-Pitts neuron: Activation functions

Typical activation functions used are ramp functions:

    Bipolar binary:    f(net) = 1 if net >= 1,  -1 if net <= -1,  net otherwise
    Unipolar binary:   f(net) = 1 if net >= 1,   0 if net <= 0,   net otherwise

McCulloch-Pitts neuron: Activation functions

Typical activation functions used are sigmoid functions:

    Bipolar binary:    f(net) = 2 / (1 + exp(-net)) - 1
    Unipolar binary:   f(net) = 1 / (1 + exp(-net))

McCulloch-Pitts neuron: Activation functions

Typical activation functions used are Gaussian functions:

    f(net) = exp( -net^2 / (2σ^2) )
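
To make these definitions concrete, here is a small Python sketch of the four activation families listed above (the width parameter sigma in the Gaussian is an assumption on my part, since only the general form appears on the slides):

    import math

    def step_bipolar(net):         # hard limiter, +1 / -1
        return 1.0 if net >= 0 else -1.0

    def ramp_unipolar(net):        # clamps net into [0, 1]
        return min(1.0, max(0.0, net))

    def sigmoid_unipolar(net):     # smooth transition from 0 to 1
        return 1.0 / (1.0 + math.exp(-net))

    def sigmoid_bipolar(net):      # smooth transition from -1 to 1
        return 2.0 / (1.0 + math.exp(-net)) - 1.0

    def gaussian(net, sigma=1.0):  # bell curve peaking at net = 0
        return math.exp(-net**2 / (2 * sigma**2))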

Perceptron

An arrangement of one input layer of McCulloch-Pitts (M-P) neurons feeding forward to one output layer of McCulloch-Pitts neurons is known as a Perceptron. With inputs x_1, ..., x_n and outputs o_1, ..., o_m, each output is computed as

    o_j = sgn( Σ_{i=1}^{n} x_i w_ij )

where w_ij is the connection weight between input node i and output node j.
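
A one-layer Perceptron of this form can be sketched in a few lines of Python (the function name and the example weights are illustrative):

    def perceptron_layer(weights, inputs):
        """weights[j][i] is w_ij; returns one step output per output node j."""
        outputs = []
        for row in weights:
            net = sum(w * x for w, x in zip(row, inputs))
            outputs.append(1 if net >= 0 else 0)
        return outputs

    # Two inputs feeding two output nodes with illustrative weights.
    print(perceptron_layer([[0.5, 0.5], [-1.0, 1.0]], [1, 0]))  # -> [1, 0]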

Perceptron

We can use McCulloch-Pitts neurons to implement the basic logic gates. With sgn outputting 1 for net >= 0 and 0 otherwise, the standard constructions are:

    NOT:  one input with weight -1 and threshold -0.5:     Output = sgn(-input + 0.5)
    AND:  two inputs with weights 1, 1 and threshold 1.5:  Output = sgn(input_1 + input_2 - 1.5)

Perceptron

Continuing with the basic logic gates:

    OR:   two inputs with weights 1, 1 and threshold 0.5:  Output = sgn(input_1 + input_2 - 0.5)
    XOR:  no single M-P neuron can implement it, as discussed below.
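
The gate constructions above can be checked directly; here is a short Python sketch that verifies their truth tables (weights and thresholds as given above):

    def sgn(net):
        return 1 if net >= 0 else 0

    def NOT(x):    return sgn(-1 * x + 0.5)   # weight -1, threshold -0.5
    def AND(x, y): return sgn(x + y - 1.5)    # weights 1, 1, threshold 1.5
    def OR(x, y):  return sgn(x + y - 0.5)    # weights 1, 1, threshold 0.5

    for x in (0, 1):
        print(f"NOT {x} = {NOT(x)}")
        for y in (0, 1):
            print(f"{x} AND {y} = {AND(x, y)}   {x} OR {y} = {OR(x, y)}")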

Perceptron

For AND, the input patterns (0,0), (0,1) and (1,0) must produce output 0, while (1,1) must produce output 1. A neuron with inputs x and y, weights w_1 and w_2 and threshold θ computes out = sgn(w_1 x + w_2 y - θ), so the decision boundary is the line

    w_1 x + w_2 y - θ = 0,    i.e.    y = (θ - w_1 x) / w_2

For example, with w_1 = w_2 = 1 and θ = 1.5 the boundary is the line y = 1.5 - x, which separates (1,1) from the other three input patterns.

Perceptron

AND and OR are linearly separable: a single straight line can divide the inputs that should output 1 from those that should output 0. For XOR no such line exists. It was the proof by Minsky & Papert in 1969 that Perceptrons could only learn linearly separable functions that led to the decline in neural network research until the mid-1980s, when it was shown that other network architectures could learn these types of functions.

Perceptron

It would simplify the mathematics if we could treat the neuron threshold as if it were just another connection weight. Note that

    Σ_{i=1}^{n} x_i w_ij - θ_j = x_1 w_1j + x_2 w_2j + x_3 w_3j + ... + x_n w_nj - θ_j

It is easy to see that if we define w_0j = -θ_j and x_0 = 1, then this becomes

    Σ_{i=1}^{n} x_i w_ij - θ_j = x_0 w_0j + x_1 w_1j + ... + x_n w_nj = Σ_{i=0}^{n} x_i w_ij

We just have to include an extra input unit with fixed activation x_0 = 1 (the bias), and then we only need to compute weights, and no explicit thresholds.
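
A quick numerical sketch of this bias trick in Python (the weight and input values are illustrative):

    # Threshold form: sgn(sum_i w_i x_i - theta)
    # Bias form:      sgn(sum_i w_i x_i) with x_0 = 1 and w_0 = -theta
    w, theta = [0.5, -0.25, 0.75], 0.5
    x = [1, 0, 1]

    net_threshold = sum(wi * xi for wi, xi in zip(w, x)) - theta
    net_bias = sum(wi * xi for wi, xi in zip([-theta] + w, [1] + x))
    print(net_threshold, net_bias)  # -> 0.75 0.75, identical by construction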

Neural Network Architectures

We will now discuss artificial neural network architectures, some of which derive inspiration from biological neural networks.

Fully connected networks: every node is connected to every node, and these connections may be either excitatory (positive weights), inhibitory (negative weights), or irrelevant (almost zero weights). The large number of parameters makes learning difficult, and the architecture is biologically implausible.

Neural Network Architectures

Layered networks: networks in which the nodes are partitioned into subsets called layers, with no connections leading from layer j to layer k if j > k. Connections, with arbitrary weights, may exist from any node in layer i to any node in layer j for j >= i, so intra-layer connections may exist.

Neural Network Architectures

Acyclic networks: layered networks in which there are no intra-layer connections. In other words, a connection may exist between any node in layer i and any node in layer j for j > i, but no connection is allowed for i = j.

Neural Network Architectures

Feed-forward networks: a subclass of acyclic networks in which a connection is allowed from a node in layer i only to nodes in layer i + 1. These networks can be described by a sequence of numbers indicating the number of nodes in each layer, e.g. a feed-forward 3-2-3-2 network. Recurrent neural networks, in contrast, contain cycles.
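
As an illustration of describing a feed-forward network by its layer sizes, here is a minimal Python sketch that builds random weight matrices for a 3-2-3-2 network and runs one forward pass (the random weights and step activation are illustrative choices, not from the slides):

    import random

    def build_network(layer_sizes):
        """One weight matrix per pair of adjacent layers; rows index the next layer."""
        return [[[random.uniform(-1, 1) for _ in range(n_in)]
                 for _ in range(n_out)]
                for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]

    def forward(network, x):
        for layer in network:
            x = [1 if sum(w * xi for w, xi in zip(row, x)) >= 0 else 0
                 for row in layer]
        return x

    net = build_network([3, 2, 3, 2])      # a feed-forward 3-2-3-2 network
    print(forward(net, [1.0, 0.0, 1.0]))   # e.g. [1, 0]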

Perceptron Example

A typical neural network application is classification. How do we construct a neural network that can classify airplanes, given training data listing the Mass and Speed of a number of example airplanes together with their Class (Bomber or Fighter)?

General Procedure for Building Neural Networks
1. Understand and specify your problem in terms of inputs and required outputs.
2. Take the simplest form of network you think might be able to solve your problem.
3. Try to find appropriate connection weights (including neuron thresholds).
4. Make sure that the network works on its training and test data.
5. If the network doesn't perform well enough, go back to stage 3 and try harder.
6. If the network still doesn't perform well enough, go back to stage 2.

Perceptron Example

For the airplane data, the inputs can be direct encodings of the masses and speeds. We have two classes, so just one output unit is enough, with activation 1 for Fighter and 0 for Bomber (or vice versa). The simplest network to try first is a simple Perceptron with a bias input:

    Class = sgn( w_1 · Mass + w_2 · Speed + w_0 · Bias )

where Bias is fixed at 1 and w_0 plays the role of -θ as above.
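
A sketch of this classifier in Python; the weight values and the two sample airplanes below are invented for illustration and are not the training data from the slides:

    def classify(mass, speed, w1, w2, w0):
        """Returns 1 for Fighter, 0 for Bomber; bias input fixed at 1."""
        net = w1 * mass + w2 * speed + w0 * 1.0
        return 1 if net >= 0 else 0

    # Illustrative weights: lighter, faster planes come out as fighters.
    w1, w2, w0 = -1.0, 1.0, 0.0
    print(classify(5.0, 2.0, w1, w2, w0))  # heavy and slow -> 0 (Bomber)
    print(classify(1.0, 4.0, w1, w2, w0))  # light and fast -> 1 (Fighter)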

Neural Network Toolbox

You can experiment with a two-element neuron by running the demo nnd2n2.

Neural Network Toolbox

You might also want to try the demo nnd4db. It allows you to pick new input vectors and move the decision boundary while monitoring the weight and bias changes.