Artificial Neural Network


Contents

- What is ANN?
- Biological Neuron
- Structure of Neuron
- Types of Neuron
- Models of Neuron
- Analogy with Human NN
- Perceptron
- OCR
- Multilayer Neural Network
- Back-propagation

What is ANN?

A neural network can be defined as a model of reasoning based on the human brain. The brain consists of a densely interconnected set of nerve cells (information-processing units) called neurons: the human brain has about 10 billion neurons and 60 trillion connections. ANNs are a type of artificial intelligence that attempts to imitate the way a human brain works. The approach is beginning to prove useful in certain areas that involve recognizing complex patterns, such as voice recognition and image recognition.

Biological Neuron

A neuron has a cell body, a branching input structure (the dendrites), and a branching output structure (the axon). Axons connect to dendrites via synapses. Electro-chemical signals are propagated from the dendrite inputs, through the cell body, and down the axon to other neurons.

Structure of Neuron

A neuron fires only if its input signal exceeds a certain amount (the threshold) within a short time period. Synapses vary in strength: good connections allow a large signal, while slight connections allow only a weak signal. Synapses can be either excitatory or inhibitory.

Types of Neuron

Neurons come in many shapes and sizes.

Models of Neuron

A neuron is an information-processing unit consisting of:
- a set of synapses, or connecting links, each characterized by a weight or strength;
- an adder that sums the input signals weighted by the synapses (a linear combiner);
- an activation function, also called a squashing function, which squashes (limits) the output to some finite value.

Model of Neuron

[Figure: block diagram of the neuron model]

Analogy

- Inputs represent synapses.
- Weights represent the strengths of the synaptic links.
- Wires represent the dendrites carrying the secretions.
- The summation block represents the addition of the secretions.
- The output represents the axon voltage.

Explanation

Neural networks use a set of processing elements (or nodes) loosely analogous to neurons in the brain. These nodes are interconnected in a network that can then identify patterns in data as it is exposed to the data. In a sense, the network learns from experience just as people do (the case of supervised learning). This distinguishes neural networks from traditional computer programs, which simply follow instructions in a fixed sequential order.

Requirements to Build an ANN

- How many neurons are to be used?
- How are the neurons to be connected to form a network?
- Which learning algorithm should be used?
- How should the neural network be trained?

Training: initialize the weights of the network, then update the weights from a set of training examples.

Diagram of a Neuron

[Figure: input signals x_1, x_2, ..., x_n, weighted by w_1, w_2, ..., w_n, feed a single neuron that produces the output signal Y]

How Does the Neuron Determine its Output?

The neuron computes the weighted sum of the input signals and compares the result with a threshold value, Th. If the net weighted input is less than the threshold, the neuron output is -1. If the net weighted input is greater than or equal to the threshold, the neuron becomes activated and its output attains the value +1. (This type of activation function is called a sign function.)
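As a minimal illustration (not from the slides), the sign-function neuron can be written in a few lines of Python; the function name and the example values are arbitrary:

```python
def neuron_output(inputs, weights, threshold):
    """Sign-activation neuron: weighted sum compared against a threshold."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else -1

# Example with two inputs (values chosen arbitrarily):
print(neuron_output([1, 1], [0.3, -0.1], 0.2))  # net = 0.2 >= 0.2, so +1
print(neuron_output([1, 0], [0.3, -0.1], 0.5))  # net = 0.3 <  0.5, so -1
```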

Example of NN: The Perceptron

A single neuron with adjustable synaptic weights and a hard limiter.

[Figure: inputs x_1 and x_2, weighted by w_1 and w_2, feed a linear combiner (Σ); its output passes through a hard limiter to produce the output Y]

The linear combiner computes the net input

$X = \sum_{i=1}^{n} x_i w_i - Th$,

and the hard limiter fires when $X \ge 0$. The step and sign activation functions are called hard-limit functions.

How Does the Perceptron Learn?

Step 1: Initialization. Set the initial weights $w_1, w_2, \dots, w_n$ and the threshold $Th$.

Step 2: Activation. Activate the perceptron by applying the inputs $x_1(p), x_2(p), \dots, x_n(p)$ and the desired output $Y_d(p)$, where $p$ is the iteration number and $n$ is the number of inputs.

Step 3: Weight training. Update the weights of the perceptron:

$w_i(p+1) = w_i(p) + \alpha \, x_i(p) \, e(p)$,

where $e(p) = Y_d(p) - Y(p)$ is the output error and $\alpha$ is the learning rate.

Step 4: Iteration. Increase iteration $p$ by one, go back to Step 2, and repeat the process. (A runnable sketch of this procedure follows the worked AND example below.)

Train a Perceptron to Perform the Logical AND Operation (1)

Threshold Th = 0.2, learning rate α = 0.1.

Epoch | Inputs x1 x2 | Desired Yd | Initial w1, w2 | Output Y | Error e | Final w1, w2
1     | 0  0         | 0          |  0.3  -0.1     | 0        |  0      |  0.3  -0.1
      | 0  1         | 0          |  0.3  -0.1     | 0        |  0      |  0.3  -0.1
      | 1  0         | 0          |  0.3  -0.1     | 1        | -1      |  0.2  -0.1
      | 1  1         | 1          |  0.2  -0.1     | 0        |  1      |  0.3   0.0
2     | 0  0         | 0          |  0.3   0.0     | 0        |  0      |  0.3   0.0
      | 0  1         | 0          |  0.3   0.0     | 0        |  0      |  0.3   0.0
      | 1  0         | 0          |  0.3   0.0     | 1        | -1      |  0.2   0.0
      | 1  1         | 1          |  0.2   0.0     | 1        |  0      |  0.2   0.0

Train a Perceptron to Perform the Logical AND Operation (2)

Threshold Th = 0.2, learning rate α = 0.1.

Epoch | Inputs x1 x2 | Desired Yd | Initial w1, w2 | Output Y | Error e | Final w1, w2
3     | 0  0         | 0          |  0.2   0.0     | 0        |  0      |  0.2   0.0
      | 0  1         | 0          |  0.2   0.0     | 0        |  0      |  0.2   0.0
      | 1  0         | 0          |  0.2   0.0     | 1        | -1      |  0.1   0.0
      | 1  1         | 1          |  0.1   0.0     | 0        |  1      |  0.2   0.1
4     | 0  0         | 0          |  0.2   0.1     | 0        |  0      |  0.2   0.1
      | 0  1         | 0          |  0.2   0.1     | 0        |  0      |  0.2   0.1
      | 1  0         | 0          |  0.2   0.1     | 1        | -1      |  0.1   0.1
      | 1  1         | 1          |  0.1   0.1     | 1        |  0      |  0.1   0.1

Train a Perceptron to Perform the Logical AND Operation (3)

Threshold Th = 0.2, learning rate α = 0.1.

Epoch | Inputs x1 x2 | Desired Yd | Initial w1, w2 | Output Y | Error e | Final w1, w2
5     | 0  0         | 0          |  0.1   0.1     | 0        |  0      |  0.1   0.1
      | 0  1         | 0          |  0.1   0.1     | 0        |  0      |  0.1   0.1
      | 1  0         | 0          |  0.1   0.1     | 0        |  0      |  0.1   0.1
      | 1  1         | 1          |  0.1   0.1     | 1        |  0      |  0.1   0.1

Epoch 5 produces no errors, so learning stops: the perceptron has learned the AND operation with w1 = w2 = 0.1 and Th = 0.2.
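The learning procedure traced in these tables is straightforward to reproduce in code. The sketch below is illustrative rather than part of the original slides: it implements the perceptron learning rule $w_i \leftarrow w_i + \alpha \, x_i \, e$ with the slides' parameters (Th = 0.2, α = 0.1, initial weights 0.3 and -0.1) and prints the weights after each epoch:

```python
# Perceptron learning of logical AND with the slides' parameters.
THRESHOLD = 0.2
LEARNING_RATE = 0.1

def step(net):
    """Hard-limit (step) activation: 1 if net >= 0, else 0."""
    return 1 if net >= 0 else 0

def train_and():
    weights = [0.3, -0.1]  # initial weights from the worked example
    data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    for epoch in range(1, 100):
        errors = 0
        for inputs, desired in data:
            net = sum(x * w for x, w in zip(inputs, weights)) - THRESHOLD
            error = desired - step(net)
            if error != 0:
                errors += 1
                # Perceptron learning rule: w_i <- w_i + alpha * x_i * e.
                # Rounding to one decimal avoids floating-point drift and
                # keeps the printed trace identical to the tables above.
                weights = [round(w + LEARNING_RATE * x * error, 1)
                           for w, x in zip(weights, inputs)]
        print(f"epoch {epoch}: weights = {weights}")
        if errors == 0:  # a full pass with no errors: training is complete
            break

train_and()
```

Running this reproduces the tables: the weights settle at [0.1, 0.1] after epoch 4, and epoch 5 makes no changes.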

Application of ANN: OCR

OCR stands for Optical Character Recognition. Necessity: a machine that reads bank checks can process many more checks than a human being in the same time. This kind of application saves time and money, and eliminates the requirement that a human perform such a repetitive task. Using an ANN, a device can be designed and trained to recognize the 26 letters of the alphabet.

Advanced ANN: Multilayer Neural Network

A multilayer perceptron is a feedforward network with one or more hidden layers. The network consists of:
- an input layer of source neurons,
- at least one middle, or hidden, layer of computational neurons,
- an output layer of computational neurons.

The input signals are propagated in a forward direction on a layer-by-layer basis.

Multilayer Perceptron with Two Hidden Layers

[Figure: a multilayer perceptron with two hidden layers]

Why Do We Need a Hidden Layer?

The input layer accepts input signals from the outside world and redistributes these signals to all neurons in the hidden layer. Neurons in the hidden layer detect the features; the weights of these neurons represent the features hidden in the input patterns. The output layer accepts output signals from the hidden layer and establishes the output pattern of the entire network.

How Do Multilayer Neural Networks Learn?

The most popular method of learning is back-propagation. Learning in a multilayer network proceeds the same way as for a perceptron: a training set of input patterns is presented to the network, the network computes the output pattern, and if there is an error, the weights are adjusted to reduce it. In a multilayer network, however, there are many weights, each of which contributes to more than one output.

Back-Propagation Neural Network

The net weighted input value is passed through the activation function. Unlike a perceptron, neurons in a back-propagation network use a sigmoid activation function:

$Y^{\text{sigmoid}} = \dfrac{1}{1 + e^{-X}}$
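For reference, a one-function Python sketch of this activation, using only the standard library:

```python
import math

def sigmoid(x):
    """Logistic sigmoid: squashes any real input into the interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0.0))    # 0.5
print(sigmoid(10.0))   # close to 1
print(sigmoid(-10.0))  # close to 0
```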

Three-Layer Back-Propagation Neural Network

[Figure: a three-layer back-propagation neural network]

Learning Law Used in the Back-Propagation Network

In a three-layer network, the indices $i$, $j$, and $k$ refer to neurons in the input, hidden, and output layers, respectively. Input signals $x_1, x_2, \dots, x_n$ are propagated through the network from left to right, and error signals $e_1, e_2, \dots, e_l$ from right to left. The symbol $W_{ij}$ denotes the weight of the connection between neuron $i$ in the input layer and neuron $j$ in the hidden layer, and $W_{jk}$ the weight of the connection between neuron $j$ in the hidden layer and neuron $k$ in the output layer.

Learning Law Used in the Back-Propagation Network (continued)

The error signal at the output of neuron $k$ at iteration $p$ is defined by

$e_k(p) = y_{d,k}(p) - y_k(p)$.

The updated weight at the output layer is defined by

$W_{jk}(p+1) = W_{jk}(p) + \alpha \, y_j(p) \, \delta_k(p)$,

where $\alpha$ is the learning rate and $\delta_k(p)$ is the error gradient of output neuron $k$ (defined below).

Back-Propagation Training Algorithm

Initialization: Set all the weights and threshold levels of the network to random numbers uniformly distributed inside a small range (Haykin, 1994): $(-2.4/F_i, +2.4/F_i)$, where $F_i$ is the total number of inputs of neuron $i$ in the network.

Activation: Calculate the actual outputs of the neurons in the hidden layer, then calculate the actual outputs of the neurons in the output layer.

Weight training: Update the weights in the back-propagation network, propagating backward the errors associated with the output neurons.

Iteration: Increase iteration $p$ by one, go back to the activation step, and repeat the process until the selected error criterion is satisfied.

Back-Propagation: Activation

(A) Calculate the actual outputs of the neurons in the hidden layer:

$y_j(p) = \text{sigmoid}\left[\sum_{i=1}^{n} x_i(p) \, W_{ij}(p) - \theta_j\right]$

(B) Calculate the actual outputs of the neurons in the output layer:

$y_k(p) = \text{sigmoid}\left[\sum_{j=1}^{m} x_{jk}(p) \, W_{jk}(p) - \theta_k\right]$

where $x_{jk}(p) = y_j(p)$ is the output of hidden neuron $j$, and $\theta_j$, $\theta_k$ are the neuron thresholds.

Back-Propagation: Weight Training

(A) Calculate the error gradient for the neurons in the output layer:

$\delta_k(p) = y_k(p) \, [1 - y_k(p)] \, e_k(p)$, where $e_k(p) = y_{d,k}(p) - y_k(p)$.

Then update the weights:

$W_{jk}(p+1) = W_{jk}(p) + \alpha \, y_j(p) \, \delta_k(p)$

Back-Propagation: Weight Training (continued)

(B) Calculate the error gradient for the neurons in the hidden layer:

$\delta_j(p) = y_j(p) \, [1 - y_j(p)] \sum_{k=1}^{l} \delta_k(p) \, W_{jk}(p)$

Then update the weights:

$W_{ij}(p+1) = W_{ij}(p) + \alpha \, x_i(p) \, \delta_j(p)$
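Combining the activation formulas with the two weight-training steps gives a complete back-propagation loop. The sketch below is illustrative rather than taken from the slides: it trains a small 2-2-1 network on XOR, and the network size, learning rate, random seed, and stopping criterion are all arbitrary choices. Thresholds are updated like weights attached to a constant input of -1, which matches the $-\theta$ terms in the activation formulas.

```python
import math
import random

random.seed(1)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# A 2-2-1 network for XOR; sizes and learning rate are illustrative.
N_IN, N_HID, N_OUT = 2, 2, 1
ALPHA = 0.5

def small_random(fan_in):
    """Initial value in (-2.4/F_i, +2.4/F_i), per Haykin (1994)."""
    r = 2.4 / fan_in
    return random.uniform(-r, r)

W_ih = [[small_random(N_IN) for _ in range(N_HID)] for _ in range(N_IN)]
th_h = [small_random(N_IN) for _ in range(N_HID)]
W_ho = [[small_random(N_HID) for _ in range(N_OUT)] for _ in range(N_HID)]
th_o = [small_random(N_HID) for _ in range(N_OUT)]

def forward(x):
    """Activation step: hidden-layer outputs, then output-layer outputs."""
    y_h = [sigmoid(sum(x[i] * W_ih[i][j] for i in range(N_IN)) - th_h[j])
           for j in range(N_HID)]
    y_o = [sigmoid(sum(y_h[j] * W_ho[j][k] for j in range(N_HID)) - th_o[k])
           for k in range(N_OUT)]
    return y_h, y_o

data = [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]

for epoch in range(10000):
    sse = 0.0
    for x, d in data:
        y_h, y_o = forward(x)
        # Output-layer gradients: delta_k = y_k (1 - y_k) e_k.
        delta_o = [y_o[k] * (1 - y_o[k]) * (d[k] - y_o[k])
                   for k in range(N_OUT)]
        # Hidden-layer gradients: delta_j = y_j (1 - y_j) sum_k delta_k W_jk.
        delta_h = [y_h[j] * (1 - y_h[j]) *
                   sum(delta_o[k] * W_ho[j][k] for k in range(N_OUT))
                   for j in range(N_HID)]
        # Weight training; thresholds are updated as weights on input -1.
        for j in range(N_HID):
            for k in range(N_OUT):
                W_ho[j][k] += ALPHA * y_h[j] * delta_o[k]
        for k in range(N_OUT):
            th_o[k] += ALPHA * -1 * delta_o[k]
        for i in range(N_IN):
            for j in range(N_HID):
                W_ih[i][j] += ALPHA * x[i] * delta_h[j]
        for j in range(N_HID):
            th_h[j] += ALPHA * -1 * delta_h[j]
        sse += sum((d[k] - y_o[k]) ** 2 for k in range(N_OUT))
    if sse < 0.001:  # selected error criterion satisfied
        print(f"error criterion met after {epoch + 1} epochs")
        break

# XOR can occasionally stall in a poor local minimum; if so, try another seed.
for x, d in data:
    _, y_o = forward(x)
    print(x, "->", round(y_o[0], 3), "(desired:", d[0], ")")
```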

Recommended Textbooks

[Negnevitsky, 2002] M. Negnevitsky, Artificial Intelligence: A Guide to Intelligent Systems, Pearson Education Limited, England, 2002.

[Russell, 2003] S. Russell and P. Norvig, Artificial Intelligence: A Modern Approach, Second Edition, Prentice Hall, 2003.

[Patterson, 1990] D. W. Patterson, Introduction to Artificial Intelligence and Expert Systems, Prentice-Hall Inc., Englewood Cliffs, NJ, USA, 1990.

[Lindsay, 1977] P. H. Lindsay and D. A. Norman, Human Information Processing: An Introduction to Psychology, Academic Press, 1977.