Simple neuron model: components of a simple neuron


Outline
1. Simple neuron model
2. Components of artificial neural networks
3. Common activation functions
4. MATLAB representation of a neural network

Single neuron model: components of a simple neuron

Input vector: the input of a technical neuron consists of many components, so it is a vector.

Scalar output: the output of a neuron is a scalar.

Synapses change the input: in technical neural networks the inputs are preprocessed as well. Each input is multiplied by a number (its weight), i.e. the inputs are weighted. The set of such weights represents the information storage of a neural network.

Accumulating the inputs: in biology the inputs are combined into a pulse by chemical changes; on the technical side this is usually realized by a weighted sum.

Non-linear characteristic: the output of a technical neuron is not proportional to its input.

Adjustable weights: the weights applied to the inputs are variable, similar to the chemical processes at the synaptic cleft. This gives the network great flexibility, because a large part of the "knowledge" of a neural network is stored in its weights.

Simple neuron model: an input vector x with components x_i. Each component is multiplied by the corresponding weight w_i, and the products are accumulated into the weighted sum

    Σ_i w_i x_i

A nonlinear mapping f then defines the scalar output y:

    y = f( Σ_i w_i x_i )

The model thus consists of a set of synapses, each characterized by a weight (strength) of its own, together with an adder, an activation function, and a bias.

Components of artificial neural networks

A neural network is characterized by:
1) its pattern of connections between the neurons (called its architecture),
2) its method of determining the weights on the connections (called its training, or learning, algorithm), and
3) its activation function.

A technical neural network consists of simple processing units, the neurons, and directed, weighted connections between those neurons. The reaction of a neuron to its input values depends on its activation state. The activation state indicates the extent of a neuron's activation and is often simply called its activation. (A neuron becomes activated when its network input exceeds its threshold value.)

Threshold value: let j be a neuron. The threshold value θ_j is uniquely assigned to j and marks the position of the maximum gradient of the activation function.
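The weighted-sum model above can be sketched in a few lines of Python (a minimal illustration; the function name is my own, not part of any library):

```python
import math

def neuron_output(x, w, f):
    """Scalar output y = f(sum_i w_i * x_i) of a simple neuron."""
    weighted_sum = sum(wi * xi for wi, xi in zip(w, x))
    return f(weighted_sum)

# Example: two inputs, logistic activation.
# Weighted sum = 0.5*1.0 + (-0.25)*2.0 = 0, and sigmoid(0) = 0.5.
y = neuron_output([1.0, 2.0], [0.5, -0.25],
                  lambda s: 1.0 / (1.0 + math.exp(-s)))  # -> 0.5
```

Passing the activation as a parameter `f` mirrors the text's separation between the adder (the weighted sum) and the activation function.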

The bias neuron is a technical trick for treating threshold values as connection weights: the threshold value of a neuron can be realized as the connection weight from a continuously firing neuron. The activation function is also called the transfer function.

Common activation functions

1) Identity function: f(x) = x for all x. MATLAB implementation: purelin. Neurons with this transfer function are used in ADALINE networks.

2) Binary threshold function (Heaviside function): the simplest activation function, which can take on only two values. If the input is above a certain threshold, the function jumps from one value to the other; otherwise it remains constant. This means the function is not differentiable at the threshold, and elsewhere its derivative is 0. As a consequence, backpropagation learning, for example, is impossible with this function. MATLAB implementation: hardlim (Hard Limit).
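The two transfer functions above can be written out directly; the following is a plain-Python sketch mimicking the behaviour of MATLAB's purelin and hardlim (hardlim conventionally returns 1 for inputs of exactly 0):

```python
def purelin(x):
    """Identity (linear) transfer function: f(x) = x."""
    return x

def hardlim(x):
    """Binary threshold (Heaviside) function: 1 if x >= 0, else 0."""
    return 1 if x >= 0 else 0

# The hard limit is flat on both sides of the threshold, so its
# derivative is 0 everywhere it exists -- which is why gradient-based
# learning such as backpropagation cannot use it.
```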

3) Fermi function, or logistic sigmoid function (an S-shaped curve). This transfer function takes an input of any value between minus and plus infinity and squashes the output into the range 0 to 1, according to the expression

    f(x) = 1 / (1 + e^(-x))

The Fermi function can be extended by a temperature parameter T into the form

    f(x) = 1 / (1 + e^(-x/T))

The log-sigmoid transfer function is commonly used in multilayer networks that are trained with the backpropagation algorithm, in part because this function is differentiable.

4) Hyperbolic tangent: f(x) = tanh(x), which maps to (-1, 1); this function is also differentiable.
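Both squashing functions are easy to check numerically; here is a short Python sketch of the Fermi function with its temperature parameter, alongside the hyperbolic tangent:

```python
import math

def fermi(x, T=1.0):
    """Logistic sigmoid with temperature T: 1 / (1 + e^(-x/T)).
    Squashes any real x into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x / T))

def tanh_act(x):
    """Hyperbolic tangent transfer function: maps any real x into (-1, 1)."""
    return math.tanh(x)

# At x = 0 the sigmoid is exactly 0.5 regardless of T; a larger T
# flattens the curve, a smaller T sharpens it toward a hard threshold.
mid = fermi(0.0)          # -> 0.5
steep = fermi(1.0, T=0.1)  # close to 1
```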

MATLAB representation of a neural network

Single neuron model: Single-Input Neuron (figure)

The neuron output is calculated as a = f(wp + b). If, for instance, w = 3, p = 2, and b = -1.5, then a = f(3·2 - 1.5) = f(4.5). The actual output depends on the particular transfer function that is chosen.

Single neuron model: Multiple-Input Neuron

The individual inputs p_1, p_2, p_3, ..., p_R are each weighted by the corresponding elements w_{1,1}, w_{1,2}, w_{1,3}, ..., w_{1,R} of the weight matrix W. The net input n is

    n = w_{1,1} p_1 + w_{1,2} p_2 + ... + w_{1,R} p_R + b

This expression can be written in matrix form as n = Wp + b, and the neuron output as a = f(Wp + b).

Weight indices: the first index indicates the destination neuron; the second index indicates the source of the signal fed to the neuron. Thus w_{1,2} represents the connection to the first neuron from the second source.
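The multiple-input form a = f(Wp + b) can be sketched for a single neuron (one row of W) in plain Python; the function name here is illustrative, not a toolbox call:

```python
def multi_input_neuron(p, w_row, b, f):
    """a = f(Wp + b) for one neuron: w_row is the 1-by-R weight row,
    p the R-element input vector, b the bias, f the transfer function."""
    n = sum(w * pi for w, pi in zip(w_row, p)) + b  # net input
    return f(n)

# The single-input example from the text: w = 3, p = 2, b = -1.5,
# so with a linear transfer function the output is n = 3*2 - 1.5 = 4.5.
a = multi_input_neuron([2.0], [3.0], -1.5, lambda n: n)  # -> 4.5
```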

Example: let p1 = 2.5, p2 = 3, w1 = 0.5, w2 = -0.7, and b = 0.3, and assume the transfer function of the neuron is the hard limit (hardlim).

Solution:

    a = hardlim(n) = hardlim(w1 p1 + w2 p2 + b)
      = hardlim(0.5 · 2.5 + (-0.7) · 3 + 0.3)
      = hardlim(-0.55) = 0

Abbreviated MATLAB notation: Neuron with R Inputs, Abbreviated Notation (figure)
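The worked example above can be reproduced in a few lines of Python (a check of the arithmetic, not toolbox code):

```python
def hardlim(n):
    """Hard-limit transfer function: 1 if n >= 0, else 0."""
    return 1 if n >= 0 else 0

p = [2.5, 3.0]       # inputs p1, p2
w = [0.5, -0.7]      # weights w1, w2
b = 0.3              # bias

# Net input: 0.5*2.5 + (-0.7)*3 + 0.3 = 1.25 - 2.1 + 0.3 = -0.55
n = sum(wi * pi for wi, pi in zip(w, p)) + b
a = hardlim(n)  # -> 0, since the net input is below the threshold
```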