Neural Networks & Learning


1. Introduction

The basic preliminaries involved in Artificial Neural Networks (ANN) are described in section 1. An Artificial Neural Network (ANN) is an information-processing paradigm that is inspired by biological nervous systems, such as the brain. It is composed of a large number of highly interconnected processing elements working in unison to solve specific problems. An ANN is a type of artificial intelligence that attempts to imitate the way a human brain works. We advise the reader to read S. Haykin, Neural Networks: A Comprehensive Foundation, 1999.

Fig. 1.0 A biological neuron

Most of the ANN structures commonly used in applications consider the behavior of a single neuron as the basic computing unit for describing neural information processing operations. Each computing unit, i.e. each artificial neuron in the neural network, is based on the concept of an idealized neuron. A biological neuron, or nerve cell, consists of synapses, dendrites and the axon; the functions of the main elements are:
- Dendrite: receives signals from other neurons.
- Soma: sums all the incoming signals.
- Axon: when a particular amount of input is received, the cell fires; it transmits the signal through the axon to other cells.

For the purposes of this course we will look at neural networks as function approximators. As shown in Figure 1, we have some unknown function that we wish to approximate. We want to adjust the parameters of the network so that it will produce the same response as the unknown function, if the same input is applied to both systems.

Figure 1. Neural Network as Function Approximator

For our applications, the unknown function may correspond to a system we are trying to control, in which case the neural network will be the identified plant model. The unknown function could also represent the inverse of a system we are trying to control, in which case the neural network can be used to implement the controller. At the end of this tutorial we will present several control architectures demonstrating a variety of uses for function approximator neural networks. In the next section we will present the multilayer perceptron neural network and demonstrate how it can be used as a function approximator.

2. Artificial Neural Networks

Artificial neural networks are nonlinear information-processing devices, built from interconnected elementary processing devices called neurons. An ANN is thus an information-processing system in which the elements, called neurons, process the information. The signals are transmitted by means of connection links. Each link possesses an associated weight, which is multiplied with the incoming signal to form the net input of any typical neural net. The output signal is obtained by applying an activation function to the net input. An artificial neuron is characterized by:
1. Architecture (the connections between neurons)
2. Training or learning (determining the weights on the connections)
3. Activation function
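As a minimal illustration of these three ingredients, here is a sketch of a single artificial neuron in Python/NumPy (our own example; all names and values are illustrative, not part of the notes): the incoming signals are multiplied by the link weights, summed into the net input, and passed through a sigmoid activation.

```python
import numpy as np

def sigmoid(net):
    """Activation function applied to the net input."""
    return 1.0 / (1.0 + np.exp(-net))

# Incoming signals, connection-link weights, and bias (arbitrary values).
x = np.array([0.5, -1.0, 2.0])   # input signals
w = np.array([0.1, 0.4, -0.2])   # weights on the connection links
b = 0.3                          # bias

net_input = np.dot(w, x) + b     # weighted sum of the incoming signals
output = sigmoid(net_input)      # output signal of the neuron
print(output)
```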

3. Basic Network Structures
a. Historically, the earliest ANNs are the perceptrons, proposed by the psychologist Frank Rosenblatt.
b. The Adaline (Adaptive Linear Neuron). It is a single neuron, not a network.
c. The Madaline (Many Adalines). This is an ANN formulation based on the Adaline above.
d. The Multi-Layer Perceptron. This is a generalized architecture of the perceptron. This net is used for function approximation problems.
e. The Hopfield Network. This network has an important distinguishing aspect: a recurrent feature, feedback between neurons. This net provides an efficient solution for the Traveling Salesman Problem.
f. The Self-Organizing Map, which is utilized to facilitate unsupervised learning. These nets are applied to many recognition problems.

4. Feedback Neural Networks Architecture

This type of network was described by J.J. Hopfield in 1982. The topology of a Hopfield network is very simple: it has n neurons, which are all connected with each other.

The architecture shown in the previous figure consists of 4 input neurons and 4 output neurons. It should be noted that, apart from receiving a signal from the input, the first output neuron also receives signals from the other output neurons; the same holds for all the other output neurons. Thus, there exists a feedback: the output is returned to each output neuron. That is why the Hopfield network is called a feedback network, as sketched below.
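The notes describe the feedback topology but not the update dynamics. As a minimal sketch, assuming the standard binary Hopfield update $s_i \leftarrow \mathrm{sign}(\sum_j w_{ij} s_j)$ with symmetric Hebbian weights and no self-feedback (our assumption, not given in the notes):

```python
import numpy as np

def hopfield_step(W, s):
    """One synchronous update: each neuron re-fires from the feedback
    it receives from every other neuron's current output."""
    return np.where(W @ s >= 0, 1, -1)

# A tiny 4-neuron network storing one pattern via the Hebbian outer product.
pattern = np.array([1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)          # no self-feedback

s = np.array([1, 1, 1, -1])       # corrupted version of the stored pattern
for _ in range(5):                # iterate the feedback until it settles
    s = hopfield_step(W, s)
print(s)                          # recovers the stored pattern
```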

5. Feedforward Neural Networks Architecture

The MLP and RBF neural network architectures may be viewed as practical vehicles for performing a nonlinear input-output mapping of a general nature.

5.1 MLP

We study multilayer feedforward networks, an important class of neural networks. Typically, the network consists of a set of sensory units that constitute the input layer, one or more hidden layers of computation nodes, and an output layer of computation nodes. The input signal propagates through the network in a forward direction, on a layer-by-layer basis. These neural networks are commonly referred to as Multi-Layer Perceptrons.
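To make the layer-by-layer forward propagation concrete, here is a small sketch (our own illustration, assuming sigmoid activations in every layer and arbitrary random weights): the input signal is transformed by each layer in turn until it reaches the output layer.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mlp_forward(x, layers):
    """Propagate the input through the network layer by layer.
    `layers` is a list of (weights, bias) pairs, one per layer."""
    signal = x
    for W, b in layers:
        signal = sigmoid(W @ signal + b)   # each layer feeds the next
    return signal

rng = np.random.default_rng(0)
# A 3-input network with two hidden layers (5 and 4 units) and 2 outputs.
sizes = [3, 5, 4, 2]
layers = [(rng.normal(size=(m, n)), rng.normal(size=m))
          for n, m in zip(sizes[:-1], sizes[1:])]
print(mlp_forward(np.array([0.2, -0.7, 1.5]), layers))
```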

Figure 1.1 MLPNN architecture: inputs $x_1, x_2, \ldots, x_n$, a hidden layer $z_1, \ldots, z_p$ with weights $v_{ij}$, and outputs $y_1, y_2, \ldots, y_m$ with weights $w_{jk}$

Figure 1.1 shows the architectural graph of an MLP with 2 hidden layers and an output layer. An MLP has three distinctive characteristics:
1. The model of each neuron in the network includes a nonlinear activation function. The important point is that the nonlinearity is smooth; a commonly used form of nonlinearity that satisfies this requirement is the sigmoidal nonlinearity, defined as
$y_j = \frac{1}{1 + e^{-x_j}}$

2. The neural network contains one or more layers of hidden neurons.
3. The network exhibits a high degree of connectivity, determined by the synapses of the network. A change in the connectivity of the network requires a change in the population of synaptic connections or their weights.

Back-propagation algorithm

MLPs have been applied successfully to solve some difficult and diverse problems by training them in a supervised manner with the highly popular algorithm known as the error back-propagation algorithm. It is based on the error-correction learning rule. Basically, the error BP algorithm consists of two passes through the different layers of the network: a forward pass and a backward pass. In the forward pass, an input vector is applied to the input layer of the network, and its effect propagates through the network layer by layer. During the backward pass, on the other hand, the synaptic weights of the network are all adjusted in accordance with the error-correction rule. Specifically, the actual response of the network is subtracted from a desired response to produce an error signal. This error is then propagated backward through the network. The training algorithm of back-propagation can be described as follows:

Initialization of the weights
Step 1: Initialize the weights to small random values.
Step 2: While the stopping condition is false, do steps 3-10.
Step 3: For each training pair, do steps 4-9.

Feedforward pass
Step 4: Each input unit receives the input signal $x_i$ and transmits this signal to all units in the hidden layer.
Step 5: Each hidden unit $z_j$, $j = 1, \ldots, p$, sums its weighted input signals,
$z_{in_j} = v_{0j} + \sum_{i=1}^{n} x_i v_{ij}$,
applies its activation function, $z_j = f(z_{in_j})$, and sends this to all units in the output layer.
Step 6: Each output unit $y_k$, $k = 1, \ldots, m$, sums its weighted input signals,
$y_{in_k} = w_{0k} + \sum_{j=1}^{p} z_j w_{jk}$,
and applies its activation function to calculate the output signal, $y_k = f(y_{in_k})$.

Backward pass
Step 7: Each output unit $y_k$, $k = 1, \ldots, m$, receives a target pattern $t_k$ corresponding to the input pattern; the error information term is calculated as
$\delta_k = (t_k - y_k) \, f'(y_{in_k})$.
Step 8: Each hidden unit $z_j$, $j = 1, \ldots, p$, sums its delta inputs from the units in the layer above,
$\delta_{in_j} = \sum_{k=1}^{m} \delta_k w_{jk}$.
The error information term is calculated as
$\delta_j = \delta_{in_j} \, f'(z_{in_j})$.

Updating weights and biases
Step 9: Each output unit $y_k$, $k = 1, \ldots, m$, updates its bias and weights ($j = 0, \ldots, p$). The weight correction term is given by
$\Delta w_{jk} = \alpha \delta_k z_j$
and the bias correction term is given by
$\Delta w_{0k} = \alpha \delta_k$.
Therefore, $w_{jk}(\mathrm{new}) = w_{jk}(\mathrm{old}) + \Delta w_{jk}$ and $w_{0k}(\mathrm{new}) = w_{0k}(\mathrm{old}) + \Delta w_{0k}$.
Each hidden unit $z_j$, $j = 1, \ldots, p$, updates its bias and weights ($i = 0, \ldots, n$). The weight correction term is
$\Delta v_{ij} = \alpha \delta_j x_i$
and the bias correction term is
$\Delta v_{0j} = \alpha \delta_j$.
Therefore, $v_{ij}(\mathrm{new}) = v_{ij}(\mathrm{old}) + \Delta v_{ij}$ and $v_{0j}(\mathrm{new}) = v_{0j}(\mathrm{old}) + \Delta v_{0j}$.
Step 10: Test the stopping condition; the stopping condition may be the minimization of the error, the number of iterations, etc.
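The steps above translate almost line-for-line into code. Below is a minimal NumPy sketch of Steps 1-10 for a single-hidden-layer network; the XOR training pairs, the layer sizes and the learning rate $\alpha = 0.5$ are our own illustrative choices, not part of the notes.

```python
import numpy as np

def f(x):                      # sigmoid activation
    return 1.0 / (1.0 + np.exp(-x))

def f_prime(net):              # derivative of the sigmoid at the net input
    y = f(net)
    return y * (1.0 - y)

# XOR training pairs (x, t): a stand-in problem for illustration.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([[0], [1], [1], [0]], float)

rng = np.random.default_rng(1)
n, p, m, alpha = 2, 4, 1, 0.5
V, v0 = rng.uniform(-0.5, 0.5, (n, p)), np.zeros(p)   # Step 1: small random weights
W, w0 = rng.uniform(-0.5, 0.5, (p, m)), np.zeros(m)

for epoch in range(20000):                 # Step 2: until the stopping condition
    for x, t in zip(X, T):                 # Step 3: for each training pair
        z_in = v0 + x @ V                  # Step 5: hidden-layer net input
        z = f(z_in)
        y_in = w0 + z @ W                  # Step 6: output-layer net input
        y = f(y_in)
        delta_k = (t - y) * f_prime(y_in)  # Step 7: output error term
        delta_in = W @ delta_k             # Step 8: back-propagated deltas
        delta_j = delta_in * f_prime(z_in)
        W += alpha * np.outer(z, delta_k)  # Step 9: weight and bias corrections
        w0 += alpha * delta_k
        V += alpha * np.outer(x, delta_j)
        v0 += alpha * delta_j

# Should approximate the XOR targets [[0], [1], [1], [0]].
print(np.round(f(w0 + f(v0 + X @ V) @ W)))
```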

5.2 RBF

Radial basis function networks can be used for approximating functions too. They use Gaussian kernel functions. The architecture of a radial basis function network consists of three layers, the input, the hidden and the output layers, as shown in the figure below. The architecture of a radial basis function network is a multilayer feed-forward network. The output of the RBFNN is a linear combination of basis functions,
$y = \sum_{i=1}^{n} w_i \, \varphi_i(x)$,
where $\varphi_i$ is a mapping $\mathbb{R}^+ \to \mathbb{R}$ applied to the distance $\|x - \hat{x}_i\|$, and the norm is the Euclidean distance.

Figure 1.2 RBFNN architecture: an input layer, a hidden layer of radial basis functions, and an output layer with linear weights

The following forms have been considered as radial basis functions:
a. The multiquadric function $\varphi(r) = (r^2 + c^2)^{1/2}$, where $c$ is a positive constant and $r \in \mathbb{R}^+$
b. $\varphi(r) = r$
c. $\varphi(r) = r^2$
d. $\varphi(r) = r^3$
e. $\varphi(r) = e^{-r^2/2}$

Training algorithm for an RBFNN

The training algorithm for the radial basis function network is given below:
Step 1: Initialize the weights (set to small random values).
Step 2: While the stopping condition is false, do steps 3-10.
Step 3: For each input, do steps 4-9.
Step 4: Each input unit $x_i$, $i = 1, \ldots, n$, receives input signals and transmits them to all units in the layer above.
Step 5: Calculate the radial basis function.
Step 6: Choose the centers for the radial basis functions. The centers are chosen from the set of input vectors. A sufficient number of centers has to be selected in order to ensure adequate sampling of the input vector space.
Step 7: Compute the output of the $i$-th unit in the hidden layer,
$v_i(x) = \exp\left[ -\frac{\|x - \hat{x}_i\|^2}{2\sigma^2} \right]$,
where $\hat{x}_i$ is the center of the $i$-th RBF neuron, $\sigma$ is the width of the RBF, and $x$ is the input variable.
Step 8: Initialize the weights in the output layer of the network to some small random values.
Step 9: Calculate the output of the neural network,
$y_{net} = \sum_{i=1}^{H} w_{im} v_i(x) + w_0$,
where $H$ is the number of hidden neurons, $y_{net}$ is the output value of the $m$-th output neuron, $w_{im}$ is the weight between the $i$-th RBF unit and the $m$-th output node, and $w_0$ is the biasing term at the $m$-th output node.
Step 10: Calculate the error and test the stopping condition; the stopping condition may be the weight change, the number of iterations, etc.
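A minimal NumPy sketch of this procedure follows. The sine target, the number of centers $H$ and the width $\sigma$ are our own illustrative choices; and, in place of the iterative output-weight update that Steps 8-10 leave unspecified, the linear output weights are fitted in one shot by least squares, a common shortcut.

```python
import numpy as np

def rbf_design_matrix(X, centers, sigma):
    """Hidden-layer outputs v_i(x) = exp(-||x - x_hat_i||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (200, 1))           # input vectors
t = np.sin(X[:, 0])                        # target function to approximate

H, sigma = 12, 0.8                         # number of centers and RBF width
centers = X[rng.choice(len(X), H, replace=False)]   # Step 6: centers from inputs

Phi = rbf_design_matrix(X, centers, sigma)          # Step 7: hidden-layer outputs
Phi1 = np.hstack([Phi, np.ones((len(X), 1))])       # append the biasing term
w, *_ = np.linalg.lstsq(Phi1, t, rcond=None)        # fit the linear output weights

y_net = Phi1 @ w                                    # Step 9: network output
print("RMS error:", np.sqrt(np.mean((y_net - t) ** 2)))
```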

6. How to design an ANN
- Choose the appropriate ANN needed to solve your problem (MLP, RBF, Hopfield, ...).
- Choose the number of hidden layers needed.
- Choose the number of neurons.
- Choose the training algorithm.
- Validate your results on new examples.

7. Application to Control

Neural networks have been applied very successfully in the identification and control of dynamic systems. The universal approximation capabilities of the multilayer perceptron have made it a popular choice for modeling nonlinear systems and for implementing general-purpose nonlinear controllers. In the last chapter of these lecture notes, we will introduce some of the more popular neural network architectures for system identification and control.