Biologically Inspired Computing: Neural Computation. Lecture 5. Patricia A. Vargas


Lecture 5
I. Lecture 4 Revision
II. Artificial Neural Networks (Part IV)
   I. Recurrent Artificial Networks
      I. GasNet models
   II. Evolving Artificial Neural Networks
      I. MLPs
      II. GasNet models

F21BC2 BIC Neural Computation 2010

Artificial Neural Networks: Recurrent Neural Networks

The Hopfield Neural Network. The network is fully recurrent (except for self-connections). Training is performed in one pass, using the Hopfield equations:

    w_{ij} = \frac{1}{N} \sum_{k=1}^{n} p_i^k p_j^k

where w_{ij} is the weight between nodes i and j, N is the number of nodes in the network, n is the number of patterns to be learnt, and p_i^k is the value required for the i-th node in pattern k.

Execution is performed iteratively:

    s_i = \mathrm{sign}\left( \sum_{j=1}^{N} w_{ij} s_j \right)

Pattern Completion (III): a 64-pixel image of an "H"; the same image with 10 pixels altered (i.e. approximately 16% noise added).

Pattern Completion (IV): Food for thought - flight of fancy? Memories of a deceased dog named Tanya...
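As a minimal sketch (assumed illustrative code, not the lecture's toolbox), the one-pass training rule and the iterative recall above can be written directly in NumPy; the noisy-"H" demo is imitated here with a random 64-pixel +/-1 pattern and 10 flipped pixels, and recall uses a synchronous sign update:

    import numpy as np

    def hopfield_train(patterns):
        """One-pass Hopfield training: w_ij = (1/N) * sum_k p_i^k p_j^k."""
        N = patterns.shape[1]                 # number of nodes
        W = patterns.T @ patterns / N         # Hebbian outer-product sum over patterns
        np.fill_diagonal(W, 0.0)              # no self-connections
        return W

    def hopfield_recall(W, s, steps=10):
        """Iterative execution: s_i = sign(sum_j w_ij * s_j), until stable."""
        for _ in range(steps):
            s_new = np.sign(W @ s)
            s_new[s_new == 0] = 1             # break ties away from zero
            if np.array_equal(s_new, s):      # fixed point reached
                break
            s = s_new
        return s

    # Pattern completion: store one 64-pixel pattern, flip 10 pixels (~16% noise)
    rng = np.random.default_rng(0)
    pattern = rng.choice([-1, 1], size=64)
    W = hopfield_train(pattern[None, :])
    noisy = pattern.copy()
    noisy[rng.choice(64, size=10, replace=False)] *= -1
    print(np.array_equal(hopfield_recall(W, noisy), pattern))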


Gas dispersion NOT centred on the node

Gas concentration around the emitter


Nodes do NOT have a spatial relation; instead, each node has a parameter in [0, 1] called the Mbias, or Modulator Bias.

Artificial Neural Networks: Evolving Artificial Neural Networks

I. MLPs: topology and weights
II. Example: evolving (training) MLPs to learn some functions. <Toolbox> from Cazangi & Moioli (2005), available @ VISION

[MLP diagram: input layer x_0, x_1, x_2, ..., x_m; output layer y_1, y_2, ..., y_o]
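The Cazangi & Moioli toolbox itself is not reproduced in these notes; purely as an illustrative sketch (the XOR task, population size, and mutation scale are arbitrary assumptions, not the toolbox's), evolving the weights of a fixed-topology 2-2-1 MLP with a simple genetic loop could look like this in Python:

    import numpy as np

    rng = np.random.default_rng(1)

    # Fixed 2-2-1 MLP topology; the genotype is the flattened weight vector.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([0.0, 1.0, 1.0, 0.0])        # XOR target (illustrative task)
    N_W = 2 * 2 + 2 + 2 + 1                   # weights + biases of a 2-2-1 net

    def mlp(w, x):
        """Decode genotype w into weights/biases and run a forward pass."""
        W1 = w[:4].reshape(2, 2); b1 = w[4:6]
        W2 = w[6:8];              b2 = w[8]
        h = np.tanh(x @ W1 + b1)              # hidden layer
        return np.tanh(h @ W2 + b2)           # single output

    def fitness(w):
        return -np.mean((mlp(w, X) - T) ** 2)  # higher is better

    pop = rng.normal(0, 1, size=(50, N_W))     # 50 random genotypes
    for gen in range(500):
        fit = np.array([fitness(w) for w in pop])
        parents = pop[np.argsort(fit)[-25:]]   # truncation selection: keep best half
        children = parents + rng.normal(0, 0.2, parents.shape)  # Gaussian mutation
        pop = np.vstack([parents, children])
    print("best MSE:", -max(fitness(w) for w in pop))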

Artificial Neural Networks: Evolving Artificial Neural Networks

I. GasNet models: what is evolved is the topology, plus all network parameters, plus task-dependent parameters:

    <genotype> ::= (<genes>)
    <gene> = <node>

I. Evolving GasNet models: the Original GasNet. Gene loci and the variables they encode in the phenotype space (a code sketch of both encodings follows the NSGasNet table below):

Node variable | Description | [Range] / discrete values | Gene locus | Gene description
Coordinates | Node coordinates on the Euclidean plane (100x100) | [0, 99]; [0, 99] | <x>; <y> | <x value>; <y value>
Electrical connectivity | Defines the parameters of the two segments of circle, centred on the node, that determine the excitatory and inhibitory links | [0, 50]; [0, 2π] | <r_p>; <θ_e>; <θ_p> | <radius>; <angular extent>; <orientation>
Recurrence status | Determines whether the node has an inhibitory, no, or excitatory recurrent connection | {-1, 0, 1} | <rec> | <recurrent status>
Emitting status | Determines the circumstances under which the node will emit gas: {none, electrical, gas} | {0, 1, 2} | <E_s> | <emitting status>
Type of gas | Determines which gas the node will emit | {1, 2} | <G_t> | <gas type>
Rate of build up/decay | Determines the rate of gas build up and decay | [1, 11] | <s> | <build up/decay rate>
Radius of emission | Maximum radius of gas emission | [10%, 60%] of the plane dimension (100x100) | <G_r> | <gas radius>
Transfer function parameter default value K_i^0 | Used in (2) to determine the transfer parameter value K_i^n | [1, 11] | <K_0> | <transfer function default value>
Bias | The b_i term in (1) | [-1.0, 1.0] | <b> | <bias value>
Task parameters | Parameters which depend on the task, e.g. a robot vision sensor's input area ([1]) | ? | <?> | ?

I. Evolving GasNet models: the NSGasNet phenotype space. The spatial loci of the Original model (coordinates, electrical connectivity, radius of emission) are absent; a per-node Modulator Bias is added:

Node variable | Description | [Range] / discrete values | Gene locus | Gene description
Recurrence status | Determines whether the node has an inhibitory, no, or excitatory recurrent connection | {-1, 0, 1} | <rec> | <recurrent status>
Emitting status | Determines the circumstances under which the node will emit gas: {none, electrical, gas} | {0, 1, 2} | <E_s> | <emitting status>
Type of gas | Determines which gas the node will emit | {1, 2} | <G_t> | <gas type>
Rate of build up/decay | Determines the rate of gas build up and decay | [1, 11] | <s> | <build up/decay rate>
Transfer function parameter default value K_i^0 | Used in (2) to determine the transfer parameter value K_i^n | [1, 11] | <K_0> | <transfer function default value>
Bias | The b_i term in (1) | [-1.0, 1.0] | <b> | <bias value>
Task parameters | Parameters which depend on the task, e.g. a robot vision sensor's input area ([1]) | ? | <?> | ?
Modulator Bias (Mbias_n) | Parameter which depends on the number of nodes, i.e. there will be as many Mbias as network nodes | [0, 1] | <Mbias_n> | <Modulator bias for node n>
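To read the two tables side by side, here is a hedged sketch of one gene per node for each model, assuming plain Python dataclasses; the field names and types are illustrative, not taken from the lecture or the GasNet papers:

    from dataclasses import dataclass

    @dataclass
    class OriginalGasNetGene:
        """One node's gene in the Original GasNet (ranges as in the table above)."""
        x: int            # node x coordinate on the 100x100 plane, [0, 99]
        y: int            # node y coordinate, [0, 99]
        r_p: float        # radius of the two connectivity circle segments, [0, 50]
        theta_e: float    # angular extent of the segments, [0, 2*pi]
        theta_p: float    # orientation of the segments
        rec: int          # recurrent connection: -1 inhibitory, 0 none, +1 excitatory
        E_s: int          # emitting status: 0 none, 1 electrical, 2 gas
        G_t: int          # type of gas emitted: 1 or 2
        s: int            # gas build up / decay rate, [1, 11]
        G_r: float        # gas emission radius, 10%-60% of the plane dimension
        K_0: int          # transfer function parameter default value, [1, 11]
        b: float          # bias term b_i, [-1.0, 1.0]

    @dataclass
    class NSGasNetGene:
        """One node's gene in the NSGasNet: spatial fields removed, Mbias added."""
        rec: int          # as in the Original model
        E_s: int
        G_t: int
        s: int
        K_0: int
        b: float
        Mbias: float      # per-node Modulator Bias, [0, 1]

A genotype is then a sequence of such genes, one per node, together with any task-dependent parameters.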

The steady-state genetic algorithm works as follows:

1) Initialise a population of 100 genotypes in a 10x10 grid
2) Evaluate the initial genotype fitnesses
3) Repeat until the termination criteria are met; repeat 100 times for one generation:
   - Select a genotype at random
   - Create a mating pool of that genotype plus its eight nearest grid neighbours
   - Select one parent P proportionally to fitness rank in the mating pool
   - Mutate the offspring O
   - Evaluate the offspring O
   - Place O in the mating pool, replacing a genotype selected proportionally to the inverse of its fitness rank in the mating pool

A sketch of this loop in code follows.
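This is a runnable sketch of the grid-based steady-state loop, assuming a placeholder fitness function and a simple Gaussian mutation in place of the GasNet-specific ones; only the 8% mutation rate and the 10x10 grid scheme come from the slides:

    import numpy as np

    rng = np.random.default_rng(2)
    GRID, GENES = 10, 9                       # 10x10 grid; 9 is a placeholder gene count

    def fitness(g):                           # placeholder, NOT the CPG task's fitness
        return -np.sum(g ** 2)

    def mutate(g, rate=0.08):                 # 8% per-locus mutation rate (from the slides)
        mask = rng.random(g.shape) < rate
        return g + mask * rng.normal(0, 0.5, g.shape)

    def neighbours(i):
        """Cell i plus its eight nearest neighbours on the wrapped 10x10 grid."""
        r, c = divmod(i, GRID)
        return [((r + dr) % GRID) * GRID + (c + dc) % GRID
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)]

    def rank_probs(values, inverse=False):
        """Selection probabilities proportional to (inverse) fitness rank."""
        order = np.argsort(values)            # ascending: worst first
        ranks = np.empty(len(values)); ranks[order] = np.arange(1, len(values) + 1)
        w = (len(values) + 1 - ranks) if inverse else ranks
        return w / w.sum()

    pop = rng.normal(0, 1, size=(GRID * GRID, GENES))
    fit = np.array([fitness(g) for g in pop])
    for generation in range(100):             # stand-in for the termination criteria
        for _ in range(GRID * GRID):          # 100 offspring = one generation
            pool = neighbours(rng.integers(GRID * GRID))   # random cell + 8 neighbours
            parent = pool[rng.choice(9, p=rank_probs(fit[pool]))]
            child = mutate(pop[parent])
            victim = pool[rng.choice(9, p=rank_probs(fit[pool], inverse=True))]
            pop[victim], fit[victim] = child, fitness(child)  # replace in the pool
    print("best fitness:", fit.max())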


Original vs NSGasNet

Evolution of: all node parameters

50 runs, 200 evaluations per generation. Evolutionary regime parameters employed on the CPG task:

Parameter | CPG task
Mutation rate | 8%
Fitness function | F_{CPG} = \frac{C_0/n_0 + C_1/n_1}{2}
Number of runs | 50
Maximum number of generations | 1000
Population size | 100
Genotype size | 42 (Original); 46 (NSGasNet)
Trials | 1
Number of evaluations per trial | 200


Evolution of: number of nodes, all node parameters


40 runs, 10 trials per fitness evaluation


2D simulation / 3D simulation

3D simulation

Lecture 5 References:

Husbands, P., Smith, T., Jakobi, N. and O'Shea, M. (1998). Better living through chemistry: Evolving GasNets for robot control. Connection Science, vol. 10(3-4), pp. 185-210.

Philippides, A., Husbands, P., Smith, T. and O'Shea, M. (2005). Flexible couplings: Diffusing neuromodulators and adaptive robotics. Artificial Life, vol. 11, pp. 139-160.

[Truncated entry] "...principles and individual variability". Journal of Computational Neuroscience, vol. 7, pp. 119-147.

Smith, T. M. S. (2002). The Evolvability of Artificial Neural Networks for Robot Control. PhD thesis, University of Sussex, Brighton, England, UK.

Vargas, P.A., Di Paolo, E.A. and Husbands, P. (2007). "Preliminary Investigations on the Evolvability of a Non-spatial GasNet Model". In Proceedings of the 9th European Conference on Artificial Life (ECAL 2007), Springer-Verlag, Lisbon, Portugal, September 2007, pp. 966-975.

Vargas, P.A., Di Paolo, E.A. and Husbands, P. (2008). "A study of GasNet spatial embedding in a delayed response task". In Proceedings of ALIFE XI (2008), England, UK, pp. 640-647.

Husbands, P., Philippides, A., Vargas, P.A., Buckley, C., Fine, P., Di Paolo, E. and O'Shea, M. (2010). Spatial, Temporal and Modulatory Factors affecting GasNet Evolvability. To appear in Complexity, Wiley Periodicals, Inc. (invited paper).

Lecture 5
I. Lecture 4 Revision
II. Artificial Neural Networks (Part IV)
   I. Recurrent Neural Networks
      I. GasNet models
   II. Evolving ANNs
      I. MLPs
      II. GasNet models

Lecture 6. What's next? Artificial Neural Networks (Part V)

Material: @ http://www.macs.hw.ac.uk/~pav1 (please note that this link only works in the Mozilla Firefox or Safari web browsers) @ VISION