Computational Intelligence

Plan for Today

- Organization (lectures / tutorials)
- Overview CI
- Introduction to ANN: McCulloch-Pitts neuron (MCP), Minsky/Papert perceptron (MPP)

Computational Intelligence, Winter Term 2009/10. Prof. Dr. Günter Rudolph, Lehrstuhl für Algorithm Engineering (LS 11), Fakultät für Informatik, TU Dortmund.

Organizational Issues

Who are you? Either studying Automation and Robotics (Master of Science), module Optimization; or studying Informatik: BA-Modul "Einführung in die Computational Intelligence", or Hauptdiplom-Wahlvorlesung (SPG 6 & 7).

Who am I? Günter Rudolph, Fakultät für Informatik, LS 11. Guenter.Rudolph@tu-dortmund.de (best way to contact me); OH-14, R. 232 (if you want to see me). Office hours: Tuesday, 10:30-11:30 am, and by appointment.

Organizational Issues

Lectures: Wednesday, 10:15-11:45, OH-14, R. E023
Tutorials: Wednesday, 16:15-17:00, OH-14, R. 304 (group 1); Thursday, 16:15-17:00, OH-14, R. 304 (group 2)
Tutor: Nicola Beume, LS 11
Information: http://ls11-www.cs.uni-dortmund.de/people/rudolph/teaching/lectures/ci/ws2009-10/lecture.jsp
Slides: see web. Literature: see web.

Prerequisites

Knowledge about mathematics, programming, and logic is helpful. But what if something is unknown to me? It will be covered in the lecture, or there will be pointers to the literature... and don't hesitate to ask!

Overview "Computational Intelligence"

What is CI? An umbrella term for computational methods inspired by nature:
- backbone: artificial neural networks, evolutionary algorithms, fuzzy systems
- new developments: swarm intelligence, artificial immune systems, growth processes in trees, ...

The term "computational intelligence" was coined by John Bezdek (FL, USA). It was originally intended as a demarcation line, to establish a border between artificial and computational intelligence; nowadays that border is blurring.

Our goals:
1. Know what CI methods are good for!
2. Know when to refrain from CI methods!
3. Know why they work at all!
4. Know how to apply and adjust CI methods to your problem!

Biological Prototype

Neuron:
- information gathering (D)
- information processing (C)
- information propagation (A / S)

Human being: about 10^12 neurons; electricity in the mV range; propagation speed up to about 120 m/s.

[Figure: a neuron with cell body (C) and nucleus, dendrites (D), axon (A), and synapses (S).]

Abstraction: the dendrites gather the signal input, the cell body does the signal processing, and the axon with its synapses delivers the signal output.

Model

1943: Warren McCulloch and Walter Pitts give a description of neurological networks; their model is the McCulloch-Pitts neuron (MCP).

Basic idea:
- a neuron is either active or inactive
- skills result from connecting neurons

McCulloch-Pitts neuron (1943): inputs x_1, ..., x_n with x_i ∈ {0, 1} =: B; the neuron computes a function f: B^n → B. McCulloch and Pitts considered static networks, i.e. the connections are constructed, not learnt.

McCulloch-Pitts Neuron

n binary input signals x_1, ..., x_n, threshold θ > 0: if the sum of the inputs reaches the threshold, then output = 1, else output = 0. Boolean OR (θ = 1) and boolean AND (θ = n) can be realized.

In addition: m binary inhibitory signals y_1, ..., y_m. If at least one y_j = 1, then output = 0; otherwise the sum-threshold rule above applies. NOT can be realized with a single inhibitory input and θ = 0. (A runnable sketch of these rules follows below.)

Analogons:
- neurons: simple MISO processors (with parameters, e.g. threshold)
- synapse: connection between neurons (with parameters: synaptic weight)
- topology: interconnection structure of the net
- propagation: working phase of the ANN, processes input to output
- training / learning: adaptation of the ANN to certain data

Assumption: inputs are also available in inverted form, i.e. the inverted inputs ¬x_i may be fed in.

Theorem: Every logical function F: B^n → B can be simulated with a two-layered McCulloch/Pitts net.

[Figure: example net; decoding neurons with threshold θ = 3 in the first layer feed an OR neuron with θ = 1 in the second layer.]
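A minimal sketch of the MCP neuron in Python (my own illustration, not part of the slides; the helper mcp and the gate definitions are hypothetical but follow the slide's rules: unweighted binary inputs, absolute inhibition, threshold θ):

    def mcp(xs, theta, inhib=()):
        """McCulloch-Pitts neuron: output 0 if any inhibitory input fires,
        otherwise output 1 iff the sum of the binary inputs reaches theta."""
        if any(inhib):                                 # absolute inhibition
            return 0
        return 1 if sum(xs) >= theta else 0

    AND = lambda x1, x2: mcp([x1, x2], theta=2)        # theta = n
    OR  = lambda x1, x2: mcp([x1, x2], theta=1)        # theta = 1
    NOT = lambda x: mcp([], theta=0, inhib=[x])        # inhibitory input, theta = 0

    table = [(0, 0), (0, 1), (1, 0), (1, 1)]
    assert [AND(*t) for t in table] == [0, 0, 0, 1]
    assert [OR(*t)  for t in table] == [0, 1, 1, 1]
    assert [NOT(x) for x in (0, 1)] == [1, 0]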

Proof (by construction):

Every Boolean function F can be transformed into disjunctive normal form, hence 2 layers (AND - OR):
1. Every clause gets a decoding neuron with θ = n, so that its output = 1 only if the clause is satisfied (AND gate).
2. All outputs of the decoding neurons are inputs of a neuron with θ = 1 (OR gate).
q.e.d.

Generalization: inputs with weights. Example: a neuron fires if 0.2 x_1 + 0.4 x_2 + 0.3 x_3 ≥ 0.7. Multiplication with 10 yields 2 x_1 + 4 x_2 + 3 x_3 ≥ 7, so duplicate the inputs (x_1 twice, x_2 four times, x_3 three times) and set θ = 7: an equivalent unweighted net!

Theorem: Weighted and unweighted MCP-nets are equivalent for weights ∈ Q^+.

Proof: Let the weighted neuron test Σ_{i=1}^n (a_i / b_i) x_i ≥ a_0 / b_0 with a_i, b_i ∈ N. Multiplication with Π_{i=0}^n b_i yields an inequality with coefficients in N. Duplicate input x_i such that we get a_i b_0 b_1 ⋯ b_{i-1} b_{i+1} ⋯ b_n copies as inputs. Threshold θ = a_0 b_1 ⋯ b_n. Set all weights to 1. q.e.d.

Conclusion for MCP nets:
+ feed-forward: able to compute any Boolean function
+ recursive: able to simulate a DFA
- very similar to conventional logical circuits
- difficult to construct
- no good learning algorithm available
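The denominator-clearing step of this proof is easy to mechanize. Below is a small sketch (my own illustration; unweighted_mcp is a hypothetical helper, not from the slides) that turns rational weights into input multiplicities for a weight-1 neuron; it uses the least common multiple of the denominators rather than the full product from the proof, which also works but yields more copies:

    from fractions import Fraction
    from itertools import product
    from math import lcm

    def unweighted_mcp(weights, theta):
        """For sum_i w_i x_i >= theta with w_i, theta in Q+, return integer
        multiplicities (copies of each input) and an integer threshold for an
        equivalent neuron whose weights are all 1."""
        ws = [Fraction(w) for w in weights]
        th = Fraction(theta)
        m = lcm(*(f.denominator for f in ws + [th]))   # clear denominators
        return [int(w * m) for w in ws], int(th * m)

    mult, th = unweighted_mcp(["0.2", "0.4", "0.3"], "0.7")
    assert (mult, th) == ([2, 4, 3], 7)                # the slide's example

    # Equivalence check over all binary inputs:
    for x in product((0, 1), repeat=3):
        weighted = sum(Fraction(w) * xi
                       for w, xi in zip(["0.2", "0.4", "0.3"], x)) >= Fraction("0.7")
        unweighted = sum(a * xi for a, xi in zip(mult, x)) >= th
        assert weighted == unweighted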

Perceptron (Rosenblatt 1958)

A complex model, reduced by Minsky and Papert to what is necessary: the Minsky-Papert perceptron (MPP), 1969. Output = 1 if Σ w_i x_i ≥ θ, else output = 0.

What can a single MPP do? [Figure: points in the plane labeled J (yes) and N (no).] Isolating x_2 in w_1 x_1 + w_2 x_2 = θ yields a separating line: a single MPP separates R^2 into 2 classes, so AND, OR, NAND, NOR can be realized. But XOR?

x_1 x_2 | xor | condition
 0   0  |  0  | 0 < θ
 0   1  |  1  | w_2 ≥ θ
 1   0  |  1  | w_1 ≥ θ
 1   1  |  0  | w_1 + w_2 < θ

From w_1, w_2 ≥ θ > 0 follows w_1 + w_2 ≥ 2θ > θ. Contradiction! No single MPP realizes XOR.

1969: Marvin Minsky and Seymour Papert publish the book "Perceptrons", an analysis of the mathematical properties of perceptrons. Disillusioning result: perceptrons fail to solve a number of trivial problems!
- the XOR problem
- the parity problem
- the connectivity problem

"Conclusion": All artificial neurons have this kind of weakness! Research in this field is a scientific dead end! Consequence: research funding for ANN was cut down extremely (for ~15 years).

How to leave the "dead end":
1. Multilayer perceptrons: [Figure: a net of two MPP layers realizes XOR.]
2. Nonlinear separating functions: g(x_1, x_2) = 2 x_1 + 2 x_2 − 4 x_1 x_2 − 1 with θ = 0 realizes XOR, since g(0,0) = −1, g(0,1) = +1, g(1,0) = +1, g(1,1) = −1.
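Both escape routes can be verified mechanically. The sketch below (my own illustration; the weights of the two-layer net are one possible choice, not taken from the slide's figure) builds XOR from MPPs in two layers and also checks the nonlinear function g from the slide:

    def mpp(ws, theta, xs):
        """Minsky-Papert perceptron: 1 iff the weighted sum reaches theta."""
        return 1 if sum(w * x for w, x in zip(ws, xs)) >= theta else 0

    def xor_two_layer(x1, x2):
        h1 = mpp([1, -1], 1, [x1, x2])    # fires only for (1, 0)
        h2 = mpp([-1, 1], 1, [x1, x2])    # fires only for (0, 1)
        return mpp([1, 1], 1, [h1, h2])   # OR of the two half planes

    def xor_nonlinear(x1, x2):
        g = 2*x1 + 2*x2 - 4*x1*x2 - 1     # g from the slide, theta = 0
        return 1 if g >= 0 else 0

    for x1 in (0, 1):
        for x2 in (0, 1):
            assert xor_two_layer(x1, x2) == (x1 ^ x2) == xor_nonlinear(x1, x2)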

How to obtain the weights w_i and the threshold θ?

As yet: by construction. Example: the NAND gate.

x_1 x_2 | NAND | condition
 0   0  |  1   | 0 ≥ θ
 0   1  |  1   | w_2 ≥ θ
 1   0  |  1   | w_1 ≥ θ
 1   1  |  0   | w_1 + w_2 < θ

This requires the solution of a system of linear inequalities (∈ P); e.g. w_1 = w_2 = −2, θ = −3.

Now: by learning / training.

Perceptron Learning. Assumption: test examples with correct I/O behavior are available.

Principle:
(1) choose initial weights in an arbitrary manner
(2) feed in a test pattern
(3) if the output of the perceptron is wrong, then change the weights
(4) goto (2) until the output is correct for all test patterns

Graphically: translation and rotation of the separating line.

Treat the threshold as a weight: extend each input by x_0 = 1 and use the weight vector w = (−θ, w_1, w_2).

Perceptron learning algorithm (P: set of positive examples, N: set of negative examples); a runnable sketch follows below:

1. choose w_0 at random, t = 0
2. choose an arbitrary x ∈ P ∪ N
3. if x ∈ P and w_t · x > 0, then goto 2; if x ∈ N and w_t · x ≤ 0, then goto 2 (I/O correct!)
4. if x ∈ P and w_t · x ≤ 0, then w_{t+1} = w_t + x; t++; goto 2 (w · x was ≤ 0 but should be > 0; the update helps, since (w + x) · x = w · x + x · x > w · x)
5. if x ∈ N and w_t · x > 0, then w_{t+1} = w_t − x; t++; goto 2 (w · x was > 0 but should be ≤ 0; the update helps, since (w − x) · x = w · x − x · x < w · x)
6. stop? If I/O correct for all examples!

Remark: the algorithm converges and is finite; worst case: exponential runtime.

Example: suppose the initial vector of weights is w^(0) = (1, −1, 1).
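A minimal sketch of this algorithm in Python (my own illustration; perceptron_learning is a hypothetical helper, with a max_updates cap added so the sketch always terminates), trained on the NAND example from above with the threshold folded into the weight vector:

    import random

    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def perceptron_learning(P, N, max_updates=10_000):
        """P: positive examples, N: negative examples; every example is
        already extended by x_0 = 1, so w[0] plays the role of -theta."""
        w = [random.uniform(-1, 1) for _ in range(len(P[0]))]
        for _ in range(max_updates):
            wrong = [(x, +1) for x in P if dot(w, x) <= 0] \
                  + [(x, -1) for x in N if dot(w, x) > 0]
            if not wrong:                          # I/O correct for all examples
                return w
            x, sign = random.choice(wrong)
            w = [wi + sign * xi for wi, xi in zip(w, x)]   # w + x or w - x
        raise RuntimeError("no separating weights found within the update cap")

    # NAND with inputs extended by x_0 = 1; positives are the rows with output 1.
    P = [(1, 0, 0), (1, 0, 1), (1, 1, 0)]
    N = [(1, 1, 1)]
    w = perceptron_learning(P, N)                  # w = (-theta, w_1, w_2)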

We know what a single MPP can do. What can be achieved with many MPPs?

- A single MPP separates the plane into two half planes.
- Many MPPs in 2 layers can identify convex sets, i.e. sets X with: for all a, b ∈ X, λ a + (1 − λ) b ∈ X for λ ∈ (0, 1). 1. How? 2 layers! 2. Why convex?
- Many MPPs in 3 layers can identify arbitrary sets.
- Many MPPs in more than 3 layers are not really necessary!

Arbitrary sets:
1. partition the nonconvex set into several convex sets
2. build a two-layered subnet for each convex set
3. feed the outputs of the two-layered subnets into an OR gate (the third layer)
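This layered construction can be made concrete in a few lines of Python (my own illustration; half_plane, convex_region, and the two unit squares are invented examples, not from the slides): layer 1 tests half planes, layer 2 ANDs them into a convex region, layer 3 ORs convex regions into an arbitrary set.

    def mpp(ws, theta, xs):
        return 1 if sum(w * x for w, x in zip(ws, xs)) >= theta else 0

    def half_plane(a, b, c):
        """Layer-1 MPP: fires iff a*x + b*y >= c."""
        return lambda x, y: mpp([a, b], c, [x, y])

    def convex_region(planes):
        """Layer 2: AND (theta = n) over the half-plane neurons."""
        return lambda x, y: mpp([1] * len(planes), len(planes),
                                [h(x, y) for h in planes])

    # Two convex sets: the unit squares [0,1]^2 and [2,3] x [0,1].
    sq1 = convex_region([half_plane(1, 0, 0), half_plane(-1, 0, -1),
                         half_plane(0, 1, 0), half_plane(0, -1, -1)])
    sq2 = convex_region([half_plane(1, 0, 2), half_plane(-1, 0, -3),
                         half_plane(0, 1, 0), half_plane(0, -1, -1)])

    # Layer 3: OR (theta = 1) identifies their nonconvex union.
    union = lambda x, y: mpp([1, 1], 1, [sq1(x, y), sq2(x, y)])

    assert union(0.5, 0.5) == 1 and union(2.5, 0.5) == 1 and union(1.5, 0.5) == 0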