Artificial Neural Networks and Nonparametric Methods
CMPSCI 383, Nov 17, 2011


Today's lecture
- How the brain works (!)
- Artificial neural networks
- Perceptrons
- Multilayer feed-forward networks
- Error backpropagation algorithm
- Nonparametric methods
- Nearest neighbor models
- Nonparametric regression

How the brain works
Remains a great mystery of science! Basic component: the neuron.

The brain and the digital computer
Even though a computer is a million times faster in raw switching speed, the brain ends up being a billion times faster at what it does. Massive parallelism!

Artificial Neural Networks
Also called: neural networks, connectionist systems, neuromorphic systems, parallel distributed processing (PDP) systems, etc. Networks of relatively simple processing units, which are very abstract models of neurons; the network does the computation more than the units.

Neuron-like units
[Figure: unit i feeds unit j through the connection weight w_{i,j}]

Typical activation functions
[Figure: the step (threshold) function and the sigmoid (logistic) function]
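The step (threshold) and sigmoid (logistic) activation functions used in this lecture can be sketched in a few lines of Python (an illustration, not code from the lecture):

```python
import math

def step(x, threshold=0.0):
    # Hard threshold: the unit fires (outputs 1) iff its weighted input
    # reaches the threshold.
    return 1 if x >= threshold else 0

def sigmoid(x):
    # Smooth, differentiable alternative to the step function;
    # its derivative is g'(x) = g(x) * (1 - g(x)), which backprop uses.
    return 1.0 / (1.0 + math.exp(-x))
```

The sigmoid's differentiability is what makes gradient-based learning (backpropagation, later in the lecture) possible.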

A useful trick
a_j = step_t( \sum_{i=1}^{n} w_{i,j} a_i ) = step_0( \sum_{i=0}^{n} w_{i,j} a_i ), where w_{0,j} = -t and a_0 = 1.
So we can always assume a threshold of 0 if we add an extra input that is always equal to 1.
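The trick can be checked directly: absorbing the threshold t as a bias weight w_{0,j} = -t on a constant extra input a_0 = 1 leaves the unit's output unchanged. A minimal sketch (function names are illustrative):

```python
def step(x, threshold=0.0):
    return 1 if x >= threshold else 0

def unit_with_threshold(weights, inputs, t):
    # A unit that fires when its weighted input sum reaches threshold t.
    return step(sum(w * a for w, a in zip(weights, inputs)), threshold=t)

def unit_with_bias(weights, inputs, t):
    # Same unit, but with an extra constant input a_0 = 1 carrying
    # weight w_0 = -t; the threshold is now always 0.
    ext_w = [-t] + list(weights)
    ext_a = [1.0] + list(inputs)
    return step(sum(w * a for w, a in zip(ext_w, ext_a)), threshold=0.0)
```

Both functions agree for every input and every threshold, which is why learning algorithms only ever need to adjust weights, never thresholds.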

NNs and logic
McCulloch and Pitts (1943) showed that whatever you can do with logic networks, you can do with networks of abstract neuron-like units, i.e., units with step-function activations.

Network structures
Feed-forward vs. recurrent networks. Multi-layer feed-forward networks have input nodes, hidden nodes, and output node(s).

Network structures (contd.)
[Figure: example network structures]

Perceptrons
Name given in the 1950s to layered feed-forward networks.
O = step_0( \sum_j W_j I_j ) = step_0( W \cdot I )
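The perceptron output O = step_0(W · I) is a one-liner. Combined with the threshold-as-bias trick, a hand-picked weight vector such as [-1.5, 1, 1] (an illustrative choice, not from the slides) makes it compute logical AND:

```python
def perceptron_output(W, I):
    # O = step_0(W . I): fire iff the weighted input sum is >= 0.
    return 1 if sum(w * i for w, i in zip(W, I)) >= 0 else 0

def perceptron_and(x1, x2):
    # Dummy input 1 carries the bias weight -1.5, so the unit fires
    # only when x1 + x2 >= 1.5, i.e., when both inputs are 1.
    return perceptron_output([-1.5, 1.0, 1.0], [1, x1, x2])
```

OR and NOT can be built the same way, but XOR cannot: it is not linearly separable, which is the limitation the next slide points to.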

What can perceptrons represent?
Only linearly separable functions.

In three dimensions
[Figure: linear separability in three dimensions]

Perceptrons vs. decision trees
- Majority function with 11 inputs
- WillWait from the restaurant example

Multilayer feed-forward nets
[Figure: a multilayer feed-forward network]

Representing complicated functions
[Figure: the function computed by a single logistic unit]

Back-propagation learning
To update weights from hidden units to the output units:
Err_k = kth component of the error vector (y - h_w)
w_{j,k} ← w_{j,k} + α · a_j · Err_k · g′(in_k)
Letting Δ_k = Err_k · g′(in_k), this becomes
w_{j,k} ← w_{j,k} + α · a_j · Δ_k
[Figure: input unit i → weight w_{i,j} → hidden unit j → weight w_{j,k} → output unit k]

Back-prop contd.
To update weights from input units to hidden units:
Δ_j = g′(in_j) · \sum_k w_{j,k} Δ_k
w_{i,j} ← w_{i,j} + α · a_i · Δ_j
[Figure: input unit i → weight w_{i,j} → hidden unit j → weight w_{j,k} → output unit k]
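The update rules on these two slides combine into one training step for a tiny sigmoid network. The sketch below assumes a 2-input, 2-hidden-unit, 1-output architecture (an illustrative choice, not from the slides); the Δ formulas are exactly the ones above:

```python
import math

def g(x):
    # Sigmoid activation.
    return 1.0 / (1.0 + math.exp(-x))

def g_prime(x):
    s = g(x)
    return s * (1.0 - s)

def backprop_step(x, y, w_ih, w_ho, alpha=0.5):
    """One back-propagation update for a 2-input, 2-hidden, 1-output net.
    w_ih[i][j]: weight from input i to hidden unit j; w_ho[j]: hidden j to output."""
    n_hidden = len(w_ho)
    # Forward pass.
    in_h = [sum(x[i] * w_ih[i][j] for i in range(len(x))) for j in range(n_hidden)]
    a_h = [g(v) for v in in_h]
    in_o = sum(a_h[j] * w_ho[j] for j in range(n_hidden))
    a_o = g(in_o)
    # Output layer: Delta_k = Err_k * g'(in_k).
    delta_o = (y - a_o) * g_prime(in_o)
    # Hidden layer: Delta_j = g'(in_j) * sum_k w_{j,k} Delta_k.
    delta_h = [g_prime(in_h[j]) * w_ho[j] * delta_o for j in range(n_hidden)]
    # Weight updates: w <- w + alpha * a * Delta.
    new_w_ho = [w_ho[j] + alpha * a_h[j] * delta_o for j in range(n_hidden)]
    new_w_ih = [[w_ih[i][j] + alpha * x[i] * delta_h[j] for j in range(n_hidden)]
                for i in range(len(x))]
    return new_w_ih, new_w_ho, a_o
```

Repeating the step drives the output toward the target y, which is the gradient-descent picture on the next slide.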

Back-prop as gradient descent
Err = \sum_k (y_k - a_k)², the sum of squared errors.
[Figure: an error surface for a net with linear activation functions; with nonlinear activations the surface is much more complicated, with local minima]

Multilayer net vs. decision tree
[Figure: comparison of a multilayer net and a decision tree]

Comments on network learning
- Expressiveness: given enough hidden units, can represent (almost) any function.
- Computational efficiency: generally slow to train, but fast to use once trained.
- Generalization: good success in a number of real-world problems.
- Sensitivity to noise: very tolerant of noise in the data.

Comments contd.
- Transparency: not good!
- Prior knowledge: not particularly easy to insert prior knowledge, although possible.

Nonparametric Methods
- Parametric model: a learning model that has a set of parameters of fixed size, e.g., linear models, neural networks (of fixed size).
- Nonparametric model: a learning model whose set of parameters is not bounded; the parameter set grows with the number of training examples, e.g., just save the examples in a lookup table.

Nearest Neighbor Models
k-nearest neighbors algorithm:
- Save all the training examples.
- For classification: find the k nearest neighbors of the input and take a vote (make k odd).
- For regression: take the mean or median of the k nearest neighbors, or do a local regression on them.
Questions: How do you measure distance? How do you efficiently find the k nearest neighbors?
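The voting scheme for classification fits in a few lines. This sketch assumes Euclidean distance and a brute-force search over the saved examples (the next slide discusses the distance choice; efficient neighbor search is left open here too):

```python
import math
from collections import Counter

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(examples, query, k=3):
    # examples: list of (point, label) pairs saved verbatim, as the slide
    # suggests. Sort by distance to the query, then vote among the k nearest
    # (use odd k to avoid ties for two-class problems).
    nearest = sorted(examples, key=lambda ex: euclidean(ex[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```

Note that "training" is just storing the examples; all the work happens at query time, which is the hallmark of a nonparametric method.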

Distance Measures
Minkowski distance: L^p(x_j, x_q) = ( \sum_i |x_{j,i} - x_{q,i}|^p )^{1/p}
- p = 1: Manhattan distance
- p = 2: Euclidean distance
- Hamming distance for Boolean attribute values: the number of attributes on which the two examples differ.
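These measures translate directly into code (function names are illustrative):

```python
def minkowski(xj, xq, p):
    # L^p distance: (sum_i |x_{j,i} - x_{q,i}|^p)^(1/p).
    return sum(abs(a - b) ** p for a, b in zip(xj, xq)) ** (1.0 / p)

def hamming(xj, xq):
    # For Boolean attributes: count the attributes on which the examples differ.
    return sum(1 for a, b in zip(xj, xq) if a != b)
```

With p = 1 this is Manhattan distance and with p = 2 it is Euclidean distance, matching the cases above.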

k-nearest neighbor for k = 1 and k = 5
[Figure: decision boundaries for k = 1 and k = 5]

Curse of Dimensionality
In high dimensions, the nearest points tend to be far away.

Nonparametric Regression

Locally Weighted Regression
w* = argmin_w \sum_j K(Distance(x_q, x_j)) · (y_j - w · x_j)²
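For one input dimension plus an intercept, the argmin above has a closed form (the weighted normal equations). The sketch below assumes a Gaussian kernel for K, which is one common choice; the slide does not fix a particular kernel:

```python
import math

def gaussian_kernel(d, width=1.0):
    # Nearby examples get weight near 1, distant ones near 0.
    return math.exp(-(d / width) ** 2)

def locally_weighted_predict(xs, ys, xq, width=1.0):
    """Fit y ~ w0 + w1*x by kernel-weighted least squares centred on the
    query point xq, then return the fitted line's value at xq."""
    k = [gaussian_kernel(abs(x - xq), width) for x in xs]
    # Weighted normal equations for a line (closed form in 1-D).
    sk = sum(k)
    sx = sum(ki * x for ki, x in zip(k, xs))
    sy = sum(ki * y for ki, y in zip(k, ys))
    sxx = sum(ki * x * x for ki, x in zip(k, xs))
    sxy = sum(ki * x * y for ki, x, y in zip(k, xs, ys))
    w1 = (sk * sxy - sx * sy) / (sk * sxx - sx * sx)
    w0 = (sy - w1 * sx) / sk
    return w0 + w1 * xq
```

A new regression is solved for every query point, so, like k-NN, all the work happens at query time and the "parameters" are the stored examples themselves.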

Summary
- How the brain works (!)
- Artificial neural networks
- Perceptrons
- Multilayer feed-forward networks
- Error backpropagation algorithm
- Nonparametric methods
- Nearest neighbor models
- Nonparametric regression

Next Class
Learning Probabilistic Models, Secs. 20.1–20.4