ECLT 5810 Classification: Neural Networks
Reference: Data Mining: Concepts and Techniques by J. Han, M. Kamber, and J. Pei, Morgan Kaufmann


Neural Networks
A neural network is a set of connected input/output units where each connection has a weight associated with it. This approach is also referred to as connectionist learning. It has been applied in various industries, including finance, e.g., stock market trading.

Demonstrating Some Intelligence
Mastering the Game of Go with Deep Neural Networks and Tree Search, Nature 529, Jan 28, 2016.

Application in the Financial Industry
Algorithms Take Control of Wall Street, Felix Salmon and Jon Stokes, Wired Magazine, Dec 27, 2010.

Neural Networks: Advantages and Criticisms
Advantages:
- Prediction accuracy is generally high.
- Robust: works even when training examples contain errors.
- Fast evaluation of the learned target function.
Criticisms:
- Long training time.
- The learned function (a set of weights) is difficult to interpret.
- Domain knowledge is not easy to incorporate.

Multi-Layer Feed-Forward Network
[Figure: a multilayer feed-forward network. The input vector x_i enters at the input nodes, passes through a layer of hidden nodes, and produces the output vector at the output nodes. Each hidden or output node is a neuron (processing unit) with input weights w_i and a bias θ.]

Defining a Network Topology
- Normalize the input values for each attribute.
- Discrete-valued attributes may be encoded so that there is one input unit per domain value.
- A single output unit can represent two classes; if there are more than two classes, one output unit per class is used.
A sketch of these encoding steps appears below.
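As an illustration (not from the slides), here is a minimal Python sketch of the two encoding steps; the attribute names, value ranges, and domain values are hypothetical:

def min_max_normalize(x, lo, hi):
    # Scale a numeric attribute value into [0, 1] for its input unit.
    return (x - lo) / (hi - lo)

def one_hot(value, domain):
    # One input unit per domain value of a discrete attribute.
    return [1.0 if value == v else 0.0 for v in domain]

# Hypothetical sample: income in [0, 100000], colour in {red, green, blue}.
inputs = [min_max_normalize(42000, 0, 100000)] + one_hot("green", ["red", "green", "blue"])
print(inputs)  # [0.42, 0.0, 1.0, 0.0] -> four input units in total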

A Neuron (I)
The n-dimensional input vector is mapped to the output variable by means of a scalar product and a nonlinear function mapping.
[Figure: a single neuron. Inputs O_0, O_1, ..., O_n (the outputs of the previous layer) arrive with weights w_0, w_1, ..., w_n from the weight vector w; the neuron computes the weighted sum I, adds the bias θ, and applies the activation function f to produce the output O.]
I = Σ_i w_i O_i + θ,  O = f(I)

A Neuron (II)
The net input is the weighted sum I = Σ_i w_i O_i + θ, and the output is obtained by applying the squashing function
O = 1 / (1 + e^(-I)),
which maps a large input domain onto the interval [0, 1].
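A minimal sketch of such a neuron in Python (illustrative; the weights, bias, and inputs shown are the values of hidden unit 4 in the example network on the next slide):

import math

def neuron_output(weights, inputs, bias):
    # Net input: weighted sum of the previous layer's outputs plus the bias.
    I = sum(w * o for w, o in zip(weights, inputs)) + bias
    # Squashing (logistic) function maps I onto (0, 1).
    return 1.0 / (1.0 + math.exp(-I))

print(neuron_output([0.2, 0.4, -0.5], [1, 0, 1], -0.4))  # ≈ 0.332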

An Example of a Neural Network
[Figure: a network with input units 1, 2, 3 (fed by x1, x2, x3), hidden units 4 and 5, and output unit 6, which produces the prediction. Assume all the weights and biases have been trained.]

x1  x2  x3 | w14  w15   w24  w25  w34   w35  w46   w56  | θ4    θ5   θ6
1   0   1  | 0.2  -0.3  0.4  0.1  -0.5  0.2  -0.3  -0.2 | -0.4  0.2  0.1

Using the Neural Network for Prediction
Feeding the input (x1, x2, x3) = (1, 0, 1) forward through the trained network gives output O6 ≈ 0.474, which rounds to a prediction of 0. The sketch below reproduces the computation.
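An illustrative Python sketch of the forward pass with the tabulated weights and biases (variable names follow the unit numbering in the figure above):

import math

def sigmoid(I):
    # Squashing function from the neuron model above.
    return 1.0 / (1.0 + math.exp(-I))

x1, x2, x3 = 1, 0, 1
w14, w15, w24, w25 = 0.2, -0.3, 0.4, 0.1
w34, w35, w46, w56 = -0.5, 0.2, -0.3, -0.2
t4, t5, t6 = -0.4, 0.2, 0.1                    # biases θ4, θ5, θ6

o4 = sigmoid(x1*w14 + x2*w24 + x3*w34 + t4)    # hidden unit 4: ≈ 0.332
o5 = sigmoid(x1*w15 + x2*w25 + x3*w35 + t5)    # hidden unit 5: ≈ 0.525
o6 = sigmoid(o4*w46 + o5*w56 + t6)             # output unit 6: ≈ 0.474
print(round(o6))                               # prediction: 0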

Network Training
The ultimate objective of training is to obtain a set of weights that classifies almost all of the tuples in the training data correctly.
Steps:
- Initialize the weights with random values.
- While the terminating condition is not satisfied:
  - For each training sample (input tuple):
    - Compute the output value of each unit (including the output units).
    - // Back-propagate the errors:
    - For each output unit, compute the error.
    - For each hidden unit, compute the error.
    - // Weight and bias updating (case updating):
    - For each output and hidden unit, update the weights and the bias.

Network Training (Backpropagation) (I)
For an output unit j, the error is
Err_j = O_j (1 - O_j) (T_j - O_j),
where T_j is the true output and O_j is the computed output.
For a hidden unit j, the error is back-propagated from the units k of the next layer:
Err_j = O_j (1 - O_j) Σ_k Err_k w_jk

Network Training (Backpropagation) (II)
Each weight and bias is then updated, where l is the learning rate:
Δw_ij = (l) Err_j O_i,   w_ij = w_ij + Δw_ij
Δθ_j = (l) Err_j,        θ_j = θ_j + Δθ_j

An Example of Training a Neural Network
[Figure: the same network as above, with inputs x1 = 1, x2 = 0, x3 = 1 and correct output T = 1.]
Assume these are the initial values for training:

x1  x2  x3 | w14  w15   w24  w25  w34   w35  w46   w56  | θ4    θ5   θ6
1   0   1  | 0.2  -0.3  0.4  0.1  -0.5  0.2  -0.3  -0.2 | -0.4  0.2  0.1

Learning Example
[Worked calculation for one training iteration on this sample: the net input and output of each unit, the error at each unit, and the updated weights and biases; see the sketch below.]

Learning rate l = 0.9
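An illustrative Python sketch of the complete iteration (forward pass, error back-propagation, and case updating) with these initial values and l = 0.9; the rounded values in the comments follow from the formulas above:

import math

def sigmoid(I):
    return 1.0 / (1.0 + math.exp(-I))

l = 0.9                                  # learning rate
x = {1: 1, 2: 0, 3: 1}                   # input tuple (x1, x2, x3)
T = 1                                    # correct output
w = {(1, 4): 0.2, (1, 5): -0.3, (2, 4): 0.4, (2, 5): 0.1,
     (3, 4): -0.5, (3, 5): 0.2, (4, 6): -0.3, (5, 6): -0.2}
theta = {4: -0.4, 5: 0.2, 6: 0.1}

# Forward pass.
o = dict(x)
o[4] = sigmoid(sum(o[i] * w[(i, 4)] for i in (1, 2, 3)) + theta[4])  # ≈ 0.332
o[5] = sigmoid(sum(o[i] * w[(i, 5)] for i in (1, 2, 3)) + theta[5])  # ≈ 0.525
o[6] = sigmoid(o[4] * w[(4, 6)] + o[5] * w[(5, 6)] + theta[6])       # ≈ 0.474

# Back-propagate the errors.
err = {6: o[6] * (1 - o[6]) * (T - o[6])}              # output unit: ≈ 0.1311
err[5] = o[5] * (1 - o[5]) * err[6] * w[(5, 6)]        # hidden unit: ≈ -0.0065
err[4] = o[4] * (1 - o[4]) * err[6] * w[(4, 6)]        # hidden unit: ≈ -0.0087

# Case updating: apply the weight and bias increments immediately.
for (i, j) in w:
    w[(i, j)] += l * err[j] * o[i]
for j in theta:
    theta[j] += l * err[j]

print(round(w[(4, 6)], 3), round(theta[6], 3))         # -0.261  0.218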


Weight Updating
- Case updating: the weights and biases are updated after the presentation of each sample.
- Epoch updating: the weight and bias increments are accumulated in variables, and the weights and biases are updated only after all of the samples in the training set have been presented.
In practice, case updating is more common. The sketch below contrasts the two strategies.
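An illustrative contrast of the two strategies (train_one_sample is an assumed helper that returns the increments Δw and Δθ for one sample, computed with the backpropagation formulas above):

def case_updating(samples, params, train_one_sample):
    # Apply each sample's increments immediately.
    for s in samples:
        for key, delta in train_one_sample(s, params).items():
            params[key] += delta

def epoch_updating(samples, params, train_one_sample):
    # Accumulate the increments and apply them once after the whole epoch.
    acc = {key: 0.0 for key in params}
    for s in samples:
        for key, delta in train_one_sample(s, params).items():
            acc[key] += delta
    for key in params:
        params[key] += acc[key]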

Terminating Condition
Training stops when
- all changes in the weights in the previous epoch were so small as to be below some threshold, or
- the percentage of samples misclassified in the previous epoch is below some threshold, or
- a pre-specified number of epochs has expired.
In practice, several hundred thousand epochs may be required before the weights converge. A loop skeleton with these criteria follows.
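A minimal sketch of the training loop (illustrative; train_epoch is an assumed helper that runs one epoch and returns the largest weight change and the fraction of samples misclassified):

def train(network, samples, train_epoch,
          weight_eps=1e-4, error_eps=0.05, max_epochs=500_000):
    # Stop when weight changes stall, the error rate is low enough,
    # or the epoch budget expires.
    for epoch in range(max_epochs):
        max_delta, misclassified = train_epoch(network, samples)
        if max_delta < weight_eps or misclassified < error_eps:
            break
    return network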