Lecture 4: Networks of McCulloch-Pitts Neurons

The McCulloch and Pitts (M-P) Neuron
A McCulloch-Pitts neuron takes inputs x_1, ..., x_n, weights them by w_1, ..., w_n, and fires when the weighted sum reaches the threshold θ:

  y = sgn( Σ_{k=1}^{n} w_k x_k − θ )

where the hard limiter sgn(s) = 1 if s ≥ 0 and 0 otherwise.
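As a concrete illustration, here is a minimal sketch of the M-P neuron in Python. The function name and the 0/1 hard-limit convention are my own choices, made to match the gate truth tables used later in the lecture:

```python
def mp_neuron(x, w, theta):
    """McCulloch-Pitts neuron: y = sgn(sum_k w_k * x_k - theta),
    with the hard limiter sgn(s) = 1 if s >= 0, else 0."""
    s = sum(wk * xk for wk, xk in zip(w, x)) - theta
    return 1 if s >= 0 else 0
```

For example, mp_neuron((1, 1), (1, 1), 1.5) fires (returns 1), while mp_neuron((1, 0), (1, 1), 1.5) does not.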
Networks of M-P Neurons
One neuron cannot do much on its own, but a network of these neurons can. In a network, neuron i receives input x_ki from neuron k through the i-k synapse with weight w_ki, and produces

  y_i = sgn( Σ_{k=1}^{n} w_ki x_ki − θ_i )

(figure: i-th neuron, j-th neuron, i-j synapse)

Networks of M-P Neurons
We can connect several McCulloch-Pitts neurons together, as follows: an input layer feeding an output layer (figure: input layer, output layer). An arrangement of one input layer of McCulloch-Pitts neurons feeding forward to one output layer of McCulloch-Pitts neurons, as above, is known as a Perceptron.
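A single feed-forward layer of such neurons (the Perceptron arrangement above) can be sketched as follows; the weight matrix W and threshold list thetas are hypothetical names of my own:

```python
def perceptron_layer(x, W, thetas):
    """One input layer feeding one output layer of M-P neurons.
    Output neuron i computes sgn(sum_k W[i][k] * x[k] - thetas[i])."""
    outputs = []
    for w_i, theta_i in zip(W, thetas):
        s = sum(w * xk for w, xk in zip(w_i, x)) - theta_i
        outputs.append(1 if s >= 0 else 0)
    return outputs
```

With W = [[1, 1], [1, 1]] and thetas = [1.5, 0.5], the two output neurons compute AND and OR of the same inputs.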
Implementing Logic Gates with M-P Neurons
Given the properties of the McCulloch-Pitts neuron, we can use it to implement the basic logic gates:

NOT            AND                  OR
 in | out       in1 in2 | out        in1 in2 | out
  0 |  1         0   0  |  0          0   0  |  0
  1 |  0         0   1  |  0          0   1  |  1
                 1   0  |  0          1   0  |  1
                 1   1  |  1          1   1  |  1

What should we do to implement, or realize, a logic gate (NOT/AND/OR) by a neural network?

Implementing Logic Gates with M-P Neurons
What should we do to implement or realize a logic gate, NOT/AND/OR, by a neural network? All we need to do is find the appropriate synapse (connection) weights and neuron thresholds that produce the right output for each set of inputs. Two solutions can be introduced for this problem:
1. Analytical approach
2. Learning algorithms
Find Weights Analytically for NOT
  y = sgn( w x − θ )

From the NOT truth table (in 0 → out 1, in 1 → out 0):
  x = 0:  sgn(−θ) = 1      ⇒  θ ≤ 0
  x = 1:  sgn(w − θ) = 0   ⇒  w < θ
So, for example:  w = −1,  θ = −0.5.

Find Weights Analytically for AND gate
  y = sgn( w1 x1 + w2 x2 − θ )

From the AND truth table:
  sgn(−θ) = 0
  sgn(w2 − θ) = 0
  sgn(w1 − θ) = 0
  sgn(w1 + w2 − θ) = 1
So, for example:  w1 = w2 = 1,  θ = 1.5.
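The analytical solutions above can be verified exhaustively against the truth tables. The helper name realizes is mine, and the weight values shown are one valid choice, not the only one:

```python
def step(s):
    # hard limiter used by the M-P neuron: 1 if s >= 0, else 0
    return 1 if s >= 0 else 0

def realizes(table, w, theta):
    """True if step(w . x - theta) reproduces every row of the truth table."""
    return all(
        step(sum(wi * xi for wi, xi in zip(w, x)) - theta) == out
        for x, out in table
    )

NOT_TABLE = [((0,), 1), ((1,), 0)]
AND_TABLE = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

print(realizes(NOT_TABLE, (-1,), -0.5))   # NOT with w = -1, theta = -0.5
print(realizes(AND_TABLE, (1, 1), 1.5))   # AND with w1 = w2 = 1, theta = 1.5
```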
Find Weights Analytically for XOR gate
  y = sgn( w1 x1 + w2 x2 − θ )

From the XOR truth table ((0,0) → 0, (0,1) → 1, (1,0) → 1, (1,1) → 0):
  sgn(−θ) = 0            ⇒  θ > 0
  sgn(w2 − θ) = 1        ⇒  w2 ≥ θ
  sgn(w1 − θ) = 1        ⇒  w1 ≥ θ
  sgn(w1 + w2 − θ) = 0   ⇒  w1 + w2 < θ
But the last equation is not compatible with the others: w1 ≥ θ > 0 and w2 ≥ θ > 0 force w1 + w2 ≥ 2θ > θ. No single M-P neuron can realize XOR.

Find Weights Analytically for XOR gate
What is the solution? New questions:
- How can we compute the weights and thresholds?
- Is the analytical solution reasonable and practical or not?
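The incompatibility can also be checked numerically: a brute-force search over a grid of weights and thresholds finds no single neuron that fits XOR, while a two-layer network does realize it. The OR/NAND/AND construction below is one standard solution, not taken from the slides:

```python
import itertools

def step(s):
    return 1 if s >= 0 else 0

XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

# Search a coarse grid: no single M-P neuron satisfies all four XOR rows.
grid = [i / 2 for i in range(-8, 9)]          # -4.0, -3.5, ..., 4.0
single = any(
    all(step(w1 * a + w2 * b - th) == y for (a, b), y in XOR.items())
    for w1, w2, th in itertools.product(grid, repeat=3)
)

# Two layers suffice: XOR(x1, x2) = AND(OR(x1, x2), NAND(x1, x2)).
def xor_net(x1, x2):
    h_or = step(x1 + x2 - 0.5)                # OR neuron
    h_nand = step(-x1 - x2 + 1.5)             # NAND neuron
    return step(h_or + h_nand - 1.5)          # AND neuron on the hidden layer

two_layer = all(xor_net(a, b) == y for (a, b), y in XOR.items())
print(single, two_layer)
```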
A New Idea: Learning Algorithm
Linearly separable problems: for NOT and AND, a single line separates the inputs with output 0 from the inputs with output 1 (e.g., NOT with w = −1, θ = −0.5; AND with the line x1 + x2 = 1.5). (figures: NOT, AND)

A New Idea: Learning Algorithm
Why is a single-layer neural network capable of solving linearly separable problems? Because its decision rule

  y = sign( w1 x1 + w2 x2 − θ )

splits the input space in two:
  Σ_i w_i x_i − θ ≥ 0  ⇒  y = 1
  Σ_i w_i x_i − θ < 0  ⇒  y = 0
i.e., the decision boundary Σ_i w_i x_i = θ is a straight line (a hyperplane in higher dimensions).
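A linearly separable problem is exactly one where some line w·x = θ puts the two classes on opposite sides; for AND, the line x1 + x2 = 1.5 does it. The helper name side is mine:

```python
def side(x, w, theta):
    """+1 if the point lies on or above the line w . x = theta, else -1."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) - theta >= 0 else -1

AND_POINTS = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}

# The class-1 point (1,1) lands on the +1 side, the class-0 points on the -1 side.
separated = all(
    side(p, (1, 1), 1.5) == (1 if label == 1 else -1)
    for p, label in AND_POINTS.items()
)
print(separated)
```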
Learning Algorithm
What is the goal of a learning algorithm? We need a learning algorithm which updates the weights w_i(t) so that finally (at the end of the learning process) the input patterns lie on the correct sides of the line decided by the Perceptron. (figures: Step 1, Step 2, Step 3 — the boundary moves at each step)

Learning Algorithm
Perceptron Learning Rule:

  w(t+1) = w(t) + η [ d(t) − sign( w^T(t) x(t) ) ] x(t)

Desired output:
  d(t) = +1  if x(t) is in class 1
  d(t) = −1  if x(t) is in class 2
η > 0: learning rate.
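One application of the learning rule can be sketched as follows. Here d and the prediction take values ±1, matching the class coding above, and eta stands for the learning rate η:

```python
def perceptron_update(w, x, d, eta=0.1):
    """w(t+1) = w(t) + eta * [d(t) - sign(w^T x(t))] * x(t),
    with sign(s) = +1 if s >= 0 else -1, and d in {+1, -1}."""
    y = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1
    return [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]
```

If the prediction already matches d, the bracket is zero and w is unchanged; on a mistake, w moves by 2η in the direction of ±x.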
Preparing the Perceptron for Learning
Absorb the bias into the weight vector by augmenting each input with a constant 1:

  x(t) = [ 1, x1(t), x2(t), ..., xn(t) ]^T
  w(t) = [ b(t), w1(t), w2(t), ..., wn(t) ]^T

b(t): bias;  y(t): actual response of the N.N.

Preparing the Perceptron for Learning
Training data:
  { x(1), d(1) }, { x(2), d(2) }, ..., { x(p), d(p) }
and the learning rule:
  w(t+1) = w(t) + η [ d(t) − sign( w^T(t) x(t) ) ] x(t)
Learning Algorithm
1. Initialization: set w(0) = rand. Then perform the following computation for time steps t = 1, 2, ...
2. Activation: at time step t, activate the Perceptron by applying the input vector x(t) and desired response d(t).
3. Computation of the actual response of the N.N.: compute the actual response of the Perceptron, y(t) = sign( w^T(t) x(t) ).
4. Adaptation of the weight vector: update the weight vector of the Perceptron, w(t+1) = w(t) + η [ d(t) − y(t) ] x(t).
5. Continuation: return to step 2.

Dr. B. Moaveni

Learning Algorithm
Where or when to stop? There are two approaches to stopping the learning process:
1. The generalized error converges to a zero constant value.
2. Repeat the learning process for a predefined number of iterations.
For training data { x(1), d(1) }, ..., { x(p), d(p) }, the generalized error is

  G.E. = Σ_{t=1}^{p} [ d(t) − sign( w^T x(t) ) ]^2
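Putting steps 1-5 together with the zero-error stopping test gives a complete training loop. The code below is my own sketch, with the bias absorbed as w[0] and ±1 class coding; it learns the AND function:

```python
import random

def train_perceptron(data, eta=0.5, max_epochs=100, seed=0):
    """Steps 1-5: random init, then per-pattern activation, response,
    and weight adaptation; stop when an epoch produces zero errors."""
    rng = random.Random(seed)
    n_inputs = len(data[0][0])
    w = [rng.uniform(-0.5, 0.5) for _ in range(n_inputs + 1)]  # w[0] = bias b
    for _ in range(max_epochs):
        errors = 0
        for x, d in data:
            xa = [1.0] + list(x)                               # augmented input
            y = 1 if sum(wi * xi for wi, xi in zip(w, xa)) >= 0 else -1
            if y != d:
                errors += 1
                w = [wi + eta * (d - y) * xi for wi, xi in zip(w, xa)]
        if errors == 0:                                        # generalized error is zero
            break
    return w

AND_DATA = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w = train_perceptron(AND_DATA)
```

AND is linearly separable, so the perceptron convergence theorem guarantees the loop terminates with zero classification errors.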
Training Types
Two types of network training:
- Sequential mode (on-line, stochastic, or per-pattern): weights are updated after each pattern is presented (the Perceptron is in this class).
- Batch mode (off-line or per-epoch): weights are updated after all patterns in an epoch have been presented.

1st Mini Project
1. Using the Perceptron learning rule, generate a N.N. to represent a NOT gate.
2. Using the Perceptron learning rule, generate a N.N. to represent an AND gate (sequential and batch mode).
3. Using the Perceptron learning rule, generate a N.N. to represent an XOR gate (sequential and batch mode).
4. Show that the generalized error converges to a constant value during the learning process.
5. Test the above N.N.s with testing data.
6. Check the above N.N.s with data to which noise has been added.
7. Repeat the learning process for the above N.N.s both with and without bias (parts 1 & 2).
8. Plot the updated weights.
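The batch (per-epoch) variant accumulates the corrections over the whole pattern set and applies them once. A sketch under the same conventions as before (bias as w[0], ±1 labels):

```python
def batch_epoch(w, data, eta=0.1):
    """One per-epoch update: sum the perceptron corrections over all
    patterns with the weights frozen, then apply the total once."""
    delta = [0.0] * len(w)
    for x, d in data:
        xa = [1.0] + list(x)
        y = 1 if sum(wi * xi for wi, xi in zip(w, xa)) >= 0 else -1
        delta = [dw + eta * (d - y) * xi for dw, xi in zip(delta, xa)]
    return [wi + dw for wi, dw in zip(w, delta)]
```

If every pattern is already classified correctly, delta stays zero and the weights are unchanged, so the zero-error stopping test works identically in both modes.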