Design and Implementation of Logic Gates and Adder Circuits on FPGA Using ANN

Neelu Farha, Ann Louisa Paul J, Naadiya Kousar L S, Devika S, Prof. Ruckmani Divakaran
Department of Electronics and Communication Engineering, Dr. T. Thimmaiah Institute of Technology (Approved by AICTE, New Delhi & Affiliated to VTU, Belgavi), Oorgaum, Kolar Dist.-563120, Karnataka

Abstract- In this paper, the design and hardware implementation of multiple neurons on a Field Programmable Gate Array (FPGA) is carried out sequentially: first the neurons are implemented, then logic gates and adder circuits are built from them using a feed-forward neural network. The neuron hardware is reduced by designing the activation function inside the neuron rather than using lookup tables. With low-precision neural network designs, FPGAs offer higher speed and smaller size for real-time applications compared with VLSI.

Keywords- Artificial Neural Network, Activation functions, Feed Forward Neural Network, Verilog.

I. INTRODUCTION

As the title suggests, this project deals with the hardware implementation of Artificial Neural Networks, using activation functions, applied to logic gates, a half adder and a full adder. Technologically, humans are trying to emulate the behaviour of the biological neuron; an artificial neuron can be modelled to develop future applications in computational neuroscience as well as artificial intelligence. ANNs simplify the behaviour of the human brain, so they find applications in fields such as medicine, security, robotics, electronics and military systems. ANNs are commonly implemented in software, where designers need not know the inner structure of the network elements and can concentrate on the application itself. However, software-based ANNs execute more slowly in real-time applications than hardware-based ANNs, and FPGA-based architectures offer highly flexible designs.
The digital system architecture realizes a feed-forward multilayer neural network, described in a very-high-speed integrated circuit hardware description language. The neuron model has three important layers: the input layer, the hidden layer and the output layer. ANNs are biologically inspired and require parallel computation, so microprocessors and DSPs are not suitable for such parallel designs [1].

II. NEURON ACTIVATION FUNCTION

One of the most important parts of a neuron is its activation function, which defines the output of a node for a given input or set of inputs. The non-linearity of the activation function is what makes it possible to approximate arbitrary functions. The activation functions used in this work are the symmetrical hard limit, the saturating linear and the sigmoid activation function [2].

A. Symmetrical Hard Limit Activation Function

This function is referred to as 'hardlims' and is used to classify inputs into two distinct categories. Hard limiting means clipping: a limiting action in which, over the permitted dynamic range, there is negligible variation in the expected characteristics of the output signal, with a steady-state signal at the maximum permitted level beyond it. The hard limit activation function outputs +1 when its input is greater than or equal to 0 and -1 otherwise.
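As a behavioural reference for the hardware block, the hard limit transfer function can be sketched in Python (a software model only; the paper's implementation is in Verilog):

```python
def hardlims(x: float) -> int:
    """Symmetrical hard limit ('hardlims'): clip the input into one of
    two classes. Returns +1 for inputs >= 0 and -1 otherwise."""
    return 1 if x >= 0 else -1
```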

Fig 1. Hard Limit Function

B. Saturating Linear Activation Function

'Satlin' is a neural transfer function; a transfer function calculates a layer's output from its net input. The function is linear between its saturation limits and clipped outside them, as shown below.

Fig 2. Satlin Function

C. Sigmoid Activation Function

The sigmoid is not easy to implement in hardware on an FPGA because it involves an infinite exponential series. The sigmoid function is calculated as

y = 1 / (1 + e^-(x-t))

Fig 3. Sigmoid Function

III. COMPARISON OF A BIOLOGICAL NEURON WITH AN ARTIFICIAL NEURON

A human brain consists of a large number of neurons. A typical neuron collects signals through hair-like structures called dendrites and sends out spikes of electrical activity along a long, thin strand known as the axon [3]. The comparison between the biological and the artificial neuron is shown in Fig. 4: the neuron corresponds to the processing element, the dendrites to the combining function, the cell body to the transfer function, the synapses to the weights and the axon to the element output.
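Behavioural Python sketches of these two transfer functions follow (the names mirror the common MATLAB transfer-function conventions; the sigmoid is shown with threshold t = 0 as a simplifying assumption):

```python
import math

def satlin(x: float) -> float:
    """Saturating linear transfer function: linear between 0 and 1,
    clipped (saturated) outside that range."""
    return min(max(x, 0.0), 1.0)

def sigmoid(x: float) -> float:
    """Logistic sigmoid, y = 1 / (1 + e^-x). The exponential is what
    makes an exact FPGA implementation difficult."""
    return 1.0 / (1.0 + math.exp(-x))
```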

Fig 4. Comparison of a Biological Neuron and an Artificial Neuron

IV. ARTIFICIAL NEURAL NETWORK WITH FEED FORWARD SYSTEM

An Artificial Neural Network is an abstract description of the human brain: a highly interconnected mathematical model that can be programmed in software and modelled in hardware, and that provides a very high degree of parallel computation.

Fig 5. Functional Model of an Artificial Neuron

A. Working of a Neuron

The working model of the neuron is shown in Fig. 5. It takes n-bit inputs and weights, which are multiplied with a shift-and-add multiplier. There are two sub-inputs for each neuron, and the result is passed to an activation function [4]; the activation function fires the neuron when its action potential crosses the threshold, and the activation block generates a 1-bit output signal to fire the neuron. The net input is

u = Σ_{i=1}^{N} x_i · w_i + b

B. Feed Forward Neural Network

Feed-forward networks are arranged in distinct layers containing only forward paths, as shown in Fig. 6. Each layer receives its inputs from the previous layer and sends its outputs to the next layer, so there is no feedback: signals from one layer are never transmitted back to a previous layer.

Fig 6. Structure of a Feed Forward Neural Network

V. DESIGN OF LOGIC GATES USING ARTIFICIAL NEURAL NETWORK

A. AND Gate Using the Hard Limit Activation Function

Fig 7. Neuron Model for the AND Gate with the Hard Limit Activation Function

With the hard limit activation function, a high output (+1) results only if the net output of the neural network is greater than 0; a low output (-1) results only if the net output is less than 0.

B. OR Gate Using the Saturating Linear Activation Function

The OR gate is a digital logic gate that implements logical disjunction: a HIGH output (1) results if one or both of the inputs are high; if neither input is high, a LOW output (0) results. In the neural network, an output greater than 0 is taken as +1 and an output less than 0 as -1; this decision is made by the saturating linear activation function.

C. XOR Gate Using the Sigmoid Activation Function

Fig 8. Neuron Model for the OR Gate using the Saturating Linear Activation Function
Fig 9. Neuron Model for the XOR Gate using the Sigmoid Activation Function
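To make the neuron computation concrete, the following Python sketch evaluates the net input u = Σ x_i·w_i + b followed by the hard limit activation, realising an AND gate as in Fig. 7. The weights and bias (w = [1, 1], b = -1.5, with 0/1 input encoding) are illustrative assumptions, since the paper does not list its trained values:

```python
def neuron(inputs, weights, bias):
    """Net input u = sum(x_i * w_i) + b, passed through the symmetrical
    hard limit activation (+1 if u >= 0, else -1)."""
    u = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if u >= 0 else -1

# AND gate: with these illustrative weights, only the input pair (1, 1)
# drives the net input above zero; every other pair yields -1 (logic low).
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", neuron([a, b], weights=[1, 1], bias=-1.5))
```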

VI. DESIGN OF HALF ADDER AND FULL ADDER USING ANN

A. Half Adder Using a Neural Network

Fig 10. Neuron Model for the Half Adder

B. Full Adder Using a Neural Network

Fig 11. Neuron Model for the Full Adder
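As a behavioural cross-check of the half adder of Fig. 10, the sketch below builds it entirely from hard-limit neurons in Python. The weights and biases are illustrative assumptions (the paper does not list its values); note that the sum output is an XOR, which needs a small two-layer network because XOR is not linearly separable:

```python
def neuron(inputs, weights, bias):
    """Hard-limit neuron: +1 if the weighted sum plus bias is >= 0, else -1."""
    u = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if u >= 0 else -1

def half_adder(a, b):
    """Half adder from hard-limit neurons (0/1 inputs and outputs):
    carry = AND(a, b); sum = XOR(a, b) via two hidden units."""
    carry = neuron([a, b], [1, 1], -1.5)   # fires only for (1, 1)
    h1 = neuron([a, b], [1, -1], -0.5)     # fires for a AND NOT b
    h2 = neuron([a, b], [-1, 1], -0.5)     # fires for b AND NOT a
    s = neuron([h1, h2], [1, 1], 1)        # fires if either hidden unit fired
    return (s + 1) // 2, (carry + 1) // 2  # map -1/+1 back to logic 0/1
```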

VII. RESULTS

Fig 12. Sigmoid MATLAB Curve
Fig 13. Simulation of the Half Adder using the Activation Function
Fig 14. Simulation of the Full Adder using the Activation Function

Fig 15. ANN Hardware Implementation on the Spartan-3 FPGA

VIII. CONCLUSION

Logic gates, a half adder and a full adder using activation functions have been successfully implemented as an Artificial Neural Network. The code was written in Verilog and successfully simulated and synthesized with Xilinx ISE 9.1 targeting a Spartan-3 FPGA (device XC3S400, package TQ144). ANNs provide a high degree of parallel computation, and the FPGA offers both flexible designs and cost savings. We therefore conclude that the Artificial Neural Network is well suited to implementing combinational circuits on an FPGA.

REFERENCES
[1] Rolf F. Molz, Paulo M. Engel, Fernando G. Moraes, "Codesign to Fully Parallel Neural Network for a Classification Problem", University Montpellier II, France, 2000.
[2] Esraa Zeki Mohammed and Haitham Kareem Ali, "Hardware Implementation of Artificial Neural Network Using Field Programmable Gate Array", International Journal of Computer Theory and Engineering, Vol. 5, No. 5, October 2013; Haitham Kareem Ali and Esraa Zeki Mohammed, "Design Artificial Neural Network Using FPGA", IJCSNS, Vol. 10, No. 8, August 2010.
[3] Hardik H. Makwana, Dharmesh J. Shah, Priyesh P. Gandhi, "FPGA Implementation of Artificial Neural Network", International Journal of Emerging Technology and Advanced Engineering, Vol. 3, Issue 1, January 2013.
[4] A. Muthuramalingam, S. Himavathi, and E. Srinivasan, "Neural Network Implementation Using FPGA: Issues and Application", International Journal of Information Technology, vol. 4, no. 2, pp. 86-92, 2008.