Cellular Automata Evolution for Pattern Recognition
1 Cellular Automata Evolution for Pattern Recognition. Pradipta Maji, Center for Soft Computing Research, Indian Statistical Institute, Kolkata, INDIA. Under the supervision of Prof. P. Pal Chaudhuri (Professor Emeritus, Dept. of Comp. Sci. & Tech., Bengal Engineering College (DU), Shibpur, INDIA) and Prof. Debesh K. Das (Professor, Dept. of Comp. Sci. & Engg., Jadavpur University, INDIA).
2 Introduction. Cellular Automata (CA) are a promising research area in Artificial Intelligence (AI) and Artificial Life (ALife), with considerable research as a modeling tool in image processing, language recognition, pattern recognition, and VLSI testing. A CA learns associations from a set of examples and applies this knowledge-base to handle unseen cases; such associations are effective for classifying patterns.
3 Contribution of the Thesis. Analysis and synthesis of linear boolean CA (MACA): CA with only XOR logic; application of MACA in pattern recognition, data mining, image compression, and fault diagnosis of electronic circuits. Analysis and synthesis of non-linear boolean CA (GMACA): CA with all possible logic; application of non-linear CA in pattern recognition. Analysis and synthesis of fuzzy CA (FMACA): CA with fuzzy logic; application of fuzzy CA in pattern recognition.
4 Cellular Automata (CA). A special type of computing model: 1950s, J. von Neumann; 1980s, S. Wolfram. A CA displays three basic characteristics. Simplicity: the basic unit of a CA, the cell, is simple. Vast parallelism: a CA achieves parallelism on a scale larger than massively parallel computers. Locality: a CA is characterized by the local connectivity of its cells; all interactions take place on a purely local basis; a cell can only communicate with its neighboring cells; interconnection links usually carry only a small amount of information; no cell has a global view of the entire system.
5 Cellular Automata (CA). A computational model with discrete cells updated synchronously. Variants: uniform CA, hybrid/non-uniform CA, null boundary CA, periodic boundary CA. In a 3-neighborhood CA, each cell can have one of 256 different rules. A CA cell: combinational logic fed by the left neighbor, the cell's own state, and the right neighbor, clocked to produce the 0/1 output.
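The 256-rule encoding mentioned above can be sketched in a few lines of Python (illustrative only, not part of the original slides): the 8 bits of the rule number give the next state for each of the 8 possible (left, self, right) configurations.

```python
# Sketch (assumed, not from the slides): a 3-neighborhood, null-boundary
# 1-D CA. Bit k of the rule number is the next state for the
# neighborhood whose (left, self, right) bits spell k in binary.

def apply_rule(rule, state):
    """One synchronous update of every cell."""
    n = len(state)
    nxt = []
    for i in range(n):
        left = state[i - 1] if i > 0 else 0       # null boundary
        right = state[i + 1] if i < n - 1 else 0
        idx = (left << 2) | (state[i] << 1) | right
        nxt.append((rule >> idx) & 1)
    return nxt

# Rule 90 is XOR of the two neighbours; rule 204 is the identity.
assert apply_rule(90, [0, 1, 0, 1]) == [1, 0, 0, 0]
assert apply_rule(204, [0, 1, 1, 0]) == [0, 1, 1, 0]
```

A hybrid (non-uniform) CA would simply carry one rule number per cell instead of a single global rule.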
6 CA State Transition. [Figure: present-state/next-state (PS/NS) transition table, with Rule 230 applied for the 2nd cell.]
7 Different Types of CA. Linear CA: based on XOR logic; 7 rules in total (60, 90, 102, 150, 170, 204, 240); can be expressed through a matrix T and its characteristic polynomial; next state of the CA: P(t+1) = T · P(t). Additive CA: based on XOR and XNOR logic; 14 rules in total (the linear rules plus 195, 165, 153, 105, 85, 51, 15); can be expressed through a matrix, an inversion vector F, and the characteristic polynomial; next state: P(t+1) = T · P(t) + F. [The example T matrix and F vector did not survive transcription.]
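The matrix form of the next-state equations above can be sketched as follows (an illustrative example, not the slides' own T and F, which were lost in transcription): over GF(2), a matrix-vector product plus an optional inversion vector gives one CA step.

```python
# Sketch: P(t+1) = T · P(t) (mod 2) for a linear CA, with the additive
# case adding the inversion vector F. The 4-cell T below (assumed for
# illustration) is rule 90 on every cell with null boundaries.

def next_state(T, P, F=None):
    n = len(P)
    nxt = [sum(T[i][j] * P[j] for j in range(n)) % 2 for i in range(n)]
    if F is not None:                  # additive CA: XOR in the F vector
        nxt = [(b + f) % 2 for b, f in zip(nxt, F)]
    return nxt

T = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]                     # rule vector <90, 90, 90, 90>
assert next_state(T, [0, 1, 0, 1]) == [1, 0, 0, 0]
```

With F the all-ones vector, the same T realizes the complementary XNOR rule vector.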
8 Additive Cellular Automata. XOR logic: Rule 60: q_i(t+1) = q_{i-1}(t) ⊕ q_i(t). Rule 90: q_i(t+1) = q_{i-1}(t) ⊕ q_{i+1}(t). Rule 102: q_i(t+1) = q_i(t) ⊕ q_{i+1}(t). Rule 150: q_i(t+1) = q_{i-1}(t) ⊕ q_i(t) ⊕ q_{i+1}(t). Rule 170: q_i(t+1) = q_{i+1}(t). Rule 204: q_i(t+1) = q_i(t). Rule 240: q_i(t+1) = q_{i-1}(t). XNOR logic (complements of the above): Rule 195: q_i(t+1) = ¬(q_{i-1}(t) ⊕ q_i(t)). Rule 165: q_i(t+1) = ¬(q_{i-1}(t) ⊕ q_{i+1}(t)). Rule 153: q_i(t+1) = ¬(q_i(t) ⊕ q_{i+1}(t)). Rule 105: q_i(t+1) = ¬(q_{i-1}(t) ⊕ q_i(t) ⊕ q_{i+1}(t)). Rule 85: q_i(t+1) = ¬q_{i+1}(t). Rule 51: q_i(t+1) = ¬q_i(t). Rule 15: q_i(t+1) = ¬q_{i-1}(t).
9 CA - State Transition Diagram. [Figure: state transition diagrams.] Classification: group CA vs. non-group CA; linear, non-linear, and fuzzy. A non-group CA can act as an associative memory: MACA, GMACA, and FMACA perform the pattern recognition task.
10 Pattern Recognition. Pattern recognition/classification is a most important foundation stone of knowledge extraction methodology; it demands automatic identification of patterns of interest (objects, images) against their background (shapes, forms, outlines, etc.). Conventional approach: the machine compares a given input pattern with each of the k stored patterns and identifies the closest match; time to recognize the closest match is O(k), so recognition is slow. Associative memory approach: the entire state space is divided around some pivotal points, with transient states close to a pivot associated with that pivot; realized here by 1. MACA (linear), 2. GMACA (non-linear), 3. FMACA (fuzzy). Time to recognize a pattern is independent of the number of stored patterns.
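The associative-memory idea on this slide can be sketched as follows. This is an illustration only: the nearest-Hamming-distance lookup stands in for the basin assignment that the thesis realizes through CA dynamics in constant time.

```python
# Sketch of associative recall: each stored pattern is a "pivot", and a
# noisy input is mapped to the pivot whose basin it falls in. Here the
# basin membership is simulated by nearest Hamming distance.

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def recall(pivots, pattern):
    """Return the stored pivot closest to the (possibly noisy) input."""
    return min(pivots, key=lambda p: hamming(p, pattern))

pivots = [(0, 0, 0, 0), (1, 1, 1, 1)]
assert recall(pivots, (0, 1, 0, 0)) == (0, 0, 0, 0)   # 1-bit noise corrected
```

The conventional O(k) cost is visible here as the scan over `pivots`; the CA formulation replaces that scan with a fixed number of state transitions.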
11 Multiple Attractor CA (MACA). Employs linear CA rules. A state with a self loop is an attractor; the transient states and the attractor together form an attractor basin. Behaves as an associative memory and forms natural clusters.
12 Multiple Attractor CA (MACA). Next state: P(t+1) = T · P(t). Characteristic polynomial: x^(n-m) (1+x)^m, where m = log2(k), n denotes the number of CA cells, and k denotes the number of attractor basins. Depth d of an MACA: the number of edges between a non-reachable state and an attractor state. Attractor of a basin: P(t+d) = T^d · P(t). m bit positions are pseudo-exhaustive: the PEF (pseudo-exhaustive field) is extracted from the attractor state. Problem: the complexity of identifying an attractor basin is O(n^3); exponential search space; redundant solutions.
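The attractor-basin structure described above can be made concrete by brute-force enumeration over a tiny example (illustrative only; the slides' whole point is that the dependency-vector characterization avoids this exhaustive search):

```python
# Sketch: enumerate all 2^n states of a small linear CA and group them
# by the self-loop attractor they fall into. T below (assumed for
# illustration) has characteristic polynomial x(1+x), so k = 2 basins.

def step(T, P):
    n = len(P)
    return tuple(sum(T[i][j] * P[j] for j in range(n)) % 2 for i in range(n))

def basins(T, n):
    out = {}
    for s in range(2 ** n):
        P = tuple((s >> i) & 1 for i in range(n))
        trajectory = []
        while P not in trajectory:       # walk until the attractor repeats
            trajectory.append(P)
            P = step(T, P)
        out.setdefault(P, []).append(trajectory[0])
    return out

T = [[0, 0],
     [0, 1]]
b = basins(T, 2)
assert set(b) == {(0, 0), (0, 1)}        # two attractors, basins of size 2
```

This enumeration costs O(2^n) states; the DV/DS machinery of the following slides reduces basin identification to O(n).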
13 Chromosome of Genetic Algorithm. Rule vector → matrix T → characteristic polynomial x^3 (1+x)^2 → elementary divisors x^2, x, (1+x), (1+x). [The example matrix did not survive transcription.]
14 Dependency Vector/Dependency String. The matrix T with characteristic polynomial x^3 (1+x)^2 is obtained from T1 (characteristic polynomial x^2 (1+x)) and T2 (characteristic polynomial x(1+x)) by the Block Diagonal Method. [The example matrices did not survive transcription.]
15 Dependency Vector/Dependency String. Raising each block to the depth, T1^d and T2^d, yields a Dependency Vector per block (e.g. DV1 = <1 0>); the Dependency Vectors together form the Dependency String DS. [The example matrices and most vector values did not survive transcription.]
16 Dependency Vector/Dependency String. The four basin combinations: zero basin of T1 + zero basin of T2; zero basin of T1 + non-zero basin of T2; non-zero basin of T1 + zero basin of T2; non-zero basin of T1 + non-zero basin of T2. PEF bits: DV1 contributes the 1st PEF bit, DV2 the 2nd PEF bit. PEF = [PEF1][PEF2] = [DS · P] = [DV1 · P1][DV2 · P2]. Example: with DV1 = <0 0 1>, P1 = <1 1 1>, DV2 = <1 0>, P2 = <1 1>: PEF = [PEF1][PEF2] = [<0 0 1> · <1 1 1>][<1 0> · <1 1>] = 1 1.
17 Matrix/Rule from Dependency String. A 0 in the DV maps to the block T1 = [0]; a 1 in the DV maps to the block T2 = [1]; longer runs in the DV map to larger blocks (T3, T4, ...). The example DS yields the CA rule vector <102, 102, 150, 0, 102, 170, 0>. [The example matrices and vectors did not survive transcription.]
18 Dependency Vector/Dependency String. Characteristic polynomial x^4 (1+x). Three different matrices T1, T2, T3, each with a rule vector of the form <170, 0, 150, 0, 240>, share a single Dependency Vector. Identification of attractor basins in O(n); reduction of the search space.
19 Image Compression. Block diagram of the codebook generation scheme: training images in the spatial domain are split into 16×16, 8×8, and 4×4 sets, and TSVQ generates the corresponding 16×16, 8×8, and 4×4 codebooks. Goals: high compression ratio with acceptable image quality. Application: human portraits.
20 Tree-Structured Vector Quantization. An N×N set {S1, S2, S3, S4} is split into Cluster 1 = {S1, S2} with Centroid 1 and Cluster 2 = {S3, S4} with Centroid 2. Clusters and centroids are generated using Tree-Structured Vector Quantization (TSVQ); the logical structure of the multi-class classifier is equivalent to PTSVQ.
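The recursive split described above can be sketched as a single two-way clustering step (an illustrative toy, not the thesis implementation; initialization and distance choices are assumptions):

```python
# Sketch of one TSVQ node: split a training set into two clusters
# around two centroids via a few Lloyd-style iterations. A full TSVQ
# would recurse on each cluster until the codebook depth is reached.

def dist(u, v):
    return sum((x - y) ** 2 for x, y in zip(u, v))

def centroid(vectors):
    return [sum(c) / len(vectors) for c in zip(*vectors)]

def two_means(vectors, iters=10):
    c1, c2 = vectors[0], vectors[-1]          # crude initialization
    for _ in range(iters):
        a = [v for v in vectors if dist(v, c1) <= dist(v, c2)]
        b = [v for v in vectors if dist(v, c1) > dist(v, c2)]
        if not a or not b:
            break
        c1, c2 = centroid(a), centroid(b)
    return (a, c1), (b, c2)

(a, c1), (b, c2) = two_means([[0.0], [1.0], [9.0], [10.0]])
assert c1 == [0.5] and c2 == [9.5]
```

Recursing on `a` and `b` yields the tree whose leaves are the codebook entries, mirroring the multi-class classifier structure the slide mentions.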
21 MACA Based Two Stage Classifier. Input layer → Classifier 1 → hidden layer → Classifier 2 → output layer. Classifier 1: an n-bit DS consisting of m DVs. Classifier 2: an m-bit DV. Memory Size Ratio (MSR): MSR(software) = (n+m) / (n+2^m); MSR(hardware) = (3n+3m-4) / (3n-2+2^m). [The table of values for different n and m did not survive transcription.]
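The two MSR formulas quoted above evaluate directly; the sample sizes below (n = 32 cells, m = 5 PEF bits) are assumptions for illustration, since the slide's own table values were lost.

```python
# Direct evaluation of the memory-size-ratio formulas from the slide:
# the two-stage classifier's memory as a fraction of the single stage.

def msr_software(n, m):
    return (n + m) / (n + 2 ** m)

def msr_hardware(n, m):
    return (3 * n + 3 * m - 4) / (3 * n - 2 + 2 ** m)

# Example: n = 32, m = 5 (assumed values).
assert msr_software(32, 5) == 37 / 64      # ~0.58 of the memory
```

Both ratios fall well below 1 as soon as 2^m dominates n, which is the point of the two-stage design.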
22 Image Compression. Original vs. decompressed images; execution time (in milliseconds) of Full Search, TSVQ, and CA for different block sizes; compression of 96.43% and 95.66% with the quoted PSNR values. Conclusions: high compression, acceptable image quality, higher speed. [Most table values did not survive transcription.]
23 MACA based Tree-Structured Classifier. Selection of an MACA. Diversity of the i-th attractor basin (node): M_i = max_j {N_ij} / Σ_j N_ij, where N_ij is the number of tuples of class j covered by the i-th attractor basin. As M_i approaches 1, the i-th attractor indicates the class j for which N_ij is maximum. Figure of Merit: FM = (1/k) Σ_i M_i, where k denotes the number of attractor basins. [Figure: tree of MACA1-MACA4 nodes separating classes I-IV.]
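The diversity and figure-of-merit measures above compute directly from the basin-vs-class counts; the N matrix below is an assumed toy example.

```python
# Sketch: M_i = max_j N_ij / sum_j N_ij per basin, FM = average of M_i.
# N[i][j] = number of training tuples of class j in attractor basin i.

def diversity(row):
    return max(row) / sum(row)

def figure_of_merit(N):
    return sum(diversity(row) for row in N) / len(N)

N = [[8, 2],          # basin 0: mostly class 0
     [1, 9]]          # basin 1: mostly class 1
assert diversity(N[0]) == 0.8
```

An MACA whose basins each concentrate on one class has FM near 1 and is kept; a low-FM node is split further in the tree.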
24 Fault Diagnosis of Digital Circuit. Fault injection and diagnosis of an example circuit under test (CUT): a set of test vectors is applied to the CUT, and the resulting signature set feeds the MACA pattern classifier, which forms the MACA dictionary/memory. Benchmark CUTs (n, p): C1908 (25, 1801), C6288 (32, 7648), C7552 (108, 7053), S4863 (16, 4123), S3271 (14, 2585), S6669 (55, 6358); number of partitions, e.g. 6.
25 Fault Diagnosis of Analog Circuit. [Figure: OTA-based filter with input V_in, capacitors C1 and C2, and outputs X1 (output of BPF) and X2 (output of LPF).] Per-component sample counts, e.g. OTA1: 8970 samples, 8962 detected, 8 not detected. [The remaining table values for OTA2, OTA3, SR, C1, and C2 did not survive transcription.]
26 Performance on STATLOG Dataset. Classification accuracy (%) and memory overhead (KByte) of the Bayesian, C4.5, MLP, and MACA classifiers on the Australian, Diabetes, DNA, German, Heart, Satimage, Shuttle, Letter, Vehicle, and Segment datasets. [The numeric values did not survive transcription.]
27 MACA Tree vs. C4.5 on STATLOG Dataset. Classification accuracy, memory overhead, number of nodes, and retrieval time (ms) of C4.5 and MACA on the Australian, Diabetes, DNA, German, Heart, Satimage, Shuttle, Letter, Vehicle, and Segment datasets. [The numeric values did not survive transcription.] Summary: comparable classification accuracy, low memory overhead, fewer intermediate nodes, lower retrieval time.
28 Conclusion. Advantages: explores the computational capability of MACA; introduces the Dependency Vector (DV)/Dependency String (DS) to characterize MACA; reduces the complexity of identifying attractor basins from O(n^3) to O(n); an elegant evolutionary algorithm combining DV/DS and GA; an MACA based tree-structured pattern classifier; application of MACA in classification, image compression, fault diagnosis of electronic circuits, codon to amino acid mapping, and the S-box of AES. Problems: linear MACA employs only XOR logic, which is functionally incomplete; the distribution of attractor basins is even; only binary patterns can be handled. Solutions: non-linear MACA (GMACA) and fuzzy MACA (FMACA).
29 Generalized MACA (GMACA). Employs non-linear hybrid rules with all possible logic; the cycle (attractor) length can be greater than 1. Can perform the pattern recognition task; behaves as an associative memory. Example rule vector: <202, 168, 218, 42>, with patterns P1 (attractor 1) and P2 (attractor 2).
30 Basins of Attraction (Theoretical). [Figure: error correcting capability at single bit and multiple bit noise, for n = 50, k = 10.]
31 Distribution of CA Rule (Theoretical). Degree of homogeneity DH = 1 - r/4, where r = the number of 1s of a rule. The more homogeneous a rule, the lower its probability of occurrence.
32 Synthesis of GMACA. Phase I: random generation of a directed sub-graph. Phase II: state transition table from the sub-graph. Phase III: GMACA rule vector from the state transition table. Example: for the 2nd cell, Rule 232; two basins g1 and g2 with their present-state/next-state entries. [The table values did not survive transcription.]
33 Resolution of Collision: Genetic Algorithm. If n0 = n1, the next state is either 0 or 1; if n0 > n1, the next state is 0; if n0 < n1, the next state is 1; where n0 = occurrences of state 0 for a configuration and n1 = occurrences of state 1. Example chromosome format: g1 g2 ... gk, where each gene g_x is the basin of a pattern P_x and a chromosome carries k genes. Each gene is a single-cycle directed sub-graph with p nodes, where p = 1 + n.
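The majority-vote collision rule above is small enough to state as code (an illustration; the tie case is free to go either way per the slide, so the deterministic choice of 0 here is an assumption):

```python
# Sketch of the collision-resolution rule: for each neighborhood
# configuration, the next-state bit seen more often in the synthesized
# state-transition table wins; a tie may resolve to either state.

def resolve(n0, n1):
    if n0 > n1:
        return 0
    if n1 > n0:
        return 1
    return 0          # tie: either 0 or 1 is acceptable; 0 chosen here

assert resolve(3, 1) == 0
assert resolve(1, 4) == 1
```

The GA then searches over the remaining free choices (the ties and the sub-graph shapes) for a consistent rule vector.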
34 Maximum Permissible Noise/Height. Minimum value of the maximum permissible height: h_max = 2. Minimum value of the maximum permissible noise: r_max = 1.
35 Performance Analysis of GMACA. Higher memorizing capacity than the Hopfield network. The cost of computation is constant: it depends only on the transient length of the CA.
36 Basins of Attraction (Experimental). [Figure: experimental basins of attraction for n = 50, k = 10.]
37 Distribution of CA Rule (Experimental). [Figure only.]
38 Conclusion. Advantages: explores the computational capability of non-linear MACA; characterization of the basins of attraction of GMACA; fundamental results to characterize GMACA rules; a reverse engineering method to synthesize GMACA; combination of the reverse engineering method and GA; higher memorizing capacity than the Hopfield network. Problem: only binary patterns can be handled. Solution: fuzzy MACA (FMACA).
39 Fuzzy Cellular Automata (FCA). A linear array of cells; each cell assumes a state with a rational value in [0, 1]. Combines fuzzy logic and Cellular Automata. Out of 256 rules, 16 are OR and NOR rules (including rules 0 and 255). Boolean operation → FCA operation: OR (a + b) → min{1, (a + b)}; AND (a·b) → (a·b); NOT (~a) → (1 - a).
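The three FCA operations in the table are one-liners; the sample values below are assumptions for illustration.

```python
# The FCA operations from the slide, on states in [0, 1]:
# OR becomes a bounded sum, AND a product, NOT a complement.

def f_or(a, b):
    return min(1.0, a + b)

def f_and(a, b):
    return a * b

def f_not(a):
    return 1.0 - a

assert f_or(0.7, 0.6) == 1.0      # bounded sum saturates at 1
assert f_and(0.5, 0.5) == 0.25
assert f_not(0.25) == 0.75
```

Restricted to {0, 1} these reduce exactly to boolean OR, AND, and NOT, which is why the 256-rule numbering carries over.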
40 Fuzzy Cellular Automata (FCA). OR logic: Rule 170: q_i(t+1) = q_{i+1}(t). Rule 204: q_i(t+1) = q_i(t). Rule 238: q_i(t+1) = q_i(t) + q_{i+1}(t). Rule 240: q_i(t+1) = q_{i-1}(t). Rule 250: q_i(t+1) = q_{i-1}(t) + q_{i+1}(t). Rule 252: q_i(t+1) = q_{i-1}(t) + q_i(t). Rule 254: q_i(t+1) = q_{i-1}(t) + q_i(t) + q_{i+1}(t). NOR logic (complements of the above): Rule 85: q_i(t+1) = ¬q_{i+1}(t). Rule 51: q_i(t+1) = ¬q_i(t). Rule 17: q_i(t+1) = ¬(q_i(t) + q_{i+1}(t)). Rule 15: q_i(t+1) = ¬q_{i-1}(t). Rule 5: q_i(t+1) = ¬(q_{i-1}(t) + q_{i+1}(t)). Rule 3: q_i(t+1) = ¬(q_{i-1}(t) + q_i(t)). Rule 1: q_i(t+1) = ¬(q_{i-1}(t) + q_i(t) + q_{i+1}(t)). Here + denotes the bounded sum min{1, ·} and ¬a = 1 - a.
41 Fuzzy Cellular Automata (FCA). The 16 OR and NOR rules can be represented by an n×n matrix T and an n-dimensional binary vector F. With S_i(t) the state of the i-th cell at time t: S_i(t+1) = |F_i - min{1, Σ_j T_ij · S_j(t)}|, where T_ij = 1 if the next state of the i-th cell depends on the j-th cell and 0 otherwise, and F is the inversion vector, containing 1 wherever a NOR rule is applied. Example: the null boundary hybrid FCA <238, 1, 238, 3>. [The example T and F did not survive transcription.]
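The matrix form of the FCA update can be sketched as below. The 2-cell T and F are assumed examples (the slide's own 4-cell matrices were lost); note how |F_i - x| leaves x unchanged where F_i = 0 and complements it where F_i = 1.

```python
# Sketch: S_i(t+1) = |F_i - min{1, sum_j T_ij * S_j(t)}| for a hybrid
# FCA. T is the 0/1 dependency matrix, F the inversion vector.

def fca_step(T, F, S):
    n = len(S)
    return [abs(F[i] - min(1.0, sum(T[i][j] * S[j] for j in range(n))))
            for i in range(n)]

# Assumed 2-cell example: cell 0 runs an OR rule on both cells (F_0 = 0),
# cell 1 runs a NOR-type rule on itself (F_1 = 1).
T = [[1, 1],
     [0, 1]]
F = [0, 1]
assert fca_step(T, F, [0.25, 0.5]) == [0.75, 0.5]
```

On {0, 1} states this reduces to the boolean OR/NOR behavior of the corresponding rules.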
42 Fuzzy Multiple Attractor CA (FMACA)
43 Fuzzy Multiple Attractor CA (FMACA). Dependency Vector (DV) corresponding to the matrix; Derived Complement Vector (DCV) corresponding to the inversion vector. The pivot cell (PC) represents an attractor basin uniquely; the state of the pivot cell of the attractor of the basin where a state belongs is q_m = |DCV_m - min{1, Σ_j DV_j · S_j(t)}|. Sizes of attractor basins may be equal as well as unequal. The matrix and inversion vector can be recovered from the DV/DCV.
44 Fuzzy Multiple Attractor CA (FMACA). [Example with inversion vector F = 0; the matrix T, DV, and DCV values did not survive transcription.]
45 Fuzzy Multiple Attractor CA (FMACA). [Example with inversion vector F = 1; the matrix T, DV, and DCV values did not survive transcription.]
46 FMACA based Tree-Structured Classifier. An FMACA based tree-structured pattern classifier: can handle binary as well as real valued datasets; provides equal and unequal sizes of attractor basins; built from a combination of GA and DV/DCV. [Figure: tree of FMACA1-FMACA4 nodes separating classes I-IV.]
47 Experimental Setup. Randomly generate K centroids A_1, A_2, ..., A_K; around each centroid, generate t tuples. 50% of the patterns are taken for training and 50% for testing. [Figure: centroids with distances D_min and d_max.]
48 Performance Analysis of FMACA. Generalization of the FMACA tree: per-dataset depth, breadth, training accuracy, and testing accuracy (e.g. n=5, k=2, t=4000), where depth is the number of layers from root to leaf and breadth is the number of intermediate nodes. The FMACA can generalize a dataset irrespective of the number of classes, tuples, and attributes, with higher classification accuracy compared to C4.5. [The table values did not survive transcription.]
49 Performance Analysis of FMACA. Generation time (ms) and retrieval time (ms) of FMACA vs. C4.5 on datasets such as (n=5, k=2, t=2000), (n=5, k=2, t=20000), (n=6, k=2, t=2000), and (n=6, k=2, t=20000). High generation time, but a one-time cost; lower retrieval time and lower memory overhead compared to C4.5. [The table values did not survive transcription.]
50 Performance on STATLOG Dataset. Classification accuracy (%), number of CA cells, number of tree nodes, memory overhead, and retrieval time (ms) of FMACA vs. MACA on the DNA, Satimage, Shuttle, and Letter datasets. Summary: comparable accuracy, fewer CA cells, lower memory overhead, lower retrieval time. [The numeric values did not survive transcription.]
51 Conclusion. Introduction of fuzzy CA in pattern recognition. New mathematical tools: dependency matrix, dependency vector, complement vector, derived complement vector. Reduction of the complexity of identifying attractors from O(n^3) to O(n). Both equal and unequal sizes of attractor basins; movement of patterns from one basin to another; reduction of the search space. An elegant evolutionary algorithm combining DV/DCV and GA; an FMACA based tree-structured pattern classifier.
52 Future Extensions. Applications in pattern clustering and mixed-mode learning. Theoretical analysis of the memorizing capacity of non-linear CA. Combination of fuzzy sets and fuzzy CA. Extension from 1-D CA to 2-D CA. Development of hybrid systems using CA: CA + neural network + fuzzy set; CA + fuzzy set + rough set. Extension from boolean CA to multi-valued / hierarchical CA. Application of CA in bioinformatics, medical image analysis, image compression, and data mining.
53 Thank You
Sections 18.6 and 18.7 Analysis of Artificial Neural Networks CS4811 - Artificial Intelligence Nilufer Onder Department of Computer Science Michigan Technological University Outline Univariate regression
More informationArtificial Neural Network Method of Rock Mass Blastability Classification
Artificial Neural Network Method of Rock Mass Blastability Classification Jiang Han, Xu Weiya, Xie Shouyi Research Institute of Geotechnical Engineering, Hohai University, Nanjing, Jiangshu, P.R.China
More informationCoexistence of Dynamics for Two- Dimensional Cellular Automata
Coexistence of Dynamics for Two- Dimensional Cellular Automata Ricardo Severino Department of Mathematics and Applications University of Minho Campus de Gualtar - 4710-057 Braga, Portugal Maria Joana Soares
More informationDesign and Implementation of Carry Adders Using Adiabatic and Reversible Logic Gates
Design and Implementation of Carry Adders Using Adiabatic and Reversible Logic Gates B.BharathKumar 1, ShaikAsra Tabassum 2 1 Research Scholar, Dept of ECE, Lords Institute of Engineering & Technology,
More informationOne-Dimensional Linear Hybrid Cellular Automata: Their Synthesis, Properties and Applications to Digital Circuits Testing
One-Dimensional Linear Hybrid Cellular Automata: Their Synthesis, Properties and Applications to Digital Circuits Testing M. Serra, K. Cattell, S. Zhang, J.C. Muzio, D.M. Miller Dept. of Computer Science
More informationSupport Vector Machines (SVM) in bioinformatics. Day 1: Introduction to SVM
1 Support Vector Machines (SVM) in bioinformatics Day 1: Introduction to SVM Jean-Philippe Vert Bioinformatics Center, Kyoto University, Japan Jean-Philippe.Vert@mines.org Human Genome Center, University
More informationBOOLEAN ALGEBRA INTRODUCTION SUBSETS
BOOLEAN ALGEBRA M. Ragheb 1/294/2018 INTRODUCTION Modern algebra is centered around the concept of an algebraic system: A, consisting of a set of elements: ai, i=1, 2,, which are combined by a set of operations
More informationBioinformatics Chapter 1. Introduction
Bioinformatics Chapter 1. Introduction Outline! Biological Data in Digital Symbol Sequences! Genomes Diversity, Size, and Structure! Proteins and Proteomes! On the Information Content of Biological Sequences!
More informationClassification of Random Boolean Networks
Classification of Random Boolean Networks Carlos Gershenson, School of Cognitive and Computer Sciences University of Sussex Brighton, BN1 9QN, U. K. C.Gershenson@sussex.ac.uk http://www.cogs.sussex.ac.uk/users/carlos
More informationA Genetic Algorithm with Expansion and Exploration Operators for the Maximum Satisfiability Problem
Applied Mathematical Sciences, Vol. 7, 2013, no. 24, 1183-1190 HIKARI Ltd, www.m-hikari.com A Genetic Algorithm with Expansion and Exploration Operators for the Maximum Satisfiability Problem Anna Gorbenko
More informationApplication of hopfield network in improvement of fingerprint recognition process Mahmoud Alborzi 1, Abbas Toloie- Eshlaghy 1 and Dena Bazazian 2
5797 Available online at www.elixirjournal.org Computer Science and Engineering Elixir Comp. Sci. & Engg. 41 (211) 5797-582 Application hopfield network in improvement recognition process Mahmoud Alborzi
More informationECE 407 Computer Aided Design for Electronic Systems. Simulation. Instructor: Maria K. Michael. Overview
407 Computer Aided Design for Electronic Systems Simulation Instructor: Maria K. Michael Overview What is simulation? Design verification Modeling Levels Modeling circuits for simulation True-value simulation
More informationA Posteriori Corrections to Classification Methods.
A Posteriori Corrections to Classification Methods. Włodzisław Duch and Łukasz Itert Department of Informatics, Nicholas Copernicus University, Grudziądzka 5, 87-100 Toruń, Poland; http://www.phys.uni.torun.pl/kmk
More informationEvolutionary Functional Link Interval Type-2 Fuzzy Neural System for Exchange Rate Prediction
Evolutionary Functional Link Interval Type-2 Fuzzy Neural System for Exchange Rate Prediction 3. Introduction Currency exchange rate is an important element in international finance. It is one of the chaotic,
More informationDecision Tree Learning
Decision Tree Learning Berlin Chen Department of Computer Science & Information Engineering National Taiwan Normal University References: 1. Machine Learning, Chapter 3 2. Data Mining: Concepts, Models,
More informationNeural Networks Lecture 2:Single Layer Classifiers
Neural Networks Lecture 2:Single Layer Classifiers H.A Talebi Farzaneh Abdollahi Department of Electrical Engineering Amirkabir University of Technology Winter 2011. A. Talebi, Farzaneh Abdollahi Neural
More informationSample questions for Fundamentals of Machine Learning 2018
Sample questions for Fundamentals of Machine Learning 2018 Teacher: Mohammad Emtiyaz Khan A few important informations: In the final exam, no electronic devices are allowed except a calculator. Make sure
More informationModern Information Retrieval
Modern Information Retrieval Chapter 8 Text Classification Introduction A Characterization of Text Classification Unsupervised Algorithms Supervised Algorithms Feature Selection or Dimensionality Reduction
More informationCS6901: review of Theory of Computation and Algorithms
CS6901: review of Theory of Computation and Algorithms Any mechanically (automatically) discretely computation of problem solving contains at least three components: - problem description - computational
More informationIntroduction to Neural Networks
Introduction to Neural Networks What are (Artificial) Neural Networks? Models of the brain and nervous system Highly parallel Process information much more like the brain than a serial computer Learning
More informationStream Ciphers. Çetin Kaya Koç Winter / 20
Çetin Kaya Koç http://koclab.cs.ucsb.edu Winter 2016 1 / 20 Linear Congruential Generators A linear congruential generator produces a sequence of integers x i for i = 1,2,... starting with the given initial
More informationNCU EE -- DSP VLSI Design. Tsung-Han Tsai 1
NCU EE -- DSP VLSI Design. Tsung-Han Tsai 1 Multi-processor vs. Multi-computer architecture µp vs. DSP RISC vs. DSP RISC Reduced-instruction-set Register-to-register operation Higher throughput by using
More informationNeural Networks and Fuzzy Logic Rajendra Dept.of CSE ASCET
Unit-. Definition Neural network is a massively parallel distributed processing system, made of highly inter-connected neural computing elements that have the ability to learn and thereby acquire knowledge
More informationARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92
ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92 BIOLOGICAL INSPIRATIONS Some numbers The human brain contains about 10 billion nerve cells (neurons) Each neuron is connected to the others through 10000
More informationChapter 0 Introduction. Fourth Academic Year/ Elective Course Electrical Engineering Department College of Engineering University of Salahaddin
Chapter 0 Introduction Fourth Academic Year/ Elective Course Electrical Engineering Department College of Engineering University of Salahaddin October 2014 Automata Theory 2 of 22 Automata theory deals
More informationbiologically-inspired computing lecture 18
Informatics -inspired lecture 18 Sections I485/H400 course outlook Assignments: 35% Students will complete 4/5 assignments based on algorithms presented in class Lab meets in I1 (West) 109 on Lab Wednesdays
More informationInadmissible Class of Boolean Functions under Stuck-at Faults
Inadmissible Class of Boolean Functions under Stuck-at Faults Debesh K. Das 1, Debabani Chowdhury 1, Bhargab B. Bhattacharya 2, Tsutomu Sasao 3 1 Computer Sc. & Engg. Dept., Jadavpur University, Kolkata
More informationSP-CNN: A Scalable and Programmable CNN-based Accelerator. Dilan Manatunga Dr. Hyesoon Kim Dr. Saibal Mukhopadhyay
SP-CNN: A Scalable and Programmable CNN-based Accelerator Dilan Manatunga Dr. Hyesoon Kim Dr. Saibal Mukhopadhyay Motivation Power is a first-order design constraint, especially for embedded devices. Certain
More informationIterative Laplacian Score for Feature Selection
Iterative Laplacian Score for Feature Selection Linling Zhu, Linsong Miao, and Daoqiang Zhang College of Computer Science and echnology, Nanjing University of Aeronautics and Astronautics, Nanjing 2006,
More informationBounded Approximation Algorithms
Bounded Approximation Algorithms Sometimes we can handle NP problems with polynomial time algorithms which are guaranteed to return a solution within some specific bound of the optimal solution within
More informationFundamentals of Digital Design
Fundamentals of Digital Design Digital Radiation Measurement and Spectroscopy NE/RHP 537 1 Binary Number System The binary numeral system, or base-2 number system, is a numeral system that represents numeric
More informationCMSC 422 Introduction to Machine Learning Lecture 4 Geometry and Nearest Neighbors. Furong Huang /
CMSC 422 Introduction to Machine Learning Lecture 4 Geometry and Nearest Neighbors Furong Huang / furongh@cs.umd.edu What we know so far Decision Trees What is a decision tree, and how to induce it from
More informationSUPERVISED LEARNING: INTRODUCTION TO CLASSIFICATION
SUPERVISED LEARNING: INTRODUCTION TO CLASSIFICATION 1 Outline Basic terminology Features Training and validation Model selection Error and loss measures Statistical comparison Evaluation measures 2 Terminology
More informationFrom statistics to data science. BAE 815 (Fall 2017) Dr. Zifei Liu
From statistics to data science BAE 815 (Fall 2017) Dr. Zifei Liu Zifeiliu@ksu.edu Why? How? What? How much? How many? Individual facts (quantities, characters, or symbols) The Data-Information-Knowledge-Wisdom
More informationFault Collapsing in Digital Circuits Using Fast Fault Dominance and Equivalence Analysis with SSBDDs
Fault Collapsing in Digital Circuits Using Fast Fault Dominance and Equivalence Analysis with SSBDDs Raimund Ubar, Lembit Jürimägi (&), Elmet Orasson, and Jaan Raik Department of Computer Engineering,
More informationGenetic Algorithms: Basic Principles and Applications
Genetic Algorithms: Basic Principles and Applications C. A. MURTHY MACHINE INTELLIGENCE UNIT INDIAN STATISTICAL INSTITUTE 203, B.T.ROAD KOLKATA-700108 e-mail: murthy@isical.ac.in Genetic algorithms (GAs)
More informationDIAGNOSIS OF FAULT IN TESTABLE REVERSIBLE SEQUENTIAL CIRCUITS USING MULTIPLEXER CONSERVATIVE QUANTUM DOT CELLULAR AUTOMATA
DIAGNOSIS OF FAULT IN TESTABLE REVERSIBLE SEQUENTIAL CIRCUITS USING MULTIPLEXER CONSERVATIVE QUANTUM DOT CELLULAR AUTOMATA Nikitha.S.Paulin 1, S.Abirami 2, Prabu Venkateswaran.S 3 1, 2 PG students / VLSI
More informationOn Detecting Multiple Faults in Baseline Interconnection Networks
On Detecting Multiple Faults in Baseline Interconnection Networks SHUN-SHII LIN 1 AND SHAN-TAI CHEN 2 1 National Taiwan Normal University, Taipei, Taiwan, ROC 2 Chung Cheng Institute of Technology, Tao-Yuan,
More information