Cellular Automata Evolution for Pattern Recognition

1 Cellular Automata Evolution for Pattern Recognition
Pradipta Maji
Center for Soft Computing Research, Indian Statistical Institute, Kolkata, INDIA
Under the supervision of
Prof. P. Pal Chaudhuri, Professor Emeritus, Dept. of Comp. Sci. & Tech., Bengal Engineering College (DU), Shibpur, INDIA
Prof. Debesh K. Das, Professor, Dept. of Comp. Sci. & Engg., Jadavpur University, INDIA

2 Introduction
Cellular Automata (CA): a promising research area in Artificial Intelligence (AI) and Artificial Life (ALife)
Considerable research as a modeling tool in image processing, language recognition, pattern recognition, and VLSI testing
CA learns associations from a set of examples and applies this knowledge-base to handle unseen cases; such associations are effective for classifying patterns

3 Contribution of the Thesis
Analysis and synthesis of linear boolean CA (MACA): CA with only XOR logic; application of MACA in pattern recognition, data mining, image compression, and fault diagnosis of electronic circuits
Analysis and synthesis of non-linear boolean CA (GMACA): CA with all possible logic; application of non-linear CA in pattern recognition
Analysis and synthesis of fuzzy CA (FMACA): CA with fuzzy logic; application of fuzzy CA in pattern recognition

4 Cellular Automata (CA)
A special type of computing model: 1950s - J. von Neumann; 1980s - S. Wolfram
A CA displays three basic characteristics:
Simplicity: the basic unit of a CA, the cell, is simple
Vast parallelism: CA achieves parallelism on a scale larger than massively parallel computers
Locality: a CA is characterized by the local connectivity of its cells; all interactions take place on a purely local basis; a cell can only communicate with its neighboring cells; interconnection links usually carry only a small amount of information; no cell has a global view of the entire system

5 Cellular Automata (CA)
A computational model with discrete cells updated synchronously
Uniform CA, hybrid / non-uniform CA, null boundary CA, periodic boundary CA
Each cell of a 3-neighborhood CA can have one of 256 different rules
(Slide figure: a CA cell - combinational logic driven by the clock, taking inputs from the left neighbor, the cell's own state, and the right neighbor, and producing a 0/1 output)
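To make the 256-rule, 3-neighborhood picture above concrete, here is a minimal sketch (an added illustration, not taken from the thesis) of one synchronous update of a null-boundary CA from a Wolfram rule number; the hybrid_rules argument models a non-uniform CA with a per-cell rule vector.

```python
def ca_step(cells, rule, hybrid_rules=None):
    """One synchronous update of a 3-neighborhood, null-boundary CA.

    cells: list of 0/1 cell states; rule: Wolfram rule number (0..255);
    hybrid_rules: optional per-cell rule vector for a non-uniform CA.
    """
    n = len(cells)
    nxt = [0] * n
    for i in range(n):
        r = hybrid_rules[i] if hybrid_rules else rule
        left = cells[i - 1] if i > 0 else 0        # null boundary
        right = cells[i + 1] if i < n - 1 else 0   # null boundary
        idx = (left << 2) | (cells[i] << 1) | right
        nxt[i] = (r >> idx) & 1                    # read the rule's truth table
    return nxt

# Example: rule 90 (q_i(t+1) = q_{i-1}(t) XOR q_{i+1}(t)) on a 5-cell CA
print(ca_step([0, 1, 1, 0, 1], 90))
```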

6 CA State Transition
(Slide shows a present state (PS) / next state (NS) table for a CA, illustrating rule 230 at the 2nd cell)

7 Different Types of CA
Linear CA: based on XOR logic; total 7 rules (60, 90, 102, 150, 170, 204, 240); can be expressed through a matrix (T) and a characteristic polynomial; next state of the CA: P(t+1) = T . P(t)
Additive CA: based on XOR and XNOR logic; total 14 rules (the linear rules + 195, 165, 153, 105, 85, 51, 15); can be expressed through a matrix, an inversion vector, and a characteristic polynomial; next state of the CA: P(t+1) = T . P(t) + F
(Slide shows an example matrix T and inversion vector F)
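The matrix form of the additive-CA next-state equation can be sketched in a few lines; the T and F below are made-up stand-ins, since the slide's actual matrix and inversion vector did not survive transcription.

```python
def additive_ca_step(T, P, F):
    """Additive CA next state over GF(2): P(t+1) = T.P(t) + F (mod 2)."""
    n = len(P)
    return [(sum(T[i][j] * P[j] for j in range(n)) + F[i]) % 2 for i in range(n)]

# Hypothetical 4-cell example: tri-diagonal dependency matrix T, inversion vector F
T = [[1, 1, 0, 0],
     [1, 1, 1, 0],
     [0, 1, 1, 1],
     [0, 0, 1, 1]]
F = [0, 1, 0, 0]   # a 1 marks a cell running an XNOR (complemented) rule
P = [1, 0, 1, 1]
print(additive_ca_step(T, P, F))
```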

8 Additive Cellular Automata
XOR Logic:
Rule 60 : q_i(t+1) = q_{i-1}(t) XOR q_i(t)
Rule 90 : q_i(t+1) = q_{i-1}(t) XOR q_{i+1}(t)
Rule 102 : q_i(t+1) = q_i(t) XOR q_{i+1}(t)
Rule 150 : q_i(t+1) = q_{i-1}(t) XOR q_i(t) XOR q_{i+1}(t)
Rule 170 : q_i(t+1) = q_{i+1}(t)
Rule 204 : q_i(t+1) = q_i(t)
Rule 240 : q_i(t+1) = q_{i-1}(t)
XNOR Logic (complement of the corresponding XOR rule):
Rule 195 : q_i(t+1) = NOT( q_{i-1}(t) XOR q_i(t) )
Rule 165 : q_i(t+1) = NOT( q_{i-1}(t) XOR q_{i+1}(t) )
Rule 153 : q_i(t+1) = NOT( q_i(t) XOR q_{i+1}(t) )
Rule 105 : q_i(t+1) = NOT( q_{i-1}(t) XOR q_i(t) XOR q_{i+1}(t) )
Rule 85 : q_i(t+1) = NOT q_{i+1}(t)
Rule 51 : q_i(t+1) = NOT q_i(t)
Rule 15 : q_i(t+1) = NOT q_{i-1}(t)

9 CA - State Transition Diagram
Cellular Automata are classified into group CA and non-group CA; non-group CA may be linear, non-linear, or fuzzy
A non-group CA behaves as an associative memory: MACA (linear), GMACA (non-linear), and FMACA (fuzzy) perform the pattern recognition task
(Slide shows example state transition diagrams)

10 Pattern Recognition
Pattern recognition/classification: the most important foundation stone of knowledge extraction methodology; demands automatic identification of patterns of interest (objects, images) from their background (shapes, forms, outlines, etc.)
Conventional approach: the machine compares a given input pattern with each of the stored patterns and identifies the closest match; time to recognize the closest match is O(k), where k is the number of stored patterns, so recognition is slow
Associative memory: the entire state space is divided into basins around some pivotal points; states close to a pivot are associated with that pivot; time to recognize a pattern is independent of the number of stored patterns
Three models: 1. MACA (linear), 2. GMACA (non-linear), 3. FMACA (fuzzy)

11 Multiple Attractor CA (MACA)
Employs linear CA rules
A state with a self loop is an attractor; the transient states and the attractor together form an attractor basin
Behaves as an associative memory; forms natural clusters

12 Multiple Attractor CA (MACA)
Next state: P(t+1) = T . P(t)
Characteristic polynomial: x^(n-m) (1+x)^m, where m = log2(k), n denotes the number of CA cells, and k denotes the number of attractor basins
Depth d of the MACA: number of edges between a non-reachable state and an attractor state
Attractor of a basin: P(t+d) = T^d . P(t)
m bit positions are pseudo-exhaustive: the PEF (pseudo-exhaustive field) is extracted from the attractor state
Problems: the complexity of identifying an attractor basin is O(n^3); exponential search space; redundant solutions
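A minimal sketch of reaching an attractor by applying P(t+d) = T^d . P(t) over GF(2) and reading off the PEF bit. The 4-cell T below (characteristic polynomial x^3 (1+x), so k = 2 basins, depth 2, one PEF bit) is a made-up example, and this brute-force iteration is not the O(n) identification scheme the thesis proposes.

```python
def mat_vec_mod2(T, P):
    """y = T.P over GF(2)."""
    return [sum(t * p for t, p in zip(row, P)) % 2 for row in T]

def attractor_of(P, T, depth):
    """Apply the MACA depth times: P(t+d) = T^d . P(t) (mod 2)."""
    for _ in range(depth):
        P = mat_vec_mod2(T, P)
    return P

# Example MACA with characteristic polynomial x^3 (1+x): two attractor basins
T = [[0, 0, 0, 0],
     [1, 0, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 0]]
state = [1, 0, 1, 1]
attr = attractor_of(state, T, depth=2)
pef = attr[2]          # cell 2 is the single pseudo-exhaustive bit here
print(attr, pef)       # -> [0, 0, 1, 0] 1
```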

13 Chromosome of the Genetic Algorithm
A chromosome encodes an MACA as a rule vector / matrix T with characteristic polynomial x^3 (1+x)^2
The characteristic polynomial can be split into elementary divisors in different ways, e.g. {x^3, (1+x)^2} or {x^2, x, (1+x), (1+x)} (the slide lists the alternatives and shows the corresponding matrix and rule vector)

14 Dependency Vector / Dependency String
T1 with characteristic polynomial x^2 (1+x) and T2 with characteristic polynomial x(1+x) are combined by the Block Diagonal Method to obtain a matrix T with characteristic polynomial x^3 (1+x)^2 (the slide shows the matrices)

15 Dependency Vector / Dependency String
T^d yields a Dependency Vector (DV) for each component matrix; concatenating the DVs of T1 and T2 gives the Dependency String (DS) of T
(Slide shows T^d, T1^d, T2^d and the corresponding dependency vectors, e.g. DV = < 1 0 >, and the resulting DS)

16 Dependency Vector / Dependency String
The four combinations of basins (zero / non-zero basin of T1 with zero / non-zero basin of T2) map to the four values of the PEF bits
DV1 contributes the 1st PEF bit, DV2 contributes the 2nd PEF bit
PEF = [PEF1][PEF2] = [DS . P] = [DV1 . P1][DV2 . P2]
Example from the slide: PEF = [<0 0 1> . <1 1 1>][<1 0> . <1 1>] = 1 1
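The PEF computation on this slide reduces to GF(2) dot products of each dependency vector with the corresponding part of the pattern; this small sketch reproduces the slide's worked example.

```python
def pef_bit(dv, p):
    """One pseudo-exhaustive field bit: the GF(2) dot product DV . P."""
    return sum(d * x for d, x in zip(dv, p)) % 2

# The slide's example: DS = DV1 | DV2 acting on P = P1 | P2
dv1, p1 = [0, 0, 1], [1, 1, 1]
dv2, p2 = [1, 0], [1, 1]
print([pef_bit(dv1, p1), pef_bit(dv2, p2)])   # -> [1, 1], the basin's PEF
```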

17 Matrix/Rule from the Dependency String
Each entry of the DV/DS is expanded into a small block matrix: a 0 in the DV gives the block T1 = [0], a 1 gives T2 = [1], and runs such as 11 give larger blocks; the blocks are placed on the diagonal to build T (the slide works through an example DS, its DVs, and the resulting matrices)
Resulting CA rule vector: <102, 102, 150, 0, 102, 170, 0>

18 Dependency Vector / Dependency String
(Slide shows matrices T1, T2, T3 with characteristic polynomial x^4 (1+x) and rule vectors such as <170, 0, 150, 0, 240>, all characterized by a single Dependency Vector)
Identification of attractor basins in O(n); reduction of the search space

19 Image Compression
Block diagram of the codebook generation scheme: training images in the spatial domain are split into 16x16, 8x8, and 4x4 block sets, and TSVQ generates a 16x16, 8x8, and 4x4 codebook from each set
Goals: high compression ratio with acceptable image quality; application - human portraits

20 Tree-Structured Vector Quantization
The N x N block set {S1, S2, S3, S4} is split into Cluster 1 = {S1, S2} and Cluster 2 = {S3, S4}, each with its own centroid, and the splitting is repeated down to single blocks
Clusters and centroids are generated using Tree-Structured Vector Quantization (TSVQ); the logical structure of the multi-class classifier is equivalent to PTSVQ
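A compact sketch of TSVQ codebook generation as described above: recursively split the block set into two clusters and keep the leaf centroids as codewords. The splitting criterion (a tiny 2-means) and the parameters are illustrative choices, not the thesis's exact procedure.

```python
import numpy as np

def two_means(vectors, iters=10):
    """Split a set of vectors into two clusters (a tiny k=2 Lloyd iteration)."""
    c = vectors[np.random.choice(len(vectors), 2, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(vectors[:, None, :] - c[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in (0, 1):
            if np.any(labels == j):
                c[j] = vectors[labels == j].mean(axis=0)
    return labels, c

def tsvq_codebook(vectors, depth):
    """Grow a TSVQ tree of the given depth; the leaf centroids form the codebook."""
    if depth == 0 or len(vectors) < 2:
        return [vectors.mean(axis=0)]
    labels, _ = two_means(vectors)
    book = []
    for j in (0, 1):
        part = vectors[labels == j]
        if len(part):
            book += tsvq_codebook(part, depth - 1)
    return book

# Hypothetical 4x4 image blocks flattened to 16-dimensional vectors
blocks = np.random.rand(100, 16)
codebook = tsvq_codebook(blocks, depth=3)   # up to 2^3 = 8 codewords
print(len(codebook))
```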

21 MACA Based Two Stage Classifier
Structure: input layer -> Classifier 1 -> hidden layer -> Classifier 2 -> output layer
Classifier 1: an n-bit DS consisting of m DVs; Classifier 2: an m-bit DV
Memory Size Ratio (for n input bits and an m-bit PEF):
MSR (software) = (n + m) / (n + 2^m)
MSR (hardware) = (3n + 3m - 4) / (3n - 2 + 2^m)
(Slide tabulates the MSR for different values of n and m)
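As a quick worked check of the two memory-size-ratio formulas (with illustrative values n = 16 and m = 4, not taken from the slide's table):

```python
n, m = 16, 4   # illustrative sizes: 16-bit patterns, 4-bit PEF
msr_software = (n + m) / (n + 2**m)                # (16+4)/(16+16) = 0.625
msr_hardware = (3*n + 3*m - 4) / (3*n - 2 + 2**m)  # 56/62 ≈ 0.903
print(msr_software, msr_hardware)
```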

22 Image Compression
(Slide shows original and decompressed images, and a table of execution time in milliseconds for Full Search, TSVQ, and CA at different block sizes)
Example results: compression of 96.43% and 95.66% at the reported PSNR values
High compression, acceptable image quality, higher speed

23 MACA based Tree-Structured Classifier
Selection of MACA:
Diversity of the i-th attractor basin (node): M_i = max_j {N_ij} / Σ_j N_ij, where N_ij is the number of tuples of class j covered by the i-th attractor basin
As M_i approaches 1, the i-th attractor indicates the class j for which N_ij is maximum
Figure of Merit: FM = (1/k) Σ_i M_i, where k denotes the number of attractor basins
(Slide shows a tree of MACAs - MACA 1 at the root splitting classes I-IV over MACA 2, 3, and 4)
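A minimal sketch of the two selection measures defined above, evaluated on a hypothetical count matrix N_ij (rows = attractor basins, columns = classes):

```python
def diversity(counts_i):
    """M_i = max_j N_ij / sum_j N_ij for one attractor basin."""
    return max(counts_i) / sum(counts_i)

def figure_of_merit(N):
    """FM = (1/k) * sum_i M_i over the k attractor basins."""
    return sum(diversity(row) for row in N) / len(N)

# Hypothetical N_ij counts
N = [[40, 2, 1],    # basin 0 is dominated by class 0
     [3, 25, 5],
     [0, 4, 30]]
print([round(diversity(r), 3) for r in N], round(figure_of_merit(N), 3))
```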

24 Fault Diagnosis of Digital Circuits
(Slide shows the diagnosis flow for an example circuit under test (CUT): a set of test vectors is applied to the CUT with fault injection, the resulting signature set is fed to the MACA pattern classifier, and an MACA dictionary memory stores the partitions)
Benchmark CUTs (n, p): C1908 (25, 1801), C6288 (32, 7648), C7552 (108, 7053), S4863 (16, 4123), S3271 (14, 2585), S6669 (55, 6358); the table also lists the number of partitions (e.g. 6) and the MACA dictionary memory for each

25 Fault Diagnosis of Analog Circuits
(Slide shows two OTA-C filter circuits, with X1 the output of the BPF and X2 the output of the LPF)
Detection results per component, e.g. OTA1: 8970 samples, 8962 detected, 8 not detected (the slide tabulates the same counts for the remaining components OTA2, OTA3, C1, C2, and SR)

26 Performance on STATLOG Dataset
(Slide tabulates classification accuracy (%) and memory overhead (Kbyte) of the Bayesian, C4.5, MLP, and MACA classifiers on the Australian, Diabetes, DNA, German, Heart, Satimage, Shuttle, Letter, Vehicle, and Segment datasets)

27 MACA Tree vs C4.5 on STATLOG Dataset
(Slide tabulates classification accuracy, memory overhead, number of nodes, and retrieval time (ms) of C4.5 and the MACA tree on the Australian, Diabetes, DNA, German, Heart, Satimage, Shuttle, Letter, Vehicle, and Segment datasets)
Comparable classification accuracy, low memory overhead, fewer intermediate nodes, lower retrieval time

28 Conclusion
Advantages:
Explores the computational capability of MACA
Introduction of the Dependency Vector (DV) / Dependency String (DS) to characterize MACA
Reduction of the complexity of identifying attractor basins from O(n^3) to O(n)
Elegant evolutionary algorithm: combination of DV/DS and GA
MACA based tree-structured pattern classifier
Applications of MACA in classification, image compression, fault diagnosis of electronic circuits, codon to amino acid mapping, and the S-box of AES
Problems:
Linear MACA employs only XOR logic and is functionally incomplete
The distribution of the attractor basins is even
Can handle only binary patterns
Solutions: non-linear MACA (GMACA), fuzzy MACA (FMACA)

29 Generalized MACA (GMACA)
Employs non-linear hybrid rules with all possible logic
Cycle (attractor) length can be greater than 1
Can perform the pattern recognition task; behaves as an associative memory
Example rule vector: <202, 168, 218, 42> (the slide shows its basins with patterns P1 and P2 as attractors)

30 Basins of Attraction (Theoretical)
(Slide plots the error correcting capability at single bit noise and at multiple bit noise for n = 50 and k = 10)

31 Distribution of CA Rules (Theoretical)
Degree of Homogeneity: DH = 1 - r/4, where r is the number of 1s of a rule
The more homogeneous a rule, the lower its probability of occurrence

32 Synthesis of GMACA
Phase I: random generation of a directed sub-graph
Phase II: state transition table from the sub-graph
Phase III: GMACA rule vector from the state transition table
(Slide works through an example with basins g1 and g2 and their present-state / next-state tables, yielding rule 232 for the 2nd cell)

33 Resolution of Collision: Genetic Algorithm
If n0 = n1, the next state is either 0 or 1; if n0 > n1, the next state is 0; if n0 < n1, the next state is 1, where n0 and n1 are the numbers of occurrences of next state 0 and 1 for a neighborhood configuration
Chromosome format: <g1, g2, ..., gk>, where each gene gx is a basin of a pattern Px and a chromosome has k genes
Each gene is a single-cycle directed sub-graph with p nodes, where p = 1 + n
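A minimal sketch (illustration only) of deriving a GMACA rule vector from a state-transition table with the majority-vote collision rule above; neighborhood configurations that are tied, or never seen, get an arbitrary bit.

```python
import random

def gmaca_rule_vector(transitions, n):
    """Derive a per-cell rule vector from (present_state, next_state) pairs.

    For every cell and every 3-bit neighborhood configuration, count how often
    the next state is 0 (n0) or 1 (n1) and keep the majority; null boundary.
    """
    rules = []
    for i in range(n):
        counts = [[0, 0] for _ in range(8)]   # counts[config] = [n0, n1]
        for ps, ns in transitions:
            left = ps[i - 1] if i > 0 else 0
            right = ps[i + 1] if i < n - 1 else 0
            config = (left << 2) | (ps[i] << 1) | right
            counts[config][ns[i]] += 1
        rule = 0
        for config, (n0, n1) in enumerate(counts):
            if n1 > n0 or (n1 == n0 and random.random() < 0.5):
                rule |= 1 << config
        rules.append(rule)
    return rules

# Hypothetical 4-cell state transitions drawn from two basins
transitions = [([0, 1, 1, 0], [0, 0, 1, 0]),
               ([1, 1, 0, 0], [0, 0, 1, 0]),
               ([0, 0, 1, 1], [0, 0, 1, 0])]
print(gmaca_rule_vector(transitions, n=4))
```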

34 Maximum Permissible Noise/Height
Minimum value of the maximum permissible height: h_max = 2
Minimum value of the maximum permissible noise: r_max = 1

35 Performance Analysis of GMACA
Higher memorizing capacity than the Hopfield network
Cost of computation is constant; it depends only on the transient length of the CA

36 Basins of Attraction (Experimental)
(Slide plots the experimentally observed basins of attraction for n = 50 and k = 10)

37 Distribution of CA Rule (Experimental)

38 Conclusion
Advantages:
Explores the computational capability of non-linear MACA
Characterization of the basins of attraction of GMACA
Fundamental results to characterize GMACA rules
Reverse engineering method to synthesize GMACA; combination of the reverse engineering method and GA
Higher memorizing capacity than the Hopfield network
Problem: can handle only binary patterns
Solution: fuzzy MACA (FMACA)

39 Fuzzy Cellular Automata (FCA)
A linear array of cells; each cell assumes a state that is a rational value in [0, 1]
Combines fuzzy logic and Cellular Automata
Out of 256 rules, 16 are OR and NOR rules (including rules 0 and 255)
Boolean function operation -> FCA operation:
OR (a + b) -> min{1, (a + b)}
AND (a . b) -> (a . b)
NOT (~a) -> (1 - a)

40 Fuzzy Cellular Automata (FCA)
OR Logic:
Rule 170 : q_i(t+1) = q_{i+1}(t)
Rule 204 : q_i(t+1) = q_i(t)
Rule 238 : q_i(t+1) = q_i(t) + q_{i+1}(t)
Rule 240 : q_i(t+1) = q_{i-1}(t)
Rule 250 : q_i(t+1) = q_{i-1}(t) + q_{i+1}(t)
Rule 252 : q_i(t+1) = q_{i-1}(t) + q_i(t)
Rule 254 : q_i(t+1) = q_{i-1}(t) + q_i(t) + q_{i+1}(t)
NOR Logic (complement of the corresponding OR rule):
Rule 85 : q_i(t+1) = NOT q_{i+1}(t)
Rule 51 : q_i(t+1) = NOT q_i(t)
Rule 17 : q_i(t+1) = NOT ( q_i(t) + q_{i+1}(t) )
Rule 15 : q_i(t+1) = NOT q_{i-1}(t)
Rule 5 : q_i(t+1) = NOT ( q_{i-1}(t) + q_{i+1}(t) )
Rule 3 : q_i(t+1) = NOT ( q_{i-1}(t) + q_i(t) )
Rule 1 : q_i(t+1) = NOT ( q_{i-1}(t) + q_i(t) + q_{i+1}(t) )
(here + denotes the FCA OR, min{1, a + b}, and NOT a = 1 - a)

41 Fuzzy Cellular Automata (FCA)
The 16 OR and NOR rules can be represented by an n x n matrix T and an n-dimensional binary vector F
S_i(t) represents the state of the i-th cell at time t:
S_i(t+1) = | F_i - min{1, Σ_j T_ij . S_j(t)} |
where T_ij = 1 if the next state of the i-th cell depends on the j-th cell, and 0 otherwise
F is the inversion vector; it contains a 1 wherever a NOR rule is applied
Example: a 4-cell null-boundary hybrid FCA <238, 1, 238, 3> (the slide shows its T and F)
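A minimal sketch of this FCA update rule, with the absolute value written explicitly; the T and F shown are one reading of the 4-cell example <238, 1, 238, 3> and should be treated as an assumption.

```python
def fca_step(T, F, S):
    """FCA next state: S_i(t+1) = | F_i - min(1, sum_j T_ij * S_j(t)) |."""
    n = len(S)
    return [abs(F[i] - min(1.0, sum(T[i][j] * S[j] for j in range(n))))
            for i in range(n)]

# Assumed T and F for the 4-cell null-boundary hybrid FCA <238, 1, 238, 3>
T = [[1, 1, 0, 0],   # rule 238: depends on self and right neighbor
     [1, 1, 1, 0],   # rule 1:   depends on left, self, right (NOR)
     [0, 0, 1, 1],   # rule 238
     [0, 0, 1, 1]]   # rule 3:   depends on left and self (NOR)
F = [0, 1, 0, 1]     # a 1 marks the cells running NOR rules
S = [0.2, 0.7, 0.0, 0.4]
print(fca_step(T, F, S))
```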

42 Fuzzy Multiple Attractor CA (FMACA)

43 Fuzzy Multiple Attractor CA (FMACA)
Dependency Vector (DV) corresponding to the matrix; Derived Complement Vector (DCV) corresponding to the inversion vector
The pivot cell (PC) represents an attractor basin uniquely
State of the pivot cell of the attractor of the basin where a state S(t) belongs:
q_m = min{1, Σ_j | DCV_j - DV_j . S_j(t) |}
Sizes of attractor basins can be equal as well as unequal
The matrix and inversion vector can be reconstructed from the DV/DCV
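A small sketch of the pivot-cell computation above, again with the absolute value made explicit; the DV, DCV, and state are hypothetical values for illustration.

```python
def pivot_state(DV, DCV, S):
    """State of the pivot cell for the basin containing S:
    q_m = min(1, sum_j |DCV_j - DV_j * S_j|)."""
    return min(1.0, sum(abs(dcv - dv * s) for dv, dcv, s in zip(DV, DCV, S)))

# Hypothetical DV/DCV pair and a fuzzy state
DV = [1, 0, 1, 0]
DCV = [0, 0, 1, 0]
S = [0.3, 0.9, 0.5, 0.1]
print(pivot_state(DV, DCV, S))   # identifies the attractor basin of S
```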

44 Fuzzy Multiple Attractor CA (FMACA)
(Slide shows an example FMACA: its matrix T, inversion vector F, Dependency Vector, and Derived Complement Vector)

45 Fuzzy Multiple Attractor CA (FMACA)
(Slide shows a second example FMACA, this one with 1s in the inversion vector F: its T, F, Dependency Vector, and Derived Complement Vector)

46 FMACA based Tree-Structured Classifier
FMACA based tree-structured pattern classifier
Can handle binary as well as real valued datasets
Provides equal and unequal sizes of attractor basins
Combination of GA and DV/DCV
(Slide shows a tree of FMACAs - FMACA 1 at the root splitting classes I-IV over FMACA 2, 3, and 4)

47 Experimental Setup
Randomly generate K centroids; around each centroid, generate t tuples
50% of the patterns are taken for training and 50% for testing
(Slide's figure marks the centroids A1, A2, ..., AK and the distances D_min and d_max)
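A minimal sketch of this dataset-generation step; the Gaussian spread around each centroid and the 50/50 split are illustrative choices, not the thesis's exact generator.

```python
import numpy as np

def make_dataset(K, t, dim, spread=0.1, seed=0):
    """Generate K random centroids and t tuples around each; split 50/50."""
    rng = np.random.default_rng(seed)
    centroids = rng.random((K, dim))
    X = np.vstack([c + spread * rng.standard_normal((t, dim)) for c in centroids])
    y = np.repeat(np.arange(K), t)
    idx = rng.permutation(len(y))
    half = len(y) // 2
    train, test = idx[:half], idx[half:]
    return (X[train], y[train]), (X[test], y[test])

# Example matching the slide's (n=5, k=2, t=4000) configuration
(train_X, train_y), (test_X, test_y) = make_dataset(K=2, t=4000, dim=5)
print(train_X.shape, test_X.shape)
```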

48 Performance Analysis of FMACA
Generalization of the FMACA tree (the slide tabulates depth, breadth, training accuracy, and testing accuracy per dataset, e.g. n=5, k=2, t=4000)
Depth: number of layers from root to leaf; Breadth: number of intermediate nodes
Can generalize a dataset irrespective of the number of classes, tuples, and attributes
(A second table compares the accuracy of FMACA and C4.5 for different numbers of attributes (n), sizes (t), and classes)
Higher classification accuracy compared to C4.5

49 Performance Analysis of FMACA
(Slide tabulates the generation time (ms) and retrieval time (ms) of FMACA and C4.5 for the datasets (n=5,k=2,t=2000), (n=5,k=2,t=20000), (n=6,k=2,t=2000), (n=6,k=2,t=20000), and the memory overhead for different numbers of attributes (n), sizes (t), and classes)
High generation time, but it is a one-time cost
Lower retrieval time compared to C4.5
Lower memory overhead compared to C4.5

50 Performance on STATLOG Dataset
(Slide tabulates classification accuracy (%), number of CA cells, number of tree nodes, memory overhead, and retrieval time (ms) of FMACA and MACA on the DNA, Satimage, Shuttle, and Letter datasets)
Comparable accuracy, fewer CA cells, lower memory overhead, lower retrieval time

51 Conclusion
Introduction of fuzzy CA in pattern recognition
New mathematical tools: dependency matrix, dependency vector, complement vector, derived complement vector
Reduction of the complexity of identifying attractors from O(n^3) to O(n)
Both equal and unequal sizes of attractor basins; movement of patterns from one basin to another
Reduction of the search space; elegant evolutionary algorithm combining DV/DCV and GA
FMACA based tree-structured pattern classifier

52 Future Extensions
Applications in pattern clustering and mixed-mode learning
Theoretical analysis of the memorizing capacity of non-linear CA
Combination of fuzzy sets and fuzzy CA
1-D CA to 2-D CA; Boolean CA to multi-valued / hierarchical CA
Development of hybrid systems using CA: CA + neural network + fuzzy set; CA + fuzzy set + rough set
Applications of CA in bioinformatics, medical image analysis, image compression, and data mining

53 Thank You
