USING ARTIFICIAL NEURAL NETWORKS FOR PLANT TAXONOMIC DETERMINATION BASED ON CHLOROPHYLL FLUORESCENCE INDUCTION CURVES


Mariya Kirova 1, 3, Georgina Ceppi 2, Petko Chernev 1, Vasilij Goltsev 1*, Reto Strasser 2
1 Dept. of Biophysics and Radiobiology, Biological Faculty, Sofia University, Sofia, Bulgaria; 2 Lab. of Bioenergetics, Science Faculty, University of Geneva, Geneva, Switzerland; 3 Biomed Future Ltd., Sofia, Bulgaria (current address)
*Correspondence to Vasilij Goltsev: goltsev@biofac.uni-sofia.bg

ABSTRACT
The photosynthetic apparatus is a highly conserved element of the plant cell, and its characteristics are comparatively similar regardless of where a plant stands in the taxonomic classification. A question of great interest is whether a highly informative method can be applied for taxonomic classification of plants on the basis of their photosynthetic characteristics. As such a method we used artificial neural networks, and as the source of information about the photosynthetic apparatus we used chlorophyll fluorescence induction curves. We used a feedforward neural network with back-propagation of error signals through the connections of the network. The designed network has two layers, five hidden neurons and a logarithmic sigmoidal transfer function. As input the network receives the numerical equivalent of the induction curves. In our studies we used induction curves from 35 species belonging to 19 families from 17 orders divided into 11 superorders, all from one of the two major groups of flowering plants (Monocotyledones, Dicotyledones). Altogether we used 2016 curves (61 curves per plant for most of the species). When a neural network is trained with the whole data set, the error is very high (over 50%). That is why we developed a system of neural networks, each of which divides a taxonomic rank into its subgroups.
This step-by-step approach starts with analysis of the whole data set and ends with the classification of each family into the species used in the studies. The system allows taxonomic classification with high accuracy (the error is under 5%). Considering the above, we conclude that the photosynthetic apparatus contains information about the genotype of plants and can be used for their taxonomic classification.

Keywords: Artificial neural networks, JIP-test, Mathematical modeling, Taxonomy, Variable chlorophyll fluorescence

Introduction
The photosynthetic apparatus is the main structural and functional element of the plant cell. Its reaction centers are highly conserved, with low species specificity. The characteristics of the antenna complexes, on the other hand, are highly variable and specific for various groups of plant species (1). This makes it possible to use chlorophyll fluorescence, which depends on the reaction centers and the electron transport chains but is emitted by the antenna complexes, for taxonomic classification. The emission itself does not contain obvious species-specific characteristics. That is why we suggest applying a highly informative method for dividing plant objects into systematic groups based on their photosynthetic characteristics as manifested through chlorophyll fluorescence transients (2). These characteristics can be analyzed with artificial neural networks, which allow us to check whether such a possibility for plant classification exists.

Materials and methods
In the current work we used chlorophyll fluorescence induction curves measured with a Handy PEA fluorimeter (Plant Efficiency Analyzer, Hansatech Instruments Ltd., King's Lynn, Norfolk, UK) (3). All plants used were in a normal physiological state. These were 35 species belonging to 19 families from 17 orders divided into 11 superorders, all of which were from one of

the two major groups of flowering plants (Monocotyledones, Dicotyledones). Altogether we used 2016 curves (61 curves per plant for most of the species).
The JIP test is a method for in vivo description of the behaviour of the photosynthetic apparatus (3, 4). The test can be described as a set of mathematical operations that allows the user to transform information derived from the measured OJIP transients into estimates of energy fluxes (absorption, trapping, electron transport and dissipation) per reaction center or per cross section. The most popular parameter of the JIP test, used to characterize the total yield of the system, is the so-called performance index, PI.
We created, trained and tested the artificial neural networks in the standard software product MATLAB. We created a feedforward neural network with back-propagation. To increase the network efficiency, the input data underwent preliminary principal component analysis (PCA). The designed network had two layers, five hidden neurons, a logarithmic sigmoidal transfer function and 640 epochs of training. As input the network received the numerical equivalent of the induction curves. Three quarters of the input data were used for network training. To avoid possible overfitting we used Bayesian regularization (5, 6); additionally, for better generalization, the network was trained with early stopping. The last quarter of the input data was used to examine the ability of the trained network to recognize unknown curves correctly.

Results and Discussion
To examine whether there is a connection between the test parameters and the taxonomic identity of a particular species or group, we applied the JIP test to the set of chlorophyll fluorescence induction curves used. The JIP test was made for a chosen set of the input curves, and the derived parameters were then analyzed in two different ways (Fig. 1; Fig. 2).
Fig. 1.
Three-dimensional parametric visualization of the JIP test results for six plant species from two families. A. Three-parameter representation of the JIP test results for the plant species Geranium endressii, Geranium himalayense, Pelargonium peltatum, Pelargonium hotatum (Geraniaceae). B. Three-parameter representation of the JIP test results for the plant species Ludisia discolor, Oncidium marshallianum (Orchidaceae). C. Three-parameter representation of the JIP test results for plant objects from two different families (Geraniaceae, Orchidaceae).

To check whether they had species-specific behaviour, we chose three photosynthetically important JIP test parameters (PI ABS,, ) and presented them in a three-parameter diagram (Fig. 1). This comparative analysis was made for species from a single family (Fig. 1A, B) and for two taxonomically (and hence genetically) distant families (Fig. 1C). Fig. 1A shows that there are areas in the three-dimensional space which characterize the values of the three parameters for the four Geraniaceae species used in the studies, and that their overlap is small. When this approach was applied to the species from the two families simultaneously, there was considerable overlap between the values of the parameters in the three-parameter space (Fig. 1C). Considering these results, we concluded that the three-parameter test can distinguish clearly enough between objects from one taxonomic group. At the same time, when the test is applied to objects with a greater genome difference (e.g. species from two different families), the species specificity disappears. Therefore the three-parameter description of a particular object is not sufficient for its clear taxonomic classification.
Fig. 2. Variations in the JIP test parameters for representatives of the different taxonomic levels. On each axis one of the chosen JIP test parameters is plotted. The results are presented as ratios to the average values of the same parameters for the previous taxonomic group (one level higher). A. Spider plot for superorders Lilianae and Commelinanae; the parameters are presented as fractions of those for Liliopsida. B.
Spider plot for orders Asparagales and Liliales; the parameters are presented as fractions of those for Lilianae. C. Spider plot for families Orchidaceae and Iridaceae; the parameters are presented as fractions of those for Asparagales. D. Spider plot for Ludisia discolor and Oncidium marshallianum; the parameters are presented as fractions of those for Orchidaceae.
Yet, could analysis and comparison of a greater number of JIP test parameters be used for taxonomic classification? To present the multiparametric JIP test results we used spider plots (Fig. 2). Eight JIP test parameters were chosen (φPo, ΔEo,, RC/CSo, ΔRo,,, PI(total)). Each axis of the plot corresponded to one of the eight parameters, and the results were represented as fractions of the average values for the particular taxonomic level.
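The normalization used in these spider plots, each parameter expressed as a ratio to the average of the parent taxonomic group, can be sketched as follows. This is an illustrative Python sketch only; the parameter names and numeric values are invented for the example, not taken from the study.

```python
from statistics import mean

# Illustrative JIP-test parameter values (hypothetical numbers, not the
# authors' measurements): each record is (superorder, {parameter: value}).
samples = [
    ("Lilianae",     {"phi_Po": 0.82, "PI_total": 2.4}),
    ("Lilianae",     {"phi_Po": 0.80, "PI_total": 2.0}),
    ("Commelinanae", {"phi_Po": 0.78, "PI_total": 1.6}),
    ("Commelinanae", {"phi_Po": 0.76, "PI_total": 1.2}),
]

def group_mean(records, param):
    """Average of one parameter over a list of (label, params) records."""
    return mean(rec[param] for _, rec in records)

def relative_profile(records, group, parent_records, params):
    """Mean of each parameter within `group`, divided by the mean of the
    same parameter over the parent taxonomic level (one level higher).
    A value of 1.0 means no deviation from the parent-level average."""
    own = [rec for rec in records if rec[0] == group]
    return {p: group_mean(own, p) / group_mean(parent_records, p)
            for p in params}

# Parent level here is the whole class (all samples), mirroring panel A,
# where superorders are normalized to the Liliopsida average.
params = ["phi_Po", "PI_total"]
for g in ("Lilianae", "Commelinanae"):
    print(g, relative_profile(samples, g, samples, params))
```

Each resulting dictionary gives one polygon of the spider plot: one ratio per axis for the chosen taxonomic group.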

The data for the different taxonomic levels were plotted, from superorder to species. This way of presentation allowed us to follow the dependence of the different JIP test parameters on the taxonomic classification of the object for which the parameters had been evaluated. The average value of each parameter for a particular taxonomic level was used as the reference value (= 1). Our results (Fig. 2A-D) showed that the different JIP test parameters depend differently on the taxonomic classification of the test object. This creates a specific profile of the alteration of these chlorophyll fluorescence characteristics for a particular test object; this profile can be considered species-specific and can contribute to the taxonomic classification of a plant.
The information in the chlorophyll fluorescence induction curves can also be extracted by mathematical approaches without preliminary calculation of the JIP test parameters. This can be achieved with artificial neural networks. For the neural network training we used the induction curves described in the Materials and methods section. When the network was trained with the whole data set, the training was not efficient enough and the error was quite high (55.4% with Bayesian regularization; 53.3% with early stopping). To increase the recognition efficiency we developed a system of several neural networks that process the input data set consecutively. Each of these networks divided the data set for a particular taxonomic group into the subgroups used in the experiment; at the next stage, each of these subgroups was divided into its sub-subgroups. This step-by-step method started with processing of the whole input data set and ended with the classification of each family into the species used in these studies. The training results for the system of neural networks are summarized in Table 1.
Table 1. Summary of the results from the neural network training.
Above each line the error of the network trained with Bayesian regularization is given; under each line, the error of the network trained with early stopping.
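The cascade idea, one network per taxonomic node, routing each curve from class down to family, can be sketched in Python. This is a minimal illustration, not the authors' implementation (which was written in MATLAB): scikit-learn's PCA and MLPClassifier stand in for the PCA preprocessing and the two-layer logistic-sigmoid network, the L2 penalty `alpha` is only a rough stand-in for Bayesian regularization, the curves are synthetic, and the hierarchy labels (including "Rosaceae") are illustrative rather than taken from the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

def make_curves(offset, n=60, length=30):
    """Synthetic stand-ins for digitized OJIP induction curves."""
    t = np.linspace(0, 1, length)
    return offset * np.exp(-t) + 0.02 * rng.standard_normal((n, length))

# Hypothetical two-level hierarchy: class -> family (labels illustrative).
data = {
    ("Monocots", "Orchidaceae"): make_curves(1.0),
    ("Monocots", "Iridaceae"):   make_curves(2.0),
    ("Dicots",   "Geraniaceae"): make_curves(3.0),
    ("Dicots",   "Rosaceae"):    make_curves(4.0),
}

def make_net():
    # Five hidden neurons, logistic (sigmoid) transfer function, up to 640
    # training epochs -- echoing the configuration described in the text.
    return make_pipeline(
        PCA(n_components=5),
        MLPClassifier(hidden_layer_sizes=(5,), activation="logistic",
                      alpha=1e-3, max_iter=640, random_state=0),
    )

X = np.vstack(list(data.values()))
top_y = np.repeat([k[0] for k in data], 60)   # class label per curve
fam_y = np.repeat([k[1] for k in data], 60)   # family label per curve

# Stage 1: one network separates the top-level groups.
top_net = make_net().fit(X, top_y)

# Stage 2: one network per top-level group separates its families.
sub_nets = {}
for grp in set(top_y):
    mask = top_y == grp
    sub_nets[grp] = make_net().fit(X[mask], fam_y[mask])

def classify(curve):
    """Route a curve through the cascade: class first, then family."""
    grp = top_net.predict(curve[None, :])[0]
    return grp, sub_nets[grp].predict(curve[None, :])[0]

print(classify(data[("Monocots", "Orchidaceae")][0]))
```

Because each network in the cascade only has to separate a handful of sibling groups, each subproblem is far easier than the single 35-way classification that failed on the whole data set.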

Conclusion
The described results show that the constellation of functional characteristics of the photosynthetic process contains enough information about the taxonomic classification of the studied plants. The application of trained artificial neural networks allows this species-specific information to be extracted successfully, so that an automated screening analysis of plant populations can be achieved.

Acknowledgment
We thank the Bulgarian National Science Fund, Project No DO /, for financial support. R.J.S. acknowledges support by the Swiss National Science Foundation, Project Nr:

REFERENCES
1. Ke B. (2001) Photosynthesis: Photobiochemistry and Photobiophysics. Kluwer Acad. Publ., Dordrecht.
2. Strasser R.J., Srivastava A., Govindjee (1995) Photochem Photobiol 61.
3. Strasser R., Srivastava A., Tsimilli-Michael M. (2000) Probing Photosynthesis. CRC.
4. Tsimilli-Michael M., Strasser R. (2008) Mycorrhiza. Springer, Berlin Heidelberg.
5. McQuarrie A., Tsai C. (1998) Regression and Time Series Model Selection. World Scientific, Singapore.
6. Schwarz G. (1978) Annals of Statistics, 6.


More information

Artificial Neural Network

Artificial Neural Network Artificial Neural Network Contents 2 What is ANN? Biological Neuron Structure of Neuron Types of Neuron Models of Neuron Analogy with human NN Perceptron OCR Multilayer Neural Network Back propagation

More information

Nonlinear Classification

Nonlinear Classification Nonlinear Classification INFO-4604, Applied Machine Learning University of Colorado Boulder October 5-10, 2017 Prof. Michael Paul Linear Classification Most classifiers we ve seen use linear functions

More information

Ch.6 Deep Feedforward Networks (2/3)

Ch.6 Deep Feedforward Networks (2/3) Ch.6 Deep Feedforward Networks (2/3) 16. 10. 17. (Mon.) System Software Lab., Dept. of Mechanical & Information Eng. Woonggy Kim 1 Contents 6.3. Hidden Units 6.3.1. Rectified Linear Units and Their Generalizations

More information

Modelling the shape of electron beam welding joints by neural networks

Modelling the shape of electron beam welding joints by neural networks Journal of Physics: Conference Series PAPER OPEN ACCESS Modelling the shape of electron beam welding joints by neural networks To cite this article: T S Tsonevska et al 2018 J. Phys.: Conf. Ser. 1089 012008

More information

Lecture 7 Artificial neural networks: Supervised learning

Lecture 7 Artificial neural networks: Supervised learning Lecture 7 Artificial neural networks: Supervised learning Introduction, or how the brain works The neuron as a simple computing element The perceptron Multilayer neural networks Accelerated learning in

More information

Artificial Neural Network Approach for Land Cover Classification of Fused Hyperspectral and Lidar Data

Artificial Neural Network Approach for Land Cover Classification of Fused Hyperspectral and Lidar Data Artificial Neural Network Approach for Land Cover Classification of Fused Hyperspectral and Lidar Data Paris Giampouras 1,2, Eleni Charou 1, and Anastasios Kesidis 3 1 Computational Intelligence Laboratory,

More information

Artificial Neural Network

Artificial Neural Network Artificial Neural Network Eung Je Woo Department of Biomedical Engineering Impedance Imaging Research Center (IIRC) Kyung Hee University Korea ejwoo@khu.ac.kr Neuron and Neuron Model McCulloch and Pitts

More information

Neural Networks DWML, /25

Neural Networks DWML, /25 DWML, 2007 /25 Neural networks: Biological and artificial Consider humans: Neuron switching time 0.00 second Number of neurons 0 0 Connections per neuron 0 4-0 5 Scene recognition time 0. sec 00 inference

More information

22c145-Fall 01: Neural Networks. Neural Networks. Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1

22c145-Fall 01: Neural Networks. Neural Networks. Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1 Neural Networks Readings: Chapter 19 of Russell & Norvig. Cesare Tinelli 1 Brains as Computational Devices Brains advantages with respect to digital computers: Massively parallel Fault-tolerant Reliable

More information

CSE 352 (AI) LECTURE NOTES Professor Anita Wasilewska. NEURAL NETWORKS Learning

CSE 352 (AI) LECTURE NOTES Professor Anita Wasilewska. NEURAL NETWORKS Learning CSE 352 (AI) LECTURE NOTES Professor Anita Wasilewska NEURAL NETWORKS Learning Neural Networks Classifier Short Presentation INPUT: classification data, i.e. it contains an classification (class) attribute.

More information

ARTIFICIAL INTELLIGENCE. Artificial Neural Networks

ARTIFICIAL INTELLIGENCE. Artificial Neural Networks INFOB2KI 2017-2018 Utrecht University The Netherlands ARTIFICIAL INTELLIGENCE Artificial Neural Networks Lecturer: Silja Renooij These slides are part of the INFOB2KI Course Notes available from www.cs.uu.nl/docs/vakken/b2ki/schema.html

More information

COMP-4360 Machine Learning Neural Networks

COMP-4360 Machine Learning Neural Networks COMP-4360 Machine Learning Neural Networks Jacky Baltes Autonomous Agents Lab University of Manitoba Winnipeg, Canada R3T 2N2 Email: jacky@cs.umanitoba.ca WWW: http://www.cs.umanitoba.ca/~jacky http://aalab.cs.umanitoba.ca

More information

Artificial Neural Networks Examination, June 2004

Artificial Neural Networks Examination, June 2004 Artificial Neural Networks Examination, June 2004 Instructions There are SIXTY questions (worth up to 60 marks). The exam mark (maximum 60) will be added to the mark obtained in the laborations (maximum

More information

Unit 8: Introduction to neural networks. Perceptrons

Unit 8: Introduction to neural networks. Perceptrons Unit 8: Introduction to neural networks. Perceptrons D. Balbontín Noval F. J. Martín Mateos J. L. Ruiz Reina A. Riscos Núñez Departamento de Ciencias de la Computación e Inteligencia Artificial Universidad

More information

Neural Network Tutorial & Application in Nuclear Physics. Weiguang Jiang ( 蒋炜光 ) UTK / ORNL

Neural Network Tutorial & Application in Nuclear Physics. Weiguang Jiang ( 蒋炜光 ) UTK / ORNL Neural Network Tutorial & Application in Nuclear Physics Weiguang Jiang ( 蒋炜光 ) UTK / ORNL Machine Learning Logistic Regression Gaussian Processes Neural Network Support vector machine Random Forest Genetic

More information

Comparing Robustness of Pairwise and Multiclass Neural-Network Systems for Face Recognition

Comparing Robustness of Pairwise and Multiclass Neural-Network Systems for Face Recognition Comparing Robustness of Pairwise and Multiclass Neural-Network Systems for Face Recognition J. Uglov, V. Schetinin, C. Maple Computing and Information System Department, University of Bedfordshire, Luton,

More information

I. Molecules and Cells: Cells are the structural and functional units of life; cellular processes are based on physical and chemical changes.

I. Molecules and Cells: Cells are the structural and functional units of life; cellular processes are based on physical and chemical changes. I. Molecules and Cells: Cells are the structural and functional units of life; cellular processes are based on physical and chemical changes. A. Chemistry of Life B. Cells 1. Water How do the unique chemical

More information

POWER SYSTEM DYNAMIC SECURITY ASSESSMENT CLASSICAL TO MODERN APPROACH

POWER SYSTEM DYNAMIC SECURITY ASSESSMENT CLASSICAL TO MODERN APPROACH Abstract POWER SYSTEM DYNAMIC SECURITY ASSESSMENT CLASSICAL TO MODERN APPROACH A.H.M.A.Rahim S.K.Chakravarthy Department of Electrical Engineering K.F. University of Petroleum and Minerals Dhahran. Dynamic

More information

Neural Networks biological neuron artificial neuron 1

Neural Networks biological neuron artificial neuron 1 Neural Networks biological neuron artificial neuron 1 A two-layer neural network Output layer (activation represents classification) Weighted connections Hidden layer ( internal representation ) Input

More information

CMSC 421: Neural Computation. Applications of Neural Networks

CMSC 421: Neural Computation. Applications of Neural Networks CMSC 42: Neural Computation definition synonyms neural networks artificial neural networks neural modeling connectionist models parallel distributed processing AI perspective Applications of Neural Networks

More information

Multitask Learning of Environmental Spatial Data

Multitask Learning of Environmental Spatial Data 9th International Congress on Environmental Modelling and Software Brigham Young University BYU ScholarsArchive 6th International Congress on Environmental Modelling and Software - Leipzig, Germany - July

More information

Unit III. A Survey of Neural Network Model

Unit III. A Survey of Neural Network Model Unit III A Survey of Neural Network Model 1 Single Layer Perceptron Perceptron the first adaptive network architecture was invented by Frank Rosenblatt in 1957. It can be used for the classification of

More information

Feedforward Neural Nets and Backpropagation

Feedforward Neural Nets and Backpropagation Feedforward Neural Nets and Backpropagation Julie Nutini University of British Columbia MLRG September 28 th, 2016 1 / 23 Supervised Learning Roadmap Supervised Learning: Assume that we are given the features

More information

Backpropagation and his application in ECG classification

Backpropagation and his application in ECG classification University of Ostrava Institute for Research and Applications of Fuzzy Modeling Backpropagation and his application in ECG classification Ondřej Polakovič Research report No. 75 2005 Submitted/to appear:

More information

Calculation of the heat power consumption in the heat exchanger using artificial neural network

Calculation of the heat power consumption in the heat exchanger using artificial neural network 9 th International Conference on Quantitative Infraed Thermography July -5,, Krakow - Poland Calculation of the heat power consumption in the heat exchanger using artificial neural network by S. Dudzik*

More information

Detection of Chlorophyll Content of Rice Leaves by Chlorophyll Fluorescence Spectrum Based on PCA-ANN Zhou Lina1,a Cheng Shuchao1,b Yu Haiye2,c 1

Detection of Chlorophyll Content of Rice Leaves by Chlorophyll Fluorescence Spectrum Based on PCA-ANN Zhou Lina1,a Cheng Shuchao1,b Yu Haiye2,c 1 7th International Conference on Mechatronics, Control and Materials (ICMCM 2016) Detection of Chlorophyll Content of Rice Leaves by Chlorophyll Fluorescence Spectrum Based on PCA-ANN Zhou Lina1,a Cheng

More information

Artificial Neural Networks. MGS Lecture 2

Artificial Neural Networks. MGS Lecture 2 Artificial Neural Networks MGS 2018 - Lecture 2 OVERVIEW Biological Neural Networks Cell Topology: Input, Output, and Hidden Layers Functional description Cost functions Training ANNs Back-Propagation

More information

AI Programming CS F-20 Neural Networks

AI Programming CS F-20 Neural Networks AI Programming CS662-2008F-20 Neural Networks David Galles Department of Computer Science University of San Francisco 20-0: Symbolic AI Most of this class has been focused on Symbolic AI Focus or symbols

More information

The application of neural networks to the paper-making industry

The application of neural networks to the paper-making industry The application of neural networks to the paper-making industry P. J. Edwards y, A.F. Murray y, G. Papadopoulos y, A.R. Wallace y and J. Barnard x y Dept. of Electronics and Electrical Eng., Edinburgh

More information

Address for Correspondence

Address for Correspondence Research Article APPLICATION OF ARTIFICIAL NEURAL NETWORK FOR INTERFERENCE STUDIES OF LOW-RISE BUILDINGS 1 Narayan K*, 2 Gairola A Address for Correspondence 1 Associate Professor, Department of Civil

More information

Introduction to Neural Networks

Introduction to Neural Networks Introduction to Neural Networks What are (Artificial) Neural Networks? Models of the brain and nervous system Highly parallel Process information much more like the brain than a serial computer Learning

More information

Application of Artificial Neural Networks in Evaluation and Identification of Electrical Loss in Transformers According to the Energy Consumption

Application of Artificial Neural Networks in Evaluation and Identification of Electrical Loss in Transformers According to the Energy Consumption Application of Artificial Neural Networks in Evaluation and Identification of Electrical Loss in Transformers According to the Energy Consumption ANDRÉ NUNES DE SOUZA, JOSÉ ALFREDO C. ULSON, IVAN NUNES

More information

Module 12. Machine Learning. Version 2 CSE IIT, Kharagpur

Module 12. Machine Learning. Version 2 CSE IIT, Kharagpur Module 12 Machine Learning Lesson 39 Neural Networks - III 12.4.4 Multi-Layer Perceptrons In contrast to perceptrons, multilayer networks can learn not only multiple decision boundaries, but the boundaries

More information

Backpropagation Neural Net

Backpropagation Neural Net Backpropagation Neural Net As is the case with most neural networks, the aim of Backpropagation is to train the net to achieve a balance between the ability to respond correctly to the input patterns that

More information

A New Hybrid System for Recognition of Handwritten-Script

A New Hybrid System for Recognition of Handwritten-Script computing@tanet.edu.te.ua www.tanet.edu.te.ua/computing ISSN 177-69 A New Hybrid System for Recognition of Handwritten-Script Khalid Saeed 1) and Marek Tabdzki ) Faculty of Computer Science, Bialystok

More information

Confidence Estimation Methods for Neural Networks: A Practical Comparison

Confidence Estimation Methods for Neural Networks: A Practical Comparison , 6-8 000, Confidence Estimation Methods for : A Practical Comparison G. Papadopoulos, P.J. Edwards, A.F. Murray Department of Electronics and Electrical Engineering, University of Edinburgh Abstract.

More information

Compressed Sensing and Neural Networks

Compressed Sensing and Neural Networks and Jan Vybíral (Charles University & Czech Technical University Prague, Czech Republic) NOMAD Summer Berlin, September 25-29, 2017 1 / 31 Outline Lasso & Introduction Notation Training the network Applications

More information

Artificial Intelligence

Artificial Intelligence Artificial Intelligence Jeff Clune Assistant Professor Evolving Artificial Intelligence Laboratory Announcements Be making progress on your projects! Three Types of Learning Unsupervised Supervised Reinforcement

More information

Apprentissage, réseaux de neurones et modèles graphiques (RCP209) Neural Networks and Deep Learning

Apprentissage, réseaux de neurones et modèles graphiques (RCP209) Neural Networks and Deep Learning Apprentissage, réseaux de neurones et modèles graphiques (RCP209) Neural Networks and Deep Learning Nicolas Thome Prenom.Nom@cnam.fr http://cedric.cnam.fr/vertigo/cours/ml2/ Département Informatique Conservatoire

More information

Miller & Levine Biology

Miller & Levine Biology A Correlation of To the Science Biology A Correlation of, 2014 to the, Table of Contents From Molecules to Organisms: Structures and Processes... 3 Ecosystems: Interactions, Energy, and Dynamics... 4 Heredity:

More information

Forecasting & Futurism

Forecasting & Futurism Article from: Forecasting & Futurism December 2013 Issue 8 A NEAT Approach to Neural Network Structure By Jeff Heaton Jeff Heaton Neural networks are a mainstay of artificial intelligence. These machine-learning

More information

Nucleic acid hybridization assays, detecting genotypes C12Q 1/68. Attention is drawn to the following places, which may be of interest for search:

Nucleic acid hybridization assays, detecting genotypes C12Q 1/68. Attention is drawn to the following places, which may be of interest for search: A01H NEW PLANTS OR PROCESSES FOR OBTAINING THEM; PLANT REPRODUCTION BY TISSUE CULTURE TECHNIQUES New non-transgenic plants (including multicellular algae, multicellular fungi and lichens), plant varieties,

More information

Jakub Hajic Artificial Intelligence Seminar I

Jakub Hajic Artificial Intelligence Seminar I Jakub Hajic Artificial Intelligence Seminar I. 11. 11. 2014 Outline Key concepts Deep Belief Networks Convolutional Neural Networks A couple of questions Convolution Perceptron Feedforward Neural Network

More information