RECOGNITION ALGORITHM FOR DEVELOPING A BRAIN-COMPUTER INTERFACE USING FUNCTIONAL NEAR INFRARED SPECTROSCOPY


The 2012 International Conference on Green Technology and Sustainable Development (GTSD2012)

Ngo Quoc Cuong 1, Nguyen Thanh Hai 2, and Vo Van Toi 2
1 University of Technical Education HCMC, Vietnam
2 International University, VNU, HCMC, Vietnam

ABSTRACT

Research on human Brain-Computer Interfaces (BCI) for diagnosis and rehabilitation has increased in recent years. Cerebral oxygenation and blood flow in particular regions of the human brain can be measured with functional Near Infrared Spectroscopy (fNIRS), a non-invasive technique. This paper describes a recognition algorithm for determining whether a subject taps the left or the right hand. Data collected from a multi-channel system, which contain noise and artifacts, are pre-processed with a Savitzky-Golay filter to obtain smoother signals. Characteristics of the filtered signals during left- and right-hand tapping are then extracted with a polynomial regression algorithm. The polynomial coefficients, which correspond to oxygen-hemoglobin (oxy-Hb) concentration, are used for the recognition of hand tapping. An Artificial Neural Network (ANN) is then employed to classify the obtained coefficient data. Experiments over many trials on one subject illustrate the effectiveness of the proposed method.

KEYWORDS: polynomial regression algorithm, artificial neural networks, hand tapping recognition, functional Near Infrared Spectroscopy.

1. INTRODUCTION

In recent decades, many achievements have been made in imaging and cognitive neuroscience of the human brain. Brain activity can be observed with a number of non-invasive technologies, such as functional Near-Infrared Spectroscopy (fNIRS), Magnetic Resonance Imaging (MRI), and Electroencephalography (EEG).
These technologies have attracted many researchers and many approaches to problems concerning the human brain. fNIRS has become a convenient technology for experimental brain studies. This non-invasive technique emits near-infrared light into the head to measure cerebral hemodynamics and to detect localized changes in blood volume and oxygenation. The change of oxy-Hb over the task period depends on the channel location on the cortex: sustained activation in the motor cortex, transient activation during the initial segments in the somatosensory cortex, and accumulating activation in the frontal lobe [1]. Oxy-Hb concentration at these sites can also be used as a predictive factor that allows prediction of a subject's investment behavior with considerable precision [2]. Wavelet decomposition can serve as a feature extractor, with neural networks as the classification module for cognitive brain tasks [3]; wavelets can also be used to remove artifacts [4]. Based on the slope of a fitted straight line, the tapping hand side can be distinguished [5]. Oxy-Hb and deoxy-Hb can also be used directly with an SVM algorithm for hand tapping recognition [6]. In this paper, we propose a recognition algorithm for developing a brain-computer interface using fNIRS. First of all,

a Savitzky-Golay filter is used to reduce noise and artifacts. Then, features of the oxy-Hb signals are extracted through the coefficients of a polynomial regression. Finally, an Artificial Neural Network determines whether the left or the right hand was tapped. This process is shown in Figure 1.

Figure 1. Recognition algorithm block diagram

2. MATERIALS AND METHODS

2.1 Data Acquisition

We used a multichannel fNIRS instrument, the FOIRE-3000, to acquire oxygen-hemoglobin data. The machine is made by the Shimadzu Corporation of Japan and located in Room 14 of the Biomedical Engineering Department, International University, Vietnam, as shown in Figure 2. Oxy-Hb changes in the motor control area of the brain were captured with a holder set of 20 channels covering both hemispheres, using the fNIRS technique as shown in Figure 3. The system operates at three wavelengths: 780 nm, 805 nm, and 830 nm. The distance between each emitter-detector probe pair was set at 3 cm, and all probes were attached with holders arranged over the two hemispheres, adjusted to each user.

One subject (male, 62 kg, right-handed) participated in this study. After reading and understanding the experimental protocol and the fNIRS technique, he performed hand tapping motions, on both the left and the right side, as motor activities. The protocol was 20 s (rest), 20 s (task), 20 s (rest): the subject relaxed for 20 s, tapped one hand up and down during the 20 s task period, and then rested for another 20 s, as shown in Figure 4.

Figure 2. fNIRS FOIRE-3000 system
Figure 3. Subject head with installed probes
Figure 4. Setting of the experiment protocol

Data were collected from 20 channels, 10 on the left hemisphere and 10 on the right, for hand tapping recognition. However, we chose only the 4 channels of each side that carried reliable information for analysis and feature estimation: channels 2, 5, 6 and 9 on the left hemisphere and channels 12, 15, 16 and 19 on the right hemisphere, located over the motor

Figure 5a. Probe locations (red: emitter, blue: detector) and channels on the motor control area of the left hemisphere
Figure 5b. Probe locations and channels (yellow) on the motor control area of the right hemisphere

control area of the hemispheres, as in Figures 5a and 5b.

2.2 Data Pre-processing

Brain data acquired from the channels contain noise and artifacts. To obtain cleaner data, a Savitzky-Golay filter was applied in this paper. Savitzky-Golay filters [7] are also known as polynomial smoothing filters: samples of the signal are replaced by values that lie on a smoothing curve. In general, a polynomial of order d can smooth a data vector x of arbitrary odd length N, provided N >= d + 1. The vector is centered on a sample with M points on either side, N = 2M + 1:

x = [x_{-M}, ..., x_{-1}, x_0, x_1, ..., x_M]^T  (1)

The N data samples in x are then fitted by a polynomial of order d:

x^_m = c_0 + c_1 m + ... + c_d m^d,  -M <= m <= M  (2)

The smoothed values are estimated as

x^_m = b_m^T x,  m = -M, ..., 0, ..., M  (3)

in which

B = S G^T = G S^T = S F^{-1} S^T = [b_{-M}, ..., b_0, ..., b_M]  (4)

F = S^T S,  G = S F^{-1}  (5)

With a filter of length 11 and order 3, the original and smoothed signals are shown in Figure 6: sudden rises in the original signal caused by noise are smoothed out.

Figure 6. The signal of channel 2 before and after filtering

2.3 Polynomial Regression

A polynomial regression (PR) algorithm [8] models the relationship between the amplitude and the time of a signal. In this paper, the PR algorithm was applied to analyze brain data corresponding to hand tapping tasks. From the processed data, one can distinguish left-hand from right-hand tapping trials through the corresponding difference in oxy-Hb concentration. Assume we have two-dimensional data (x_1, y_1), ..., (x_n, y_n), where neither variable carries information about the other. Our problem is to fit a polynomial curve to such data, so that the relationship between x and y can be found.
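The pre-processing and curve-fitting steps above can be sketched in Python. This is a minimal illustration, assuming SciPy and NumPy; the signal here is synthetic, not the paper's fNIRS data, and the time axis values are assumptions:

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic stand-in for one oxy-Hb channel: a slow hemodynamic trend
# plus measurement noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 57.0, 100)                    # trial time axis (assumed)
x = 0.5 * np.sin(2 * np.pi * t / 60.0) + 0.05 * rng.standard_normal(t.size)

# Savitzky-Golay smoothing with the paper's settings: length 11, order 3.
x_smooth = savgol_filter(x, window_length=11, polyorder=3)

# Fifth-order polynomial regression: the six fitted coefficients are the
# per-channel features used later for classification.
coeffs = np.polyfit(t, x_smooth, deg=5)
print(len(coeffs))  # 6
```

`np.polyfit` solves the same least-squares problem as the normal equations of the PR model, so the six returned values play the role of the estimated coefficients of the order-5 curve.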
Based on the coefficients of a regression curve of order 5, one can characterize the hand tapping task. The polynomial regression equation between the independent variable x and the fitted y can be expressed as follows:

y^ = h^_0 + h^_1 x + h^_2 x^2 + ... + h^_m x^m  (6)

in which h^_0, h^_1, h^_2, ..., h^_m are estimates of h_0, h_1, h_2, ..., h_m, and there are m regressors and n observations (x_i, x_i^2, ..., x_i^m, y_i), i = 1, 2, ..., n. In this equation, the powers of x play the role of different independent variables. The polynomial regression (PR) model can

be written as a system of linear equations:

y = X h + e  (7)

where e = [e_1 e_2 ... e_n]^T is a vector of errors. The ordinary least squares estimate h^ of h minimizes the residual sum of squares. When the inverse of X'X exists, it is given by

h^ = (X'X)^{-1} X'y  (8)

From these coefficients, one can address the problem of recognizing left hand (LH) or right hand (RH) tapping from brain data measured with fNIRS technology. Figure 7 shows the regressed signal of channel 2, whose fitted curve corresponds to Eq. (9); the regression signals of channels 5, 6, 9, 12, 15, 16 and 19 are obtained similarly.

y_c = 0.112x^5 + 0.183x^4 + ... + 2.247x^2 + ... + 0.146  (9)

Figure 7. Smoothed versus regressed signal of filtered channel 2

2.4 Artificial Neural Network

Artificial neural networks are powerful tools for classification and pattern recognition. We use the coefficients estimated by the PR algorithm as features for recognition with a multilayer feed-forward network. The network used here consists of an input layer, one hidden layer, and an output layer; Figure 8 shows this architecture.

Figure 8. Architecture of the classification network

The input samples are the features built from the channel coefficients mentioned above. The number of hidden nodes must be chosen carefully; in practice it can be taken as the average of the numbers of input and output nodes. The hidden layer uses a double sigmoid function, while a sigmoid function is used for the output layer. Standard back-propagation is used for training this 3-layer network. It is a gradient descent algorithm, in which the network weights are moved along the negative of the gradient of the performance function.
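The back-propagation training just described can be sketched in a toy setting. This is an illustrative stand-in, not the paper's network: the layer sizes and data are made up, and plain sigmoids are used in both layers for simplicity:

```python
import numpy as np

# Minimal 3-layer feed-forward network trained by gradient descent on the
# squared error between outputs and desired targets (cf. Eq. (10)).
rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Two separable toy classes with targets [1, 0] (left) and [0, 1] (right).
X = np.vstack([rng.normal(-1.0, 0.3, (20, 2)), rng.normal(1.0, 0.3, (20, 2))])
D = np.vstack([np.tile([1.0, 0.0], (20, 1)), np.tile([0.0, 1.0], (20, 1))])

W1 = rng.normal(0.0, 0.5, (2, 5)); b1 = np.zeros(5)   # input -> hidden
W2 = rng.normal(0.0, 0.5, (5, 2)); b2 = np.zeros(2)   # hidden -> output

lr = 1.0
for _ in range(2000):
    H = sigmoid(X @ W1 + b1)              # hidden activations
    O = sigmoid(H @ W2 + b2)              # network outputs
    dO = (O - D) * O * (1.0 - O)          # gradient through output sigmoid
    dH = (dO @ W2.T) * H * (1.0 - H)      # gradient through hidden sigmoid
    W2 -= lr * H.T @ dO / len(X); b2 -= lr * dO.mean(axis=0)
    W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(axis=0)

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).argmax(axis=1)
accuracy = (pred == D.argmax(axis=1)).mean()
print(accuracy)
```

The weight updates move along the negative gradient of the squared-error function, which is exactly the gradient descent behavior the text attributes to standard back-propagation.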
With this approach, training is based on minimizing the error function:

E = sum_{n=1}^{N} (o_n - d_n)^2  (10)

where N is the number of samples, o_n is the network output, and d_n is the desired output. For left hand tapping the desired output is [1; 0], and [0; 1] is the desired value for right tapping.

3. RESULTS AND DISCUSSION

In this work, the PR algorithm with a fifth-order polynomial has six coefficients, and its equation is as follows:

y = sum_{m=0}^{5} h_m x^m  (11)

where x runs from 0 to 57 seconds with a resolution of 0.57 s. This equation was applied to the regressed signals of channels 2, 5, 6 and 9 (left hemisphere) and channels 12, 15, 16 and 19 (right hemisphere), and right hand tapping was compared with left hand tapping as shown in Figures 9 and 10.

Figure 9. Regressed polynomials of 8 channels during left hand tapping. Blue curves are the signals on the left brain side and red curves are those of the right brain side.

Figure 10. Regressed polynomials of 8 channels during right hand tapping. Blue curves are the signals on the left brain side and red curves are those of the right brain side.

The curves in Figures 9 and 10 are not, by themselves, enough to distinguish right tapping from left tapping. The training data, which are the coefficients of the regressed polynomials, were therefore used to identify the hand tapping tasks. The arrangement of the coefficients is shown in Table 1.

Table 1. Arrangement of the regressed coefficients of the hand tapping tasks used as input to the recognition network

Left hemisphere coefficients:  Ch-2 (h21 ... h26), Ch-5 (h51 ... h56), Ch-6 (h61 ... h66), Ch-9 (h91 ... h96)
Right hemisphere coefficients: Ch-12 (h121 ... h126), Ch-15 (h151 ... h156), Ch-16 (h161 ... h166), Ch-19 (h191 ... h196)

For each hand tapping trial, the regressed coefficients corresponding to the oxy-Hb changes of the two hemispheres are obtained and used to train the above neural network (NN). The six coefficients of each channel in Table 1 form a vector, so one training input of the NN is a vector of 48 elements. Choosing 10 hidden nodes, the performance of the proposed network can be shown in Figure 11.
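Assembling one such input vector can be sketched as follows. The channel numbers follow the arrangement of Table 1, and the coefficient values here are random placeholders, not real fitted coefficients:

```python
import numpy as np

# Eight selected channels: 2, 5, 6, 9 (left hemisphere) and
# 12, 15, 16, 19 (right hemisphere), six coefficients each.
channels = [2, 5, 6, 9, 12, 15, 16, 19]
rng = np.random.default_rng(2)
coeffs_per_channel = {ch: rng.standard_normal(6) for ch in channels}  # placeholders

# Concatenate in channel order to form the 48-element network input.
feature_vector = np.concatenate([coeffs_per_channel[ch] for ch in channels])
print(feature_vector.shape)  # (48,)
```

In a real run the six placeholder values per channel would be replaced by the fitted fifth-order regression coefficients of that channel's smoothed oxy-Hb signal.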
Assume that v_r is a coefficient vector of right hand tapping and v_l a vector of left hand tapping. In one run of the experiment, the subject performed 20 taps, 10 left and 10 right, as shown in Figure 12. The set of left tapping coefficients S_l thus contains 10 vectors, v_l1 to v_l10, and the set of right tapping coefficients S_r contains 10 vectors, v_r1 to v_r10. With 80 sample vectors in total, we used the methodology of 4 runs of 20-fold leave-one-out recognition. To identify left hand tapping, 9 vectors of the S_l set were combined with the 10 vectors of the S_r set for training, and the remaining vector of S_l was used as the test sample. To identify right hand tapping, training used S_l together with 9 vectors of the S_r set, and the remaining right vector was used as the test sample.
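The leave-one-out scheme for one run can be sketched as below. A nearest-class-mean rule stands in for the paper's neural network, and the 48-element feature vectors are synthetic placeholders:

```python
import numpy as np

# One run: 10 left-tap and 10 right-tap coefficient vectors (synthetic).
rng = np.random.default_rng(3)
S_l = rng.normal(-1.0, 0.5, (10, 48))   # left-tap feature vectors
S_r = rng.normal(+1.0, 0.5, (10, 48))   # right-tap feature vectors
X = np.vstack([S_l, S_r])
y = np.array([0] * 10 + [1] * 10)       # 0 = left, 1 = right

# Leave one vector out, train on the remaining 19, test on the held-out one.
correct = 0
for i in range(len(X)):
    train = np.delete(np.arange(len(X)), i)
    mean_l = X[train][y[train] == 0].mean(axis=0)
    mean_r = X[train][y[train] == 1].mean(axis=0)
    pred = 0 if np.linalg.norm(X[i] - mean_l) < np.linalg.norm(X[i] - mean_r) else 1
    correct += int(pred == y[i])

accuracy = correct / len(X)
print(accuracy)
```

Each of the 20 folds mirrors the procedure described above: 9 same-side vectors plus all 10 opposite-side vectors form the training set, and the held-out vector is classified.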

Figure 11. Performance of the network with 48 input nodes, 10 hidden nodes and 2 output nodes. With an error of 0.01, training stops at epoch 164.

Figure 12. One run of the experiment consists of 10 left taps and 10 right taps.

The experimental results are presented in Table 2.

Table 2. Results of 4 runs

Run      | Hand tapping side | Accuracy (%)
1        | Right             | 80
         | Left              | 70
2        | Right             | 90
         | Left              | 80
3        | Right             | 90
         | Left              | 70
4        | Right             | 80
         | Left              | 70
Average  | Right             | 85
         | Left              | 72.5

The accuracy of the hand tapping tasks is at least 70%. In this work, the accuracy for right tapping was always greater than that for left hand tapping: the average accuracy was 85% for right hand tapping, and only 72.5% for left tapping. Even so, these values indicate that the proposed algorithm for recognizing right or left hand tapping is reliable.

4. CONCLUSION

In this paper, raw brain signals recorded during hand tapping tasks were filtered with a Savitzky-Golay filter to produce smooth signals. The smoothed signals of the left and right hand tapping tasks, corresponding to oxy-Hb changes in the brain while the LH or RH moved up and down, were then analyzed with the PR algorithm. For more accurate recognition of the hand tapping trials, an ANN was employed to classify the data. The experimental results showed that one could distinguish whether the subject was performing left or right hand tapping based on the different coefficients of the curves. Our future work will address improving the accuracy of the proposed algorithm and testing it on many subjects.

5. ACKNOWLEDGMENTS

The authors would like to acknowledge the support of Shimadzu through the Shimadzu-IU Research Collaborative Program. We would also like to thank the students of the Biomedical Engineering Department at International University for their support during data collection.

6. REFERENCES

[1] R. J. Gentili, H. Ayaz, P. A. Shewokis, and J. L. Contreras-Vidal, "Hemodynamic Correlates of Visuomotor Motor Adaptation by Functional Near Infrared Spectroscopy," presented at the 32nd Annual International Conference of the IEEE EMBS, 2010.

[2] T. Shimokawa, T. Misawa, and K. Miyagawa, "Predictability of investment behavior from brain information measured by functional near-infrared spectroscopy: a Bayesian neural network model," Neuroscience Research, 2009.

[3] T. Q. D. Khoa and M. Nakagawa, "Functional Near Infrared Spectroscope for Cognition Brain Tasks by Wavelets Analysis and Neural Networks,"

International Journal of Biological and Life Sciences, 2008.

[4] B. Molavi and G. A. Dumont, "Wavelet Based Motion Artifact Removal for Functional Near Infrared Spectroscopy," presented at the 32nd Annual International Conference of the IEEE EMBS, 2010.

[5] C. Q. Ngo, T. H. Nguyen, and T. V. Vo, "Linear Regression Algorithm for Hand Tapping Recognition Using Functional Near Infrared Spectroscopy," presented at the Fourth International Conference on the Development of Biomedical Engineering, Vietnam, 2012.

[6] R. Sitaram, H. Zhang, C. Guan, M. Thulasidas, Y. Hoshi, A. Ishikawa, K. Shimizu, and N. Birbaumer, "Temporal classification of multichannel near-infrared spectroscopy signals of motor imagery for developing a brain-computer interface," NeuroImage 34, 2007.

[7] S. J. Orfanidis, "Signal Processing Applications," in Introduction to Signal Processing, Pearson Education Inc., 2010.

[8] D. C. Montgomery and G. C. Runger, Applied Statistics and Probability for Engineers, John Wiley & Sons Inc., 2003.

Contact: Ngo Quoc Cuong, Lecturer
Cellphone: (84)
ngoquoccuong17@gmail.com; cuongnq@hcmute.edu.vn


What Do Neural Networks Do? MLP Lecture 3 Multi-layer networks 1 What Do Neural Networks Do? MLP Lecture 3 Multi-layer networks 1 Multi-layer networks Steve Renals Machine Learning Practical MLP Lecture 3 7 October 2015 MLP Lecture 3 Multi-layer networks 2 What Do Single

More information

Artificial Neural Networks Francesco DI MAIO, Ph.D., Politecnico di Milano Department of Energy - Nuclear Division IEEE - Italian Reliability Chapter

Artificial Neural Networks Francesco DI MAIO, Ph.D., Politecnico di Milano Department of Energy - Nuclear Division IEEE - Italian Reliability Chapter Artificial Neural Networks Francesco DI MAIO, Ph.D., Politecnico di Milano Department of Energy - Nuclear Division IEEE - Italian Reliability Chapter (Chair) STF - China Fellow francesco.dimaio@polimi.it

More information

Simple Neural Nets For Pattern Classification

Simple Neural Nets For Pattern Classification CHAPTER 2 Simple Neural Nets For Pattern Classification Neural Networks General Discussion One of the simplest tasks that neural nets can be trained to perform is pattern classification. In pattern classification

More information

Computational statistics

Computational statistics Computational statistics Lecture 3: Neural networks Thierry Denœux 5 March, 2016 Neural networks A class of learning methods that was developed separately in different fields statistics and artificial

More information

Artificial Neural Networks" and Nonparametric Methods" CMPSCI 383 Nov 17, 2011!

Artificial Neural Networks and Nonparametric Methods CMPSCI 383 Nov 17, 2011! Artificial Neural Networks" and Nonparametric Methods" CMPSCI 383 Nov 17, 2011! 1 Todayʼs lecture" How the brain works (!)! Artificial neural networks! Perceptrons! Multilayer feed-forward networks! Error

More information

Part 8: Neural Networks

Part 8: Neural Networks METU Informatics Institute Min720 Pattern Classification ith Bio-Medical Applications Part 8: Neural Netors - INTRODUCTION: BIOLOGICAL VS. ARTIFICIAL Biological Neural Netors A Neuron: - A nerve cell as

More information

Artificial Neural Networks (ANN) Xiaogang Su, Ph.D. Department of Mathematical Science University of Texas at El Paso

Artificial Neural Networks (ANN) Xiaogang Su, Ph.D. Department of Mathematical Science University of Texas at El Paso Artificial Neural Networks (ANN) Xiaogang Su, Ph.D. Department of Mathematical Science University of Texas at El Paso xsu@utep.edu Fall, 2018 Outline Introduction A Brief History ANN Architecture Terminology

More information

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others)

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others) Machine Learning Neural Networks (slides from Domingos, Pardo, others) For this week, Reading Chapter 4: Neural Networks (Mitchell, 1997) See Canvas For subsequent weeks: Scaling Learning Algorithms toward

More information

Neural Networks. Nethra Sambamoorthi, Ph.D. Jan CRMportals Inc., Nethra Sambamoorthi, Ph.D. Phone:

Neural Networks. Nethra Sambamoorthi, Ph.D. Jan CRMportals Inc., Nethra Sambamoorthi, Ph.D. Phone: Neural Networks Nethra Sambamoorthi, Ph.D Jan 2003 CRMportals Inc., Nethra Sambamoorthi, Ph.D Phone: 732-972-8969 Nethra@crmportals.com What? Saying it Again in Different ways Artificial neural network

More information

PV021: Neural networks. Tomáš Brázdil

PV021: Neural networks. Tomáš Brázdil 1 PV021: Neural networks Tomáš Brázdil 2 Course organization Course materials: Main: The lecture Neural Networks and Deep Learning by Michael Nielsen http://neuralnetworksanddeeplearning.com/ (Extremely

More information

CS 4700: Foundations of Artificial Intelligence

CS 4700: Foundations of Artificial Intelligence CS 4700: Foundations of Artificial Intelligence Prof. Bart Selman selman@cs.cornell.edu Machine Learning: Neural Networks R&N 18.7 Intro & perceptron learning 1 2 Neuron: How the brain works # neurons

More information

Artificial Neural Networks

Artificial Neural Networks Artificial Neural Networks Stephan Dreiseitl University of Applied Sciences Upper Austria at Hagenberg Harvard-MIT Division of Health Sciences and Technology HST.951J: Medical Decision Support Knowledge

More information

Neural Networks. Chapter 18, Section 7. TB Artificial Intelligence. Slides from AIMA 1/ 21

Neural Networks. Chapter 18, Section 7. TB Artificial Intelligence. Slides from AIMA   1/ 21 Neural Networks Chapter 8, Section 7 TB Artificial Intelligence Slides from AIMA http://aima.cs.berkeley.edu / 2 Outline Brains Neural networks Perceptrons Multilayer perceptrons Applications of neural

More information

Combination of M-Estimators and Neural Network Model to Analyze Inside/Outside Bark Tree Diameters

Combination of M-Estimators and Neural Network Model to Analyze Inside/Outside Bark Tree Diameters Combination of M-Estimators and Neural Network Model to Analyze Inside/Outside Bark Tree Diameters Kyriaki Kitikidou, Elias Milios, Lazaros Iliadis, and Minas Kaymakis Democritus University of Thrace,

More information

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others)

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others) Machine Learning Neural Networks (slides from Domingos, Pardo, others) For this week, Reading Chapter 4: Neural Networks (Mitchell, 1997) See Canvas For subsequent weeks: Scaling Learning Algorithms toward

More information

Neural Networks biological neuron artificial neuron 1

Neural Networks biological neuron artificial neuron 1 Neural Networks biological neuron artificial neuron 1 A two-layer neural network Output layer (activation represents classification) Weighted connections Hidden layer ( internal representation ) Input

More information

Artificial Neural Networks. MGS Lecture 2

Artificial Neural Networks. MGS Lecture 2 Artificial Neural Networks MGS 2018 - Lecture 2 OVERVIEW Biological Neural Networks Cell Topology: Input, Output, and Hidden Layers Functional description Cost functions Training ANNs Back-Propagation

More information

EE04 804(B) Soft Computing Ver. 1.2 Class 2. Neural Networks - I Feb 23, Sasidharan Sreedharan

EE04 804(B) Soft Computing Ver. 1.2 Class 2. Neural Networks - I Feb 23, Sasidharan Sreedharan EE04 804(B) Soft Computing Ver. 1.2 Class 2. Neural Networks - I Feb 23, 2012 Sasidharan Sreedharan www.sasidharan.webs.com 3/1/2012 1 Syllabus Artificial Intelligence Systems- Neural Networks, fuzzy logic,

More information

Holdout and Cross-Validation Methods Overfitting Avoidance

Holdout and Cross-Validation Methods Overfitting Avoidance Holdout and Cross-Validation Methods Overfitting Avoidance Decision Trees Reduce error pruning Cost-complexity pruning Neural Networks Early stopping Adjusting Regularizers via Cross-Validation Nearest

More information

18.6 Regression and Classification with Linear Models

18.6 Regression and Classification with Linear Models 18.6 Regression and Classification with Linear Models 352 The hypothesis space of linear functions of continuous-valued inputs has been used for hundreds of years A univariate linear function (a straight

More information

Sound Quality Prediction of Vehicle Interior Noise under Multiple Working Conditions Using Back-Propagation Neural Network Model

Sound Quality Prediction of Vehicle Interior Noise under Multiple Working Conditions Using Back-Propagation Neural Network Model Journal of Transportation Technologies, 205, 5, 34-39 Published Online April 205 in SciRes. http://www.scirp.org/journal/jtts http://dx.doi.org/0.4236/jtts.205.5203 Sound Quality Prediction of Vehicle

More information

Modeling Economic Time Series Using a Focused Time Lagged FeedForward Neural Network

Modeling Economic Time Series Using a Focused Time Lagged FeedForward Neural Network Proceedings of Student Research Day, CSIS, Pace University, May 9th, 23 Modeling Economic Time Series Using a Focused Time Lagged FeedForward Neural Network N. Moseley ABSTRACT, - Artificial neural networks

More information

Discriminative Direction for Kernel Classifiers

Discriminative Direction for Kernel Classifiers Discriminative Direction for Kernel Classifiers Polina Golland Artificial Intelligence Lab Massachusetts Institute of Technology Cambridge, MA 02139 polina@ai.mit.edu Abstract In many scientific and engineering

More information

GMDH-type Neural Networks with a Feedback Loop and their Application to the Identification of Large-spatial Air Pollution Patterns.

GMDH-type Neural Networks with a Feedback Loop and their Application to the Identification of Large-spatial Air Pollution Patterns. GMDH-type Neural Networks with a Feedback Loop and their Application to the Identification of Large-spatial Air Pollution Patterns. Tadashi Kondo 1 and Abhijit S.Pandya 2 1 School of Medical Sci.,The Univ.of

More information

Analysis of Multilayer Neural Network Modeling and Long Short-Term Memory

Analysis of Multilayer Neural Network Modeling and Long Short-Term Memory Analysis of Multilayer Neural Network Modeling and Long Short-Term Memory Danilo López, Nelson Vera, Luis Pedraza International Science Index, Mathematical and Computational Sciences waset.org/publication/10006216

More information

Brain Computer Interface Using Tensor Decompositions and Multi-way Analysis

Brain Computer Interface Using Tensor Decompositions and Multi-way Analysis 1 Brain Computer Interface Using Tensor Decompositions and Multi-way Analysis Andrzej CICHOCKI Laboratory for Advanced Brain Signal Processing http://www.bsp.brain.riken.jp/~cia/. RIKEN, Brain Science

More information

Lecture 7 Artificial neural networks: Supervised learning

Lecture 7 Artificial neural networks: Supervised learning Lecture 7 Artificial neural networks: Supervised learning Introduction, or how the brain works The neuron as a simple computing element The perceptron Multilayer neural networks Accelerated learning in

More information

Pattern Recognition and Machine Learning

Pattern Recognition and Machine Learning Christopher M. Bishop Pattern Recognition and Machine Learning ÖSpri inger Contents Preface Mathematical notation Contents vii xi xiii 1 Introduction 1 1.1 Example: Polynomial Curve Fitting 4 1.2 Probability

More information

ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92

ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92 ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92 BIOLOGICAL INSPIRATIONS Some numbers The human brain contains about 10 billion nerve cells (neurons) Each neuron is connected to the others through 10000

More information

Unit III. A Survey of Neural Network Model

Unit III. A Survey of Neural Network Model Unit III A Survey of Neural Network Model 1 Single Layer Perceptron Perceptron the first adaptive network architecture was invented by Frank Rosenblatt in 1957. It can be used for the classification of

More information

Serious limitations of (single-layer) perceptrons: Cannot learn non-linearly separable tasks. Cannot approximate (learn) non-linear functions

Serious limitations of (single-layer) perceptrons: Cannot learn non-linearly separable tasks. Cannot approximate (learn) non-linear functions BACK-PROPAGATION NETWORKS Serious limitations of (single-layer) perceptrons: Cannot learn non-linearly separable tasks Cannot approximate (learn) non-linear functions Difficult (if not impossible) to design

More information

Neural Networks Introduction

Neural Networks Introduction Neural Networks Introduction H.A Talebi Farzaneh Abdollahi Department of Electrical Engineering Amirkabir University of Technology Winter 2011 H. A. Talebi, Farzaneh Abdollahi Neural Networks 1/22 Biological

More information

Machine Learning for Large-Scale Data Analysis and Decision Making A. Neural Networks Week #6

Machine Learning for Large-Scale Data Analysis and Decision Making A. Neural Networks Week #6 Machine Learning for Large-Scale Data Analysis and Decision Making 80-629-17A Neural Networks Week #6 Today Neural Networks A. Modeling B. Fitting C. Deep neural networks Today s material is (adapted)

More information

Contents. Introduction The General Linear Model. General Linear Linear Model Model. The General Linear Model, Part I. «Take home» message

Contents. Introduction The General Linear Model. General Linear Linear Model Model. The General Linear Model, Part I. «Take home» message DISCOS SPM course, CRC, Liège, 2009 Contents The General Linear Model, Part I Introduction The General Linear Model Data & model Design matrix Parameter estimates & interpretation Simple contrast «Take

More information

Neural Networks. Nicholas Ruozzi University of Texas at Dallas

Neural Networks. Nicholas Ruozzi University of Texas at Dallas Neural Networks Nicholas Ruozzi University of Texas at Dallas Handwritten Digit Recognition Given a collection of handwritten digits and their corresponding labels, we d like to be able to correctly classify

More information

Artificial Intelligence

Artificial Intelligence Artificial Intelligence Jeff Clune Assistant Professor Evolving Artificial Intelligence Laboratory Announcements Be making progress on your projects! Three Types of Learning Unsupervised Supervised Reinforcement

More information

The Application of Extreme Learning Machine based on Gaussian Kernel in Image Classification

The Application of Extreme Learning Machine based on Gaussian Kernel in Image Classification he Application of Extreme Learning Machine based on Gaussian Kernel in Image Classification Weijie LI, Yi LIN Postgraduate student in College of Survey and Geo-Informatics, tongji university Email: 1633289@tongji.edu.cn

More information

How to do backpropagation in a brain

How to do backpropagation in a brain How to do backpropagation in a brain Geoffrey Hinton Canadian Institute for Advanced Research & University of Toronto & Google Inc. Prelude I will start with three slides explaining a popular type of deep

More information

EPL442: Computational

EPL442: Computational EPL442: Computational Learning Systems Lab 2 Vassilis Vassiliades Department of Computer Science University of Cyprus Outline Artificial Neuron Feedforward Neural Network Back-propagation Algorithm Notes

More information

Chapter 3 Supervised learning:

Chapter 3 Supervised learning: Chapter 3 Supervised learning: Multilayer Networks I Backpropagation Learning Architecture: Feedforward network of at least one layer of non-linear hidden nodes, e.g., # of layers L 2 (not counting the

More information

Multilayer Perceptron

Multilayer Perceptron Outline Hong Chang Institute of Computing Technology, Chinese Academy of Sciences Machine Learning Methods (Fall 2012) Outline Outline I 1 Introduction 2 Single Perceptron 3 Boolean Function Learning 4

More information

Classification of Mental Tasks from EEG Signals Using Spectral Analysis, PCA and SVM

Classification of Mental Tasks from EEG Signals Using Spectral Analysis, PCA and SVM BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 18, No 1 Sofia 2018 Print ISSN: 1311-9702; Online ISSN: 1314-4081 DOI: 10.2478/cait-2018-0007 Classification of Mental Tasks

More information

Data-driven methods in application to flood defence systems monitoring and analysis Pyayt, A.

Data-driven methods in application to flood defence systems monitoring and analysis Pyayt, A. UvA-DARE (Digital Academic Repository) Data-driven methods in application to flood defence systems monitoring and analysis Pyayt, A. Link to publication Citation for published version (APA): Pyayt, A.

More information

Slice Oriented Tensor Decomposition of EEG Data for Feature Extraction in Space, Frequency and Time Domains

Slice Oriented Tensor Decomposition of EEG Data for Feature Extraction in Space, Frequency and Time Domains Slice Oriented Tensor Decomposition of EEG Data for Feature Extraction in Space, and Domains Qibin Zhao, Cesar F. Caiafa, Andrzej Cichocki, and Liqing Zhang 2 Laboratory for Advanced Brain Signal Processing,

More information

Neural networks. Chapter 20, Section 5 1

Neural networks. Chapter 20, Section 5 1 Neural networks Chapter 20, Section 5 Chapter 20, Section 5 Outline Brains Neural networks Perceptrons Multilayer perceptrons Applications of neural networks Chapter 20, Section 5 2 Brains 0 neurons of

More information

Cheng Soon Ong & Christian Walder. Canberra February June 2018

Cheng Soon Ong & Christian Walder. Canberra February June 2018 Cheng Soon Ong & Christian Walder Research Group and College of Engineering and Computer Science Canberra February June 2018 Outlines Overview Introduction Linear Algebra Probability Linear Regression

More information

BACKPROPAGATION. Neural network training optimization problem. Deriving backpropagation

BACKPROPAGATION. Neural network training optimization problem. Deriving backpropagation BACKPROPAGATION Neural network training optimization problem min J(w) w The application of gradient descent to this problem is called backpropagation. Backpropagation is gradient descent applied to J(w)

More information

EL-GY 6813/BE-GY 6203 Medical Imaging, Fall 2016 Final Exam

EL-GY 6813/BE-GY 6203 Medical Imaging, Fall 2016 Final Exam EL-GY 6813/BE-GY 6203 Medical Imaging, Fall 2016 Final Exam (closed book, 1 sheets of notes double sided allowed, no calculator or other electronic devices allowed) 1. Ultrasound Physics (15 pt) A) (9

More information

Bearing fault diagnosis based on EMD-KPCA and ELM

Bearing fault diagnosis based on EMD-KPCA and ELM Bearing fault diagnosis based on EMD-KPCA and ELM Zihan Chen, Hang Yuan 2 School of Reliability and Systems Engineering, Beihang University, Beijing 9, China Science and Technology on Reliability & Environmental

More information

Lecture 4: Feed Forward Neural Networks

Lecture 4: Feed Forward Neural Networks Lecture 4: Feed Forward Neural Networks Dr. Roman V Belavkin Middlesex University BIS4435 Biological neurons and the brain A Model of A Single Neuron Neurons as data-driven models Neural Networks Training

More information

Last update: October 26, Neural networks. CMSC 421: Section Dana Nau

Last update: October 26, Neural networks. CMSC 421: Section Dana Nau Last update: October 26, 207 Neural networks CMSC 42: Section 8.7 Dana Nau Outline Applications of neural networks Brains Neural network units Perceptrons Multilayer perceptrons 2 Example Applications

More information

Engineering Part IIB: Module 4F10 Statistical Pattern Processing Lecture 6: Multi-Layer Perceptrons I

Engineering Part IIB: Module 4F10 Statistical Pattern Processing Lecture 6: Multi-Layer Perceptrons I Engineering Part IIB: Module 4F10 Statistical Pattern Processing Lecture 6: Multi-Layer Perceptrons I Phil Woodland: pcw@eng.cam.ac.uk Michaelmas 2012 Engineering Part IIB: Module 4F10 Introduction In

More information

Applied Statistics. Multivariate Analysis - part II. Troels C. Petersen (NBI) Statistics is merely a quantization of common sense 1

Applied Statistics. Multivariate Analysis - part II. Troels C. Petersen (NBI) Statistics is merely a quantization of common sense 1 Applied Statistics Multivariate Analysis - part II Troels C. Petersen (NBI) Statistics is merely a quantization of common sense 1 Fisher Discriminant You want to separate two types/classes (A and B) of

More information

Nicoladie Tam 1, *, Luca Pollonini 2 2, 3, 4. , George Zouridakis. 1. Introduction

Nicoladie Tam 1, *, Luca Pollonini 2 2, 3, 4. , George Zouridakis. 1. Introduction American Journal of Biomedical Science and Engineering 2017; 3(6): 54-63 http://www.aascit.org/journal/ajbse ISSN: 2381-103X (Print); ISSN: 2381-1048 (Online) Computational Method for Representing the

More information

Dynamic Data Modeling, Recognition, and Synthesis. Rui Zhao Thesis Defense Advisor: Professor Qiang Ji

Dynamic Data Modeling, Recognition, and Synthesis. Rui Zhao Thesis Defense Advisor: Professor Qiang Ji Dynamic Data Modeling, Recognition, and Synthesis Rui Zhao Thesis Defense Advisor: Professor Qiang Ji Contents Introduction Related Work Dynamic Data Modeling & Analysis Temporal localization Insufficient

More information