Multivariate Analysis, TMVA, and Artificial Neural Networks


http://tmva.sourceforge.net/

Multivariate Analysis, TMVA, and Artificial Neural Networks
Matt Jachowski (jachowski@stanford.edu)

Multivariate Analysis
- Techniques dedicated to the analysis of data with multiple variables
- An active field: many recently developed techniques rely on the computational power of modern computers

Multivariate Analysis and HEP
- Goal: classify events as signal or background
- A single event is defined by several variables (energy, transverse momentum, etc.)
- Use all of the variables to classify the event: multivariate analysis!

Multivariate Analysis and HEP
- Rectangular cuts optimization is common

Multivariate Analysis and HEP
- Likelihood estimator analysis is also common
- More sophisticated methods (neural networks, boosted decision trees) are less common, though their use is growing. Why?
  - They are difficult to implement
  - Physicists are skeptical of new methods

Toolkit for Multivariate Analysis (TMVA)
- ROOT-integrated software package with several MVA techniques
- Automatic training, testing, and evaluation of MVA methods
- Guidelines and documentation describe the methods for users: this isn't a black box!

Toolkit for Multivariate Analysis (TMVA)
- Easy to configure methods
- Easy to plug in HEP data
- Easy to compare different MVA methods

TMVA in Action

TMVA and Me
- TMVA started in October 2005: still young, with a very active group of developers
- My involvement:
  - Decorrelation for the Cuts method (mini project)
  - New artificial neural network implementation (main project)

Decorrelated Cuts Method
- Some MVA methods (e.g. the likelihood estimator, cuts) suffer if the data has linear correlations
- Linear correlations can easily be transformed away
- I implemented this for the Cuts method

Decorrelated Cuts Method
- Find the square root C' of the covariance matrix C (i.e. C = C'C'): diagonalize with D = S^T C S, then C' = S \sqrt{D} S^T
- Decorrelate the data: x' = C'^{-1} x
- Apply cuts to the decorrelated data
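The decorrelation step above can be sketched numerically. This is an illustrative toy (not TMVA code): it builds the square root of the covariance matrix by eigendecomposition and applies its inverse to correlated toy data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "events": two linearly correlated variables.
x = rng.standard_normal((10000, 2))
x[:, 1] += 0.8 * x[:, 0]

C = np.cov(x, rowvar=False)                  # covariance matrix C
eigval, S = np.linalg.eigh(C)                # diagonalization: D = S^T C S
C_sqrt = S @ np.diag(np.sqrt(eigval)) @ S.T  # C' = S sqrt(D) S^T, so C = C'C'
x_dec = x @ np.linalg.inv(C_sqrt).T          # x' = C'^{-1} x for each event

# The decorrelated sample has unit covariance (up to rounding).
C_dec = np.cov(x_dec, rowvar=False)
print(np.round(C_dec, 2))
```

After the transformation the off-diagonal covariance vanishes, so simple rectangular cuts act on independent variables.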

Artificial Neural Networks (ANNs)
- A robust, non-linear MVA technique


Training an ANN
- The challenge is training the network
- Like a human brain, the network learns from seeing data over and over again
- Technical details: ask me if you're really interested

MLP
- MLP (Multi-Layer Perceptron): my ANN implementation for TMVA
- MLP is TMVA's main ANN
- MLP serves as the base for any future ANN developments in TMVA

MLP Information & Statistics
- Implemented in C++
- Object-oriented: 4,000+ lines of code, 16 classes

Acknowledgements
- Joerg Stelzer, Andreas Hoecker
- CERN, University of Michigan, Ford, NSF

Questions? (I have lots of technical slides in reserve that I would be glad to talk about)


Synapses and Neurons

Neuron j receives inputs y_0, ..., y_n over synapses with weights w_{0j}, ..., w_{nj}:

v_j = f(y_0, \ldots, y_n, w_{0j}, \ldots, w_{nj})

y_j = \varphi(v_j)

Synapses and Neurons

v_j = f(y_0, \ldots, y_n, w_{0j}, \ldots, w_{nj}) = \sum_{i=0}^{n} w_{ij} y_i

y_j = \varphi(v_j) = \frac{1}{1 + e^{-v_j}}
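A single neuron of this kind is easy to write down. This is a minimal sketch (not the TMVA MLP code): a weighted sum of the inputs passed through the sigmoid activation above.

```python
import math

def neuron_output(inputs, weights):
    """y_j = phi(v_j), with v_j = sum_i w_ij * y_i and a sigmoid phi."""
    v = sum(w * y for w, y in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-v))

# With zero total activation the sigmoid sits at its midpoint, 0.5.
print(neuron_output([1.0, -1.0], [0.5, 0.5]))  # v = 0 -> 0.5
```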

Universal Approximation Theorem

Every continuous function that maps intervals of real numbers to some output interval of real numbers can be approximated arbitrarily closely by a multi-layer perceptron with just one hidden layer (with non-linear activation functions):

f(x) = \sum_j w_j \, \sigma(b_j + v_j x)

where x are the inputs, v_j the weights between input and hidden layer, b_j the bias, \sigma the non-linear activation function, and w_j the weights between hidden and output layer.
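The one-hidden-layer form f(x) = sum_j w_j * sigma(b_j + v_j x) can be sketched directly. The weights below are hand-picked for illustration (a trained network would tune them); two opposed sigmoids combine into a crude "bump" centered at x = 0, the kind of building block the theorem relies on.

```python
import math

def sigma(t):
    return 1.0 / (1.0 + math.exp(-t))

def mlp_1hidden(x, v, b, w):
    # f(x) = sum over hidden units j of w_j * sigma(b_j + v_j * x)
    return sum(wj * sigma(bj + vj * x) for vj, bj, wj in zip(v, b, w))

# Two hidden units forming a bump: large near x = 0, small far away.
v = [4.0, 4.0]
b = [2.0, -2.0]
w = [1.0, -1.0]
print(mlp_1hidden(0.0, v, b, w))
```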

Training an MLP

Training event: f(x_0, x_1, x_2, x_3) = d (desired output)

Network: g(x_0, x_1, x_2, x_3) = y (network output)

Error: e = d - y

Training an MLP

Adjust the weights to minimize the error (or an estimator that is some function of the error):

e_j(n) = d_j(n) - y_j(n)

\varepsilon(n) = \frac{1}{2} \sum_{j \in \{\text{output neurons}\}} e_j^2(n)

\varepsilon_{avg} = \frac{1}{N} \sum_{n=1}^{N} \varepsilon(n)
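The error estimator above is straightforward to compute. An illustrative sketch (not TMVA code): the per-event error is half the summed squared output errors, averaged over the N training events.

```python
def event_error(desired, outputs):
    """eps(n) = 1/2 * sum_j (d_j(n) - y_j(n))^2 over output neurons."""
    return 0.5 * sum((d - y) ** 2 for d, y in zip(desired, outputs))

def average_error(events):
    """eps_avg = (1/N) * sum_n eps(n); events is a list of (desired, outputs)."""
    return sum(event_error(d, y) for d, y in events) / len(events)

events = [([1.0], [0.8]), ([0.0], [0.4])]
print(average_error(events))  # (0.5*0.04 + 0.5*0.16) / 2 = 0.05
```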

Back-Propagation Algorithm

Make corrections in the direction of steepest descent:

w_{ij}(n+1) = w_{ij}(n) + \Delta w_{ij}(n)

\Delta w_{ij}(n) = -\eta \frac{\partial \varepsilon(n)}{\partial w_{ij}(n)}

Corrections are made to the output layer first, then propagated backwards.
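For a single sigmoid neuron, the steepest-descent update can be written out explicitly. This is a hypothetical toy, not the TMVA back-propagation code: with eps = 1/2 e^2, e = d - y, and y = sigma(v), the chain rule gives d eps/d w_i = -e * y * (1 - y) * x_i (using sigma'(v) = y(1 - y)), so Delta w_i = eta * e * y * (1 - y) * x_i.

```python
import math

def sigma(t):
    return 1.0 / (1.0 + math.exp(-t))

def train_step(weights, x, d, eta=0.5):
    """One steepest-descent update, Delta w_i = eta * e * y * (1 - y) * x_i."""
    y = sigma(sum(w * xi for w, xi in zip(weights, x)))
    e = d - y
    return [w + eta * e * y * (1.0 - y) * xi for w, xi in zip(weights, x)]

# Repeated updates drive the neuron's output toward the target d = 1.
w = [0.0, 0.0]
x, d = [1.0, 0.5], 1.0
for _ in range(200):
    w = train_step(w, x, d)
print(sigma(sum(wi * xi for wi, xi in zip(w, x))))
```

In a full MLP the same gradient step is applied layer by layer, with the output-layer errors propagated backwards to compute the hidden-layer gradients.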