The z-vertex Trigger for Belle II
Sebastian Skambraks, Technische Universität München
IMPRS Young Scientist Workshop at Ringberg Castle, July 10
Outline

- Introduction: motivation, signal flow
- Multi Layer Perceptron (MLP): theory, setup, preprocessing
- Least Squares fit (LS)
- Results: MLP and LS results

Neuro Team: F. Abudinen (LMU), Y. Chen (TUM), M. Feindt (KIT), R. Frühwirth (HEPHY), M. Heck (KIT), C. Kiesling (MPI), A. Knoll (TUM), S. Neuhaus (TUM), S. Paul (TUM), T. Röder (TUM), J. Schieck (HEPHY), S. Skambraks (TUM)
Introduction

Goals:
- build a z-vertex track trigger
- achieve high precision (spatial resolution in z of about 2 cm)
- get a fast decision (< 1 µs)

Method:
- Input: CDC Track Segment data [IDs & clock cycle (2 ns timing)]
- Algorithms:
  1. Bayes Classifier / Hough Transformation (pattern recognition)
  2. LS: Least Squares fit (linear estimation)
  3. MLP: Multi Layer Perceptron (nonlinear correction)

[Figure: offline z distribution in the Belle experiment, # of events / 5 mm vs z (cm). From T. Abe et al., Belle II Technical Design Report, KEK-REPORT, arXiv v1 [physics.ins-det] (2010).]
Main approach

Pattern recognition: Bayes Classifier / Hough Transform
- sectorize the input in the track parameters (p_T, φ, ϑ)
- P(Sector | Hits) = P(Hits | Sector) · P(Sector) / P(Hits)   (1)

Fitter: Least Squares
- n = (X^T X)^(-1) X^T y   (2)
- X and y contain the hits, n defines a track parameter sector

Neural Network: MLP
- z(Hits, p_T, φ, ϑ) = NN(f(Hits, p_T, φ, ϑ))   (3)
- output: a float value interpreted as the scaled z-position
- the NN input transformation (function f) requires preprocessing
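Equation (2) is a standard normal-equations solve. A minimal numerical sketch, using a toy design matrix X and measurement vector y (not the actual CDC hit encoding, which is defined by the trigger hardware):

```python
import numpy as np

# Toy least-squares fit n = (X^T X)^(-1) X^T y.
rng = np.random.default_rng(0)
n_true = np.array([2.0, -1.0])   # "true" track parameters
X = rng.normal(size=(20, 2))     # rows: hits, columns: parameters
y = X @ n_true                   # noise-free measurements

# Solve the normal equations; np.linalg.solve avoids forming the
# explicit matrix inverse, which is numerically preferable.
n_fit = np.linalg.solve(X.T @ X, X.T @ y)
```

With noise-free inputs the fit reproduces the true parameters up to floating-point precision.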
Signal flow in the CDC Trigger

[Figure: block diagram of the CDC trigger. CDC → Digitizer → Merger → Axial/Stereo Track Segment Finders (TSF); the axial track segments (TS) feed the 2D Finder (pattern recognition), whose output (TS axial, p_T, φ) feeds both the 3D Trigger (1. 2D Fit, 2. 3D Fit) and the NeuroTrigger; the stereo TS also feed the NeuroTrigger, which sends (z, t) to the Global Decision Logic. Total latency < 5 µs.]

NeuroTrigger:
1. Preprocessing: improve (p_T, φ) & sectorize the phase space
2. MLP: output an improved z

The neural network trigger will be implemented on a Virtex 7 FPGA.
MLP: Multi Layer Perceptron

Properties:
- layered structure: input layer, hidden layer, output layer
- supervised machine learning
- function approximation
- short deterministic runtime
- one neuron: y = tanh(Σ_i w_i x_i + w_0)

Input:
- 3 nodes per superlayer (SL): (t, φ_rel, µ), with t: drift time, φ_rel: relative wire position, µ: 2D arc length
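The neuron formula above extends directly to a one-hidden-layer forward pass. A minimal sketch, with illustrative (untrained) weights and a made-up input vector:

```python
import numpy as np

# One neuron: y = tanh(sum_i w_i x_i + w_0).
def neuron(x, w, w0):
    return np.tanh(np.dot(w, x) + w0)

# Feed-forward pass through one hidden layer; the scalar output lies
# in (-1, 1) and would be interpreted as a scaled z-position.
def mlp_forward(x, W_h, b_h, w_out, b_out):
    hidden = np.tanh(W_h @ x + b_h)   # hidden layer activations
    return neuron(hidden, w_out, b_out)

x = np.array([0.1, -0.5, 0.3])        # e.g. (t, phi_rel, mu) of one SL
z_scaled = mlp_forward(x, np.ones((4, 3)), np.zeros(4), np.ones(4), 0.0)
```

Because tanh is bounded, the output never leaves (-1, 1); the preprocessing is responsible for scaling the target z accordingly.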
MLP Setup

Sectorization:
- the track parameter space is sectorized in (p_T, φ, ϑ)
- for each sector an expert MLP is trained
- asymmetries in ϑ and p_T can be taken into account
- the preprocessing selects the proper MLP

[Figure: two different sectors in (p_T, φ) (left) and in ϑ (right).]
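One way to sketch the "preprocessing selects the proper MLP" step is a bin lookup over the track parameters. The bin edges below are illustrative placeholders, not the trigger's actual sector boundaries:

```python
import numpy as np

# Hypothetical sector lookup: each expert MLP covers one bin of the
# (pT, phi, theta) track-parameter space.
pt_edges = np.array([0.3, 1.0, 5.0])        # GeV (illustrative)
phi_edges = np.array([0.0, 180.0, 360.0])   # degrees (illustrative)
theta_edges = np.array([35.0, 80.0, 123.0]) # degrees (illustrative)

def sector_index(pt, phi, theta):
    """Map track parameters to the index of the responsible expert MLP."""
    i = np.digitize(pt, pt_edges) - 1       # 0-based bin per parameter
    j = np.digitize(phi, phi_edges) - 1
    k = np.digitize(theta, theta_edges) - 1
    return (i * (len(phi_edges) - 1) + j) * (len(theta_edges) - 1) + k
```

With 2 bins per parameter this yields 8 experts, indexed 0 through 7.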
Expert MLP Capabilities

[Figure: z-vertex prediction with an expert MLP for a small sector in two p_T regions, with φ ∈ [0, 360]°, ϑ ∈ [56, 62]° and z ∈ [−10, 10] cm. a) p_T ∈ [0.3, 0.317] GeV. b) p_T ∈ [3.5, 9.625] GeV.]

→ high accuracy on the z-vertex within a small sector
Preprocessing

[Figure: information flow in the NeuroTrigger: TSF → 2D Finder → (TS axial, TS stereo, p_T, φ) → Preprocessing → NN input → MLP → z]

Tasks:
1. match Track Segments to tracks
2. improve the (p_T, φ) estimate (2D fit): Least Squares fit including drift times
3. provide a 3D estimate (ϑ, z)
4. prepare the Neural Net input (& choose the sector)
Preprocessing: Least Squares fit

Solve the linear equation y = X n by n = (X^T X)^(-1) X^T y.

2D fit:
- circle fit; center at c; track from the origin; yields p_T, φ
- x_i, y_i: cartesian coordinates of axial hits in the (r, φ) plane
- x_i² + y_i² = 2 c_x x_i + 2 c_y y_i

3D fit:
- line fit in the (µ, z) plane; yields the z-vertex z_0
- µ_i, z_i: stereo hits transformed by the 2D fit result
- z_i = cot(ϑ) µ_i + z_0
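Both fits are linear least-squares problems. A toy sketch with hypothetical noise-free hits (np.linalg.lstsq and np.polyfit stand in here for the trigger's fixed-point firmware arithmetic):

```python
import numpy as np

# 2D fit: axial hits lie on a circle through the origin,
#   x_i^2 + y_i^2 = 2*c_x*x_i + 2*c_y*y_i   (linear in c_x, c_y).
def circle_fit(x, y):
    A = np.column_stack([2 * x, 2 * y])
    rhs = x**2 + y**2
    (cx, cy), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return cx, cy

# 3D fit: straight line in the (mu, z) plane, z_i = cot(theta)*mu_i + z_0.
def line_fit(mu, z):
    slope, z0 = np.polyfit(mu, z, 1)   # slope = cot(theta)
    return slope, z0

# Toy track: circle of radius 1 centred at (1, 0), passing through the origin.
t = np.linspace(0.2, 2.0, 10)
cx, cy = circle_fit(1.0 + np.cos(t), np.sin(t))

mu = np.linspace(0.0, 5.0, 8)
slope, z0 = line_fit(mu, 0.5 * mu + 2.0)
```

With exact hits both fits recover the generated circle center (1, 0) and line parameters (0.5, 2.0).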
Results: 2D LS Fit (φ, p_T)

[Figure: 90% RMS of φ (in degrees) and of the relative resolution Δp_T/p_T (in %) vs p_T, for three scenarios: nonlinear xt, inner detector, no background.]

Test sample: p_T ∈ [0.3, 5] GeV, φ ∈ [0, 90]°, ϑ ∈ [35, 123]°, z ∈ [−50, 50] cm
Results: 3D LS Fit (ϑ, z)

[Figure: 90% RMS of ϑ (in degrees) and of z (in cm) vs p_T, for three scenarios: nonlinear xt, inner detector, no background.]

Test sample: p_T ∈ [0.3, 5] GeV, φ ∈ [0, 90]°, ϑ ∈ [35, 123]°, z ∈ [−50, 50] cm
LS Fitter efficiency

Cuts lead to an efficiency decrease:
- min 3 axial hits in different layers
- max 10 axial hits in total
- min 2 stereo hits in different layers
- max 8 stereo hits in total
- min 5 hits in total

[Figure: efficiency ε (in %) vs p_T for three scenarios: nonlinear xt, inner detector, no background.]
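The cuts above amount to a simple per-track predicate. A sketch, assuming per-track hit counts and sets of hit layers are available (the variable names are hypothetical):

```python
# Track-selection cuts as listed on this slide.
def passes_cuts(axial_layers, n_axial, stereo_layers, n_stereo):
    return (len(axial_layers) >= 3 and n_axial <= 10       # axial cuts
            and len(stereo_layers) >= 2 and n_stereo <= 8  # stereo cuts
            and n_axial + n_stereo >= 5)                   # total-hit cut
```

Tracks failing any single condition are dropped, which is what drives the efficiency loss shown in the plot.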
z 90% RMS with LS fit and MLP

[Figure: 90% RMS of z (in cm) vs p_T, comparing: MLP (inner detector, nonlinear xt), LS fit (inner detector, nonlinear xt), and MLP with small (p_T, ϑ) sectors (no inner detector, nonlinear xt).]

Test sample: p_T ∈ [0.3, 5] GeV, φ ∈ [0, 90]°, ϑ ∈ [35, 123]°, z ∈ [−50, 50] cm
Conclusion

MLP:
- the MLP requires preprocessing
- the MLP can improve the z RMS in the low-p_T region
- sectorization improves the MLP prediction

Preprocessing:
- the LS fit is useful for preprocessing
- the LS fit achieves a good z RMS for high-p_T tracks

Outlook:
- MLP optimization for low-p_T tracks
- further preprocessing studies
- hardware implementation on a Virtex 7 FPGA