Artificial neural networks

1 Artificial neural networks B. Mehlig, Department of Physics, University of Gothenburg, Sweden FFR135/FIM720 Artificial Neural Networks Chalmers/Gothenburg University, 7.5 credits

4 Course home page. Teachers: Bernhard Mehlig, Johan Fries, Marina Rafajlovic. Forum: online forum at Piazza; ask questions on the forum (instead of emailing), so that other students can benefit. Course representatives: TBA. Schedule: TimeEdit. Literature: lecture notes, which will be continuously updated. Examination: homework problems (OpenTA) and a written exam; you must be registered for the course to take the exam. Changes since the course was last given: rules for the passing grade, and the format for submission of homework problems.

5 Piazza

6 OpenTA

7 OpenTA Syntax

8 OpenTA Quiz

9 OpenTA Quiz

10 Introductory lecture Slides are on the course home page.

11 Neurons in the cerebral cortex (the outer layer of the cerebrum, the largest and best-developed part of the human brain). Source: brainmaps.org

12 Neurons in the cerebral cortex (outer layer of the cerebrum, the largest and best-developed part of the human brain), with one neuron highlighted. Figure labels: dendrites (input), cell body (processing), axon (output). Source: brainmaps.org

13 Neuron anatomy and activity. Schematic drawing of a neuron: dendrites (input), synapses, cell body (processing), axon (output); see synapseweb.clm.utexas.edu/dimensions-dendrites and wikipedia.org/wiki/neuron. Total length of dendrites: up to 1 cm (figure scale bar: 100 µm). Output of a neuron: a spike train, switching between active and inactive as a function of time. Shown: spike train of an electrosensory pyramidal neuron in the fish Eigenmannia; Gabbiani & Metzner, J. Exp. Biol. 202 (1999) 1267.

14 McCulloch-Pitts neuron. McCulloch & Pitts, Bull. Math. Biophys. 5 (1943) 115. Simple model for a neuron. Signal processing: weighted sum of inputs, passed through an activation function g(b_i):

O_i = g( ∑_{j=1}^N w_ij x_j − θ_i ) = g(b_i), where b_i = ∑_{j=1}^N w_ij x_j − θ_i is the local field.

Incoming signals x_j, j = 1,...,N. Weights (synaptic couplings) w_ij. Threshold θ_i. Neuron number i. Output O_i.
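
To make the model concrete, here is a minimal NumPy sketch of a single McCulloch-Pitts neuron (illustrative only; the function and variable names are mine, not from the slides):

```python
import numpy as np

def mcculloch_pitts(x, w, theta, g=np.sign):
    """Output O = g(sum_j w_j x_j - theta) of a single neuron."""
    b = np.dot(w, x) - theta   # local field b
    return g(b)

# Example: two inputs, signum activation
x = np.array([1.0, -1.0])
w = np.array([0.5, 0.3])
print(mcculloch_pitts(x, w, theta=0.0))   # sign(0.5 - 0.3) = 1.0
```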

15 Activation function. Signal processing of the McCulloch-Pitts neuron: weighted sum of inputs, passed through the activation function g(b_i):

O_i = g( ∑_{j=1}^N w_ij x_j − θ_i ) = g(b_i)

Inputs x_j, synaptic couplings w_ij, threshold θ_i, output O_i. Common choices of g(b): the Heaviside step function θ(b); the signum function sgn(b); the sigmoid function g(b) = 1/(1 + e^{−βb}) with parameter β > 0; the ReLU function max(0, b).
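
The four activation functions on this slide, as a small sketch (the parameter name beta mirrors the β above; np.sign is avoided because it returns 0 at b = 0):

```python
import numpy as np

def step(b):                  # Heaviside step function
    return np.where(b >= 0, 1.0, 0.0)

def signum(b):                # sgn(b), outputs +/-1 (also at b = 0)
    return np.where(b >= 0, 1.0, -1.0)

def sigmoid(b, beta=1.0):     # 1 / (1 + exp(-beta*b)), parameter beta > 0
    return 1.0 / (1.0 + np.exp(-beta * b))

def relu(b):                  # max(0, b)
    return np.maximum(0.0, b)
```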

16 Highly simplified idealisation of a neuron. Take the activation function to be the signum function sgn(b_i). The output of neuron i can then assume only two values, ±1, representing the level of activity of the neuron. This is an idealisation:
- real neurons fire spike trains (active/inactive)
- switching between active and inactive can be delayed (time delay)
- real neurons may perform non-linear summation of inputs
- real neurons are subject to noise (stochastic dynamics)

17 Neural networks. Connect neurons into networks that can perform computing tasks: for example image analysis, object identification, pattern recognition, classification, clustering, data compression. Architectures shown: simple perceptrons, a two-layer perceptron, a recurrent network, a Boltzmann machine.

18 A classification task. Input patterns x^(µ). The index µ = 1,...,p labels the patterns. In this example each pattern has two components, x_1^(µ) and x_2^(µ). Arrange the components into a vector, x^(µ) = (x_1^(µ), x_2^(µ))^T, shown in the Figure (ten patterns, x^(1),...,x^(10)). We see: the patterns fall into two classes, one on the left and one on the right.

19 A classification task. Input patterns x^(µ), µ = 1,...,p. Each pattern has two components, x_1^(µ) and x_2^(µ), arranged into the vector x^(µ) shown in the Figure. The patterns fall into two classes: one on the left, and one on the right. Draw a red line (a decision boundary) to distinguish the two types of patterns.

20 A classification task. Input patterns x^(µ), µ = 1,...,p, falling into two classes that can be separated by a red line (decision boundary), as on the previous slide. Aim: train a neural network to compute the decision boundary. To do this, define target values: t^(µ) = 1 for patterns in one class, and t^(µ) = −1 for patterns in the other. Training set: (x^(µ), t^(µ)), µ = 1,...,p.
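
A toy training set in this format, as a hedged sketch (the cluster positions and the class assignment are invented for illustration; only the structure (x^(µ), t^(µ)) matters):

```python
import numpy as np

# p = 10 two-dimensional patterns x^(mu) with targets t^(mu) = +/-1:
# left cluster -> t = +1, right cluster -> t = -1 (assignment is arbitrary)
rng = np.random.default_rng(seed=1)
x_left  = rng.normal(loc=[-1.0, 0.0], scale=0.3, size=(5, 2))
x_right = rng.normal(loc=[+1.0, 0.0], scale=0.3, size=(5, 2))

x = np.vstack([x_left, x_right])   # patterns, shape (p, 2)
t = np.array([+1]*5 + [-1]*5)      # targets, shape (p,)
```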

21 Simple perceptron. Simple perceptron: one neuron with two input units x_1 and x_2, activation function sgn(b), and no threshold (θ = 0). Output:

O = sgn(w_1 x_1 + w_2 x_2) = sgn(w · x)

Vectors: w = (w_1, w_2)^T = |w| (cos β, sin β)^T and x = |x| (cos α, sin α)^T, where |w| denotes the length of the vector w. Scalar product between the vectors: w · x = w_1 x_1 + w_2 x_2 = |w||x| (cos α cos β + sin α sin β) = |w||x| cos φ, where φ = α − β is the angle between w and x. Aim: adjust the weights w so that the network outputs the correct target value for all patterns: O^(µ) = sgn(w · x^(µ)) = t^(µ) for µ = 1,...,p.
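
The corresponding forward pass, as a sketch assuming the toy arrays x and t defined above:

```python
import numpy as np

def perceptron_output(w, x):
    """O = sgn(w . x) for a simple perceptron without threshold."""
    return np.where(np.dot(x, w) >= 0, 1, -1)

w = np.array([1.0, 0.5])      # an arbitrary initial weight vector
O = perceptron_output(w, x)   # outputs O^(mu) for all p patterns at once
print(O == t)                 # which patterns are classified correctly?
```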

22 Geometrical solution. Simple perceptron: one neuron, two input units x_1 and x_2, activation function sgn(b), no threshold (θ = 0). Output: O = sgn(w · x), with w · x = |w||x| cos φ, where φ is the angle between w and x. Aim: adjust the weights w so that the network outputs the correct target values, O^(µ) = sgn(w · x^(µ)) = t^(µ) for µ = 1,...,p, where t^(µ) = 1 for one class and t^(µ) = −1 for the other. Solution: the decision boundary is the line through the origin perpendicular to w (patterns with cos φ > 0 lie on one side, those with cos φ < 0 on the other). Check: w · x^(8) > 0, so O^(8) = sgn(w · x^(8)) = 1. Correct, since t^(8) = 1.

23 Hebb's rule. Now the pattern x^(8) is on the wrong side of the red line, so O^(8) = sgn(w · x^(8)) ≠ t^(8). Move the red line so that O^(8) = sgn(w · x^(8)) = t^(8), by rotating the weight vector w.

24 Hebb's rule. The pattern x^(8) is on the wrong side of the red line, so O^(8) = sgn(w · x^(8)) ≠ t^(8). Move the red line by rotating the weight vector w:

w' = w + η x^(8) (small parameter η > 0)

so that O^(8) = sgn(w' · x^(8)) = t^(8).

25 Hebb's rule. Now the pattern x^(4) is on the wrong side of the red line, so O^(4) = sgn(w · x^(4)) ≠ t^(4). Move the red line by rotating the weight vector w:

w' = w − η x^(4) (small parameter η > 0)

so that O^(4) = sgn(w' · x^(4)) = t^(4).

26 Hebb's rule. As on the previous slide: rotate the weight vector, w' = w − η x^(4), until the pattern x^(4) is classified correctly, O^(4) = sgn(w' · x^(4)) = t^(4).

27 Hebb's rule. Note the minus sign: w' = w + η x^(8) where t^(8) = 1, but w' = w − η x^(4) where t^(4) = −1. Both cases are summarised by the learning rule (Hebb's rule):

w' = w + δw with δw = η t^(µ) x^(µ)

Apply the learning rule many times, until the problem is solved.
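
A minimal sketch of the resulting training loop (the learning rate eta, the update-on-error convention, and the stopping criterion are choices of mine, not prescribed by the slides):

```python
import numpy as np

def train_hebb(x, t, eta=0.1, max_epochs=100, seed=0):
    """Adjust w with Hebb's rule, dw = eta * t^(mu) * x^(mu),
    applied to wrongly classified patterns until all are correct."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=x.shape[1])        # random initial weights
    for _ in range(max_epochs):
        errors = 0
        for x_mu, t_mu in zip(x, t):
            O = 1 if np.dot(w, x_mu) >= 0 else -1
            if O != t_mu:                  # pattern on wrong side of the line
                w = w + eta * t_mu * x_mu  # rotate w (note the sign of t)
                errors += 1
        if errors == 0:                    # problem solved
            break
    return w

w = train_hebb(x, t)   # x, t as in the toy training set above
```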

28 Example - AND function. Logical AND: the target is t^(µ) = 1 only for the pattern where both inputs equal 1, and t^(µ) = −1 for the other three patterns x^(1),...,x^(4). This problem is linearly separable: a single red line, determined by the weights w_1, w_2 and the threshold θ, separates the two classes.
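
One concrete weight choice that realises AND (my own pick; the slide's exact values did not survive the transcription): with inputs in {0, 1}, take w_1 = w_2 = 1 and threshold θ = 1.5.

```python
def and_perceptron(x1, x2, w=(1.0, 1.0), theta=1.5):
    """O = sgn(w1*x1 + w2*x2 - theta): +1 only when both inputs are 1."""
    b = w[0] * x1 + w[1] * x2 - theta
    return 1 if b >= 0 else -1

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, and_perceptron(x1, x2))   # +1 only for (1, 1)
```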

29 Example - XOR function. Logical XOR: the target is t^(µ) = 1 when exactly one of the two inputs equals 1, and t^(µ) = −1 otherwise (patterns x^(1),...,x^(4)).

30 Example - XOR function. This problem is not linearly separable, because we cannot separate the patterns with t^(µ) = 1 from those with t^(µ) = −1 by a single red line.

31 Example - XOR function. This problem is not linearly separable: no single red line separates the patterns with t^(µ) = 1 from those with t^(µ) = −1. Solution: use two red lines.

32 Example - XOR function. Two hidden neurons, V_1 and V_2, each defining one red line. Together they divide the plane into three regions: I, II, and III. The neurons V_1 and V_2 form a layer of hidden neurons (neither input nor output). We need a third neuron, with weights W_1 and W_2 and a threshold θ, to associate region II with t = 1, and regions I and III with t = −1:

O = sgn(W_1 V_1 + W_2 V_2 − θ)
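
A sketch of such a two-layer network for XOR with sgn units. The specific weights and thresholds below are one working configuration of my own choosing, not necessarily the slide's:

```python
def sgn(b):
    return 1 if b >= 0 else -1

def xor_net(x1, x2):
    # Hidden layer: each neuron defines one red line in the input plane.
    V1 = sgn(x1 + x2 - 0.5)   # +1 if at least one input is on
    V2 = sgn(x1 + x2 - 1.5)   # +1 only if both inputs are on
    # Output neuron selects the region between the two lines (region II).
    return sgn(V1 - V2 - 1.0)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xor_net(x1, x2))   # +1 for (0,1) and (1,0) only
```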

33 Non-(linearly) separable problems. Problems that are not linearly separable can be solved with a hidden layer of neurons. Here: four hidden neurons, one for each red line segment. Move the red lines into the correct configuration by repeatedly applying Hebb's rule until the problem is solved (a fifth neuron assigns the regions and solves the classification problem).

34 Generalisation. Train the network on a training set (x^(µ), t^(µ)), µ = 1,...,p: move the red lines into the correct configuration by repeatedly applying Hebb's rule to adjust all weights. Usually many iterations are necessary. Once all red lines are in the right place (all weights determined), apply the network to a new data set. If the training set was reliable, the network has learnt to classify the new data: it has learnt to generalise.

35 Training. Train the network on a training set (x^(µ), t^(µ)), µ = 1,...,p: move the red lines into the correct configuration by repeatedly applying Hebb's rule to adjust all weights. Usually many iterations are necessary. Once all red lines are in the right place (all weights determined), apply the network to a new data set. If the training set was reliable, the network has learnt to classify the new data: it has learnt to generalise.

36 Training. Train the network on a training set (x^(µ), t^(µ)), µ = 1,...,p, by repeatedly applying Hebb's rule to adjust all weights; usually many iterations are necessary. A small number of errors is usually acceptable: it is often not meaningful to fine-tune the decision boundary very precisely, because the input patterns are subject to noise.

37 Overfitting. Train the network on a training set (x^(µ), t^(µ)), µ = 1,...,p, by repeatedly applying Hebb's rule to adjust all weights. Here: 15 hidden neurons were used to fit the decision boundary of the training set very precisely. But often the inputs are affected by noise, which can be very different in different data sets.

38 Overfitting. Applying the network from the previous slide to a second data set shows the problem: the network has just learnt the noise in the training set. Avoid this by reducing the number of weights (dropout).
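
Dropout in its simplest form, as a hedged sketch (the keep probability q and the array are illustrative; common variants also rescale the kept activations by 1/q, omitted here):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def dropout(V, q=0.5):
    """Randomly silence hidden neurons: keep each output with probability q."""
    mask = rng.random(V.shape) < q
    return np.where(mask, V, 0.0)

V = np.array([0.2, -1.3, 0.7, 0.4])   # hidden-layer outputs
print(dropout(V))                     # roughly half are set to zero
```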

39 Multilayer perceptrons. All examples so far had a two-dimensional input space and one-dimensional outputs, so that we could draw the problem and understand its geometrical form. In reality, problems are often high-dimensional. Example, image recognition: input dimension N = #pixels × 3 (colour channels); output dimension: number of classes to be recognised (car, lorry, road sign, person, bicycle, ...). Multilayer perceptron: a network with many hidden layers (with m_1 and m_2 neurons, say). Two problems: (i) expensive to train if there are many weights; (ii) slow convergence (weights get stuck). Questions: (i) how many hidden layers are necessary? (ii) how many hidden layers are efficient?

40 How many hidden layers? All Boolean functions with N inputs can be learned with a single hidden layer. Proof by construction; it requires 2^N neurons in the hidden layer. Example: the XOR function (N = 2). For large N this architecture is not practical, because the number of neurons increases exponentially with N. Better: use more layers. Intuition: the more layers, the more powerful the network. But such deep networks are in general hard to train. One version of the construction is sketched below.
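
A grandmother-cell sketch of my own (inputs encoded as ±1, one hidden neuron per pattern with target +1, so at most 2^N hidden neurons; this is one standard construction, not necessarily the slide's):

```python
import numpy as np
from itertools import product

def boolean_net(t_table, N):
    """Single-hidden-layer net computing an arbitrary Boolean function.
    t_table maps each pattern in {-1,+1}^N to a target +/-1."""
    positives = [np.array(p) for p in product((-1, 1), repeat=N)
                 if t_table[p] == 1]
    M = len(positives)
    def net(x):
        # Hidden neuron k fires (+1) only if x equals the k-th positive pattern.
        V = np.array([1 if np.dot(u, x) >= N - 1 else -1 for u in positives])
        # Output fires if any hidden neuron fired.
        return 1 if V.sum() + M - 1 >= 0 else -1
    return net

# XOR with N = 2:
xor = {(-1, -1): -1, (-1, 1): 1, (1, -1): 1, (1, 1): -1}
net = boolean_net(xor, N=2)
for p in xor:
    print(p, net(np.array(p)), xor[p])   # network output matches target
```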

41 Deep learning. Deep learning: perceptrons with large numbers of layers. In the last few years, interest in neural-network algorithms has exploded, and their performance has significantly improved. Why? Better training sets (Figure: ILSVRC top-5 classification error rate by year, for neural-network and other entries; Goodfellow et al., Deep Learning, 2016). Improved architecture: better activation functions, g(b) = max(0, b) (Glorot, Bordes & Bengio, 2011), and a number of other tricks (dropout, batch normalisation). Better hardware: GPUs, and dedicated chips by Google (en.wikipedia.org/wiki/tensor_processing_unit).

42 ImageNet. To obtain a training set, one must collect and manually annotate data. Questions: (i) efficient data collection and annotation; (ii) aim for a large variety in the collected data. Example: ImageNet (publicly available data set): more than 10^7 images, classified into 10^3 classes. image-net.org

43 reCAPTCHA. github.com/google/recaptcha Two goals: (i) protect websites from bots; (ii) users who click correctly provide correct annotations.

44 Convolutional networks. Convolutional network: each neuron in the first hidden layer is connected to a small region of the inputs, here 3×3 pixels of a 10×10 input image. Slide the region over the input image, and use the same weights for all hidden neurons. This gives an 8×8 array V of hidden neurons, updated as

V_mn = g( ∑_{k=1}^3 ∑_{l=1}^3 w_kl x_{m+k,n+l} − θ )

The form of the sum is called a convolution. The network detects the same local feature (edge, corner, ...) everywhere in the input image: a feature map. This is efficient, because there are fewer weights. Use several feature maps to detect different features in the input image.
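
A direct NumPy sketch of this update (a valid convolution of a 10×10 image with a single 3×3 kernel, producing an 8×8 feature map; the activation g, the kernel w, and theta are placeholders):

```python
import numpy as np

def feature_map(image, w, theta, g=np.tanh):
    """V[m,n] = g(sum_{k,l} w[k,l] * image[m+k, n+l] - theta)."""
    K = w.shape[0]                # kernel size, here 3
    M = image.shape[0] - K + 1    # output size: 10 - 3 + 1 = 8
    V = np.empty((M, M))
    for m in range(M):
        for n in range(M):
            V[m, n] = g(np.sum(w * image[m:m+K, n:n+K]) - theta)
    return V

image = np.random.default_rng(0).random((10, 10))
w = np.ones((3, 3)) / 9.0         # a simple averaging kernel
print(feature_map(image, w, theta=0.5).shape)   # (8, 8)
```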

45 Convolutional networks. Max-pooling: take the maximum of V_mn over a small region (2×2) to reduce the number of parameters (architecture: inputs → hidden feature maps → max-pooling → outputs). Add fully connected layers to learn more abstract features. Example: Krizhevsky, Sutskever & Hinton (2012).
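
The corresponding 2×2 max-pooling step, as a small sketch (assumes the feature map's side length is even):

```python
import numpy as np

def max_pool(V):
    """Maximum over non-overlapping 2x2 blocks of the feature map V."""
    M = V.shape[0] // 2
    return V[:2*M, :2*M].reshape(M, 2, M, 2).max(axis=(1, 3))

V = np.arange(64, dtype=float).reshape(8, 8)
print(max_pool(V).shape)   # (4, 4): each entry is the max of one 2x2 block
```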

46 Object detection. Deep learning with a convolutional network. Here: Yolo-9000 with a tiny number of weights (so that the algorithm runs on my laptop), pretrained on the Pascal VOC training set. Redmon et al., github.com/pjreddie/darknet/wiki/yolo:-real-time-object-detection Installation instructions: github.com/philipperemy/yolo-9000

47 Yolo-9000. Object detection from video frames using Yolo-9000, github.com/philipperemy/yolo-9000 Film recorded with an iPhone 6 on my way home from work. Tiny number of weights (so that the algorithm runs on my laptop). Pretrained on the Pascal VOC data set, host.robots.ox.ac.uk/pascal/voc/

48 Deep learning for self-driving cars. Zenuity: a joint venture between Autoliv and Volvo Cars. Goal: develop software for assisted and autonomous driving. Founded in 2017; employees mostly in Gothenburg.

49 Conclusions. Artificial neural networks were already studied in the 1980s. In the last few years: a deep-learning revolution, due to improved algorithms (focus on object detection), better hardware (GPUs, dedicated chips), and better training sets. Applications: Google, Facebook, Tesla, Zenuity, ... Perspectives: close connections to statistical physics (spin glasses, random magnets); related methods in mathematical statistics (big-data analysis); unsupervised learning (clustering and familiarity of input data).

50 Conclusions. xkcd.com/1897

51 Literature. Neural networks: Mehlig, Artificial Neural Networks, physics.gu.se/~frtbm; Hertz, Krogh & Palmer, Introduction to the Theory of Neural Computation (Santa Fe Institute Series). Deep learning: Nielsen, Neural Networks and Deep Learning, neuralnetworksanddeeplearning.com; Goodfellow, Bengio & Courville, Deep Learning, MIT Press; LeCun, Bengio & Hinton, Deep learning, Nature 521 (2015) 436. Convolutional networks: Goodfellow, Bengio & Courville, Deep Learning, MIT Press.

52 Lecture 2: Hopfield model

53 Associative memory problem
