COMP9444 Neural Networks and Deep Learning 3. Backpropagation

Textbook, Sections 4.3, 5.2, 6.5.2

Outline
Supervised Learning
Ockham's Razor (5.2)
Multi-Layer Networks
Gradient Descent (4.3, 6.5.2)

Types of Learning
Supervised Learning: the agent is presented with examples of inputs and their target outputs.
Reinforcement Learning: the agent is not presented with target outputs, but is given a reward signal, which it aims to maximize.
Unsupervised Learning: the agent is only presented with the inputs themselves, and aims to find structure in these inputs.

Supervised Learning
We have a training set and a test set, each consisting of a set of items; for each item, a number of input attributes and a target value are specified.
The aim is to predict the target value, based on the input attributes.
The agent is presented with the input and target output for each item in the training set; it must then predict the output for each item in the test set.
Various learning paradigms are available: Neural Network, Decision Tree, Support Vector Machine, etc.

Supervised Learning Issues
framework (decision tree, neural network, SVM, etc.)
representation (of inputs and outputs)
pre-processing / post-processing
training method (perceptron learning, backpropagation, etc.)
generalization (avoid over-fitting)
evaluation (separate training and testing sets)

Curve Fitting
Which curve gives the best fit to these data? [scatter plot of data points, f(x) against x]
A straight line? A parabola? A 4th-order polynomial? Something else?

Ockham's Razor
"The most likely hypothesis is the simplest one consistent with the data."
[figure: three fits to the same data - inadequate, good compromise, over-fitting]
Since there can be noise in the measurements, in practice we need to make a tradeoff between the simplicity of the hypothesis and how well it fits the data.

Outliers
Predicted Buchanan Votes by County [faculty.washington.edu/mtbrett]

Butterfly Ballot

Recall: Limitations of Perceptrons
Problem: many useful functions are not linearly separable (e.g. XOR)
[figure: I_1 vs I_2 plots for (a) I_1 AND I_2, (b) I_1 OR I_2, (c) I_1 XOR I_2 - AND and OR are linearly separable, XOR is not]
Possible solution: x_1 XOR x_2 can be written as (x_1 AND x_2) NOR (x_1 NOR x_2).
Recall that AND, OR and NOR can be implemented by perceptrons.
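
A quick way to see this decomposition in action is the following sketch (my own Python illustration, not from the lecture), which builds AND and NOR as fixed-weight threshold perceptrons and combines them exactly as above; the particular weight and threshold values are just one convenient choice.

    # Sketch (illustrative only): fixed-weight threshold perceptrons for AND and NOR,
    # combined as (x1 AND x2) NOR (x1 NOR x2), which equals x1 XOR x2.

    def step(s):
        return 1 if s >= 0 else 0

    def AND(x1, x2):          # fires only when both inputs are 1
        return step(-1.5 + x1 + x2)

    def NOR(x1, x2):          # fires only when both inputs are 0
        return step(0.5 - x1 - x2)

    def XOR(x1, x2):          # two-layer construction from the slide
        return NOR(AND(x1, x2), NOR(x1, x2))

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, XOR(x1, x2))   # prints the XOR truth table: 0, 1, 1, 0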

Multi-Layer Neural Networks
[diagram: a two-layer network computing XOR from NOR, AND and NOR units]
Problem: How can we train it to learn a new function? (credit assignment)

Two-Layer Neural Network
[diagram: output units a_i, connected by weights W_{j,i} to hidden units a_j, connected by weights W_{k,j} to input units a_k]
Normally, the numbers of input and output units are fixed, but we can choose the number of hidden units.

The XOR Problem
x_1  x_2  target
 0    0     0
 0    1     1
 1    0     1
 1    1     0
For this toy problem, there is only a training set; there is no validation or test set, so we don't worry about overfitting.
The XOR data cannot be learned with a perceptron, but can be learned using a 2-layer network with two hidden units.

Neural Network Equations
[diagram: inputs x_1, x_2 feed hidden units u_1, u_2 (biases b_1, b_2, weights w_11, w_12, w_21, w_22); hidden outputs y_1, y_2 feed the output unit s (bias c, weights v_1, v_2), giving output z]
u_1 = b_1 + w_11 x_1 + w_12 x_2
y_1 = g(u_1)
s = c + v_1 y_1 + v_2 y_2
z = g(s)
E = (1/2)(z - t)^2
We sometimes use w as a shorthand for any of the trainable weights {c, v_1, v_2, b_1, b_2, w_11, w_21, w_12, w_22}.
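
To make the equations concrete, here is a small forward-pass sketch in Python (my own illustration, not from the slides); the weight values are arbitrary placeholders, and g is taken to be the sigmoid introduced later in the lecture.

    import math

    def g(s):                           # sigmoid transfer function
        return 1.0 / (1.0 + math.exp(-s))

    # arbitrary placeholder weights (not from the slides)
    b1, b2, c = 0.1, -0.2, 0.05
    w11, w12, w21, w22 = 0.4, -0.3, 0.7, 0.2
    v1, v2 = 0.6, -0.5

    def forward(x1, x2, t):
        u1 = b1 + w11 * x1 + w12 * x2   # hidden unit 1
        u2 = b2 + w21 * x1 + w22 * x2   # hidden unit 2
        y1, y2 = g(u1), g(u2)
        s = c + v1 * y1 + v2 * y2       # output unit
        z = g(s)
        E = 0.5 * (z - t) ** 2          # squared error for this pattern
        return z, E

    print(forward(1, 0, 1))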

NN Training as Cost Minimization
We define an error function E to be (half) the sum over all input patterns of the square of the difference between actual output and desired output:
E = (1/2)(z - t)^2
If we think of E as height, it defines an error landscape on the weight space. The aim is to find a set of weights for which E is very low.

Local Search in Weight Space
Problem: because of the step function, the landscape will not be smooth but will instead consist almost entirely of flat local regions and shoulders, with occasional discontinuous jumps.

Key Idea
[figure: activation a_i plotted against input in_i for (a) a step function, (b) a sign function, (c) a sigmoid function]
Replace the (discontinuous) step function with a differentiable function, such as the sigmoid
g(s) = 1/(1 + e^{-s})
or hyperbolic tangent
g(s) = tanh(s) = (e^s - e^{-s})/(e^s + e^{-s}) = 2(1/(1 + e^{-2s})) - 1
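
The identity quoted above relating the two transfer functions, tanh(s) = 2/(1 + e^{-2s}) - 1, can be checked with a short sketch (mine, using only Python's standard library):

    import math

    def sigmoid(s):
        return 1.0 / (1.0 + math.exp(-s))

    def tanh(s):
        return (math.exp(s) - math.exp(-s)) / (math.exp(s) + math.exp(-s))

    # tanh(s) = 2 * sigmoid(2s) - 1, as stated on the slide
    for s in (-2.0, -0.5, 0.0, 1.0, 3.0):
        assert abs(tanh(s) - (2.0 * sigmoid(2.0 * s) - 1.0)) < 1e-12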

Gradient Descent (4.3)
Recall that the error function E is (half) the sum over all input patterns of the square of the difference between actual output and desired output:
E = (1/2)(z - t)^2
The aim is to find a set of weights for which E is very low. If the functions involved are smooth, we can use multi-variable calculus to adjust the weights in such a way as to take us in the steepest downhill direction:
w ← w - η ∂E/∂w
The parameter η is called the learning rate.
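
In code, one step of gradient descent is just the update w ← w - η ∂E/∂w applied to each trainable weight. The sketch below (an illustration, not course code) minimises a one-dimensional error E(w) = (1/2)(w - 3)^2, whose gradient is w - 3.

    eta = 0.1            # learning rate
    w = 0.0              # initial weight

    def dE_dw(w):        # gradient of E(w) = 0.5 * (w - 3)**2
        return w - 3.0

    for step in range(100):
        w = w - eta * dE_dw(w)    # steepest-descent update

    print(w)             # approaches the minimum at w = 3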

Chain Rule (6.5.2)
If, say, y = y(u) and u = u(x), then
∂y/∂x = (∂y/∂u)(∂u/∂x)
This principle can be used to compute the partial derivatives in an efficient and localized manner. Note that the transfer function must be differentiable (usually sigmoid, or tanh).
Note: if z(s) = 1/(1 + e^{-s}), then z'(s) = z(1 - z); if z(s) = tanh(s), then z'(s) = 1 - z^2.
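
The two derivative identities in the note can be sanity-checked numerically with finite differences; this is my own quick sketch, not from the textbook.

    import math

    def sigmoid(s):
        return 1.0 / (1.0 + math.exp(-s))

    eps = 1e-6
    for s in (-1.5, 0.0, 0.8, 2.0):
        z = sigmoid(s)
        numeric = (sigmoid(s + eps) - sigmoid(s - eps)) / (2 * eps)
        assert abs(numeric - z * (1 - z)) < 1e-6        # z'(s) = z(1 - z)

        z = math.tanh(s)
        numeric = (math.tanh(s + eps) - math.tanh(s - eps)) / (2 * eps)
        assert abs(numeric - (1 - z * z)) < 1e-6        # z'(s) = 1 - z^2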

Forward Pass
[same network diagram as in the Neural Network Equations slide]
u_1 = b_1 + w_11 x_1 + w_12 x_2
y_1 = g(u_1)
s = c + v_1 y_1 + v_2 y_2
z = g(s)
E = (1/2)(z - t)^2

Backpropagation Partial Derivatives
∂E/∂z = z - t
dz/ds = g'(s) = z(1 - z)
∂s/∂y_1 = v_1
dy_1/du_1 = y_1(1 - y_1)
Useful notation:
δ_out = ∂E/∂s,  δ_1 = ∂E/∂u_1,  δ_2 = ∂E/∂u_2
Then
δ_out = (z - t) z(1 - z)
∂E/∂v_1 = δ_out y_1
δ_1 = δ_out v_1 y_1(1 - y_1)
∂E/∂w_11 = δ_1 x_1
Partial derivatives can be calculated efficiently by backpropagating deltas through the network.
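
Putting the forward pass and these deltas together, the following sketch (my own illustration, not the course's reference code) trains the 2-2-1 network above on the XOR data by on-line gradient descent; the initial weights are small random values and the learning rate is an arbitrary choice. With an unlucky initialisation the network can settle in a poor local minimum, in which case a different random seed usually helps.

    import math, random

    random.seed(0)
    g = lambda s: 1.0 / (1.0 + math.exp(-s))    # sigmoid transfer function

    # small random initial weights (arbitrary choice)
    b1, b2, c = [random.uniform(-0.5, 0.5) for _ in range(3)]
    w11, w12, w21, w22, v1, v2 = [random.uniform(-0.5, 0.5) for _ in range(6)]

    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]   # XOR
    eta = 1.0                                   # learning rate (arbitrary)

    for epoch in range(10000):
        for (x1, x2), t in data:
            # forward pass
            u1 = b1 + w11 * x1 + w12 * x2
            u2 = b2 + w21 * x1 + w22 * x2
            y1, y2 = g(u1), g(u2)
            s = c + v1 * y1 + v2 * y2
            z = g(s)
            # backward pass: deltas as defined on the slide
            d_out = (z - t) * z * (1 - z)       # delta_out = dE/ds
            d1 = d_out * v1 * y1 * (1 - y1)     # delta_1  = dE/du1
            d2 = d_out * v2 * y2 * (1 - y2)     # delta_2  = dE/du2
            # gradient-descent updates: w <- w - eta * dE/dw
            c  -= eta * d_out
            v1 -= eta * d_out * y1
            v2 -= eta * d_out * y2
            b1 -= eta * d1
            w11 -= eta * d1 * x1
            w12 -= eta * d1 * x2
            b2 -= eta * d2
            w21 -= eta * d2 * x1
            w22 -= eta * d2 * x2

    for (x1, x2), t in data:                    # outputs should approach the targets
        z = g(c + v1 * g(b1 + w11 * x1 + w12 * x2) + v2 * g(b2 + w21 * x1 + w22 * x2))
        print(x1, x2, round(z, 3), t)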

Applications of Two-Layer NNs
Medical Diagnosis
Autonomous Driving
Game Playing
Credit Card Fraud Detection
Handwriting Recognition
Financial Prediction

Example: Pima Indians Diabetes Dataset
Attributes:
1. Number of times pregnant
2. Plasma glucose concentration
3. Diastolic blood pressure (mm Hg)
4. Triceps skin fold thickness (mm)
5. 2-Hour serum insulin (mu U/ml)
6. Body mass index (weight in kg / (height in m)^2)
7. Diabetes pedigree function
8. Age (years)
9. Class variable (0 or 1)

Training Tips
re-scale inputs and outputs to be in the range 0 to 1 or -1 to 1
replace missing values with the mean value for that attribute
initialize weights to very small random values
on-line or batch learning
three different ways to prevent overfitting:
  - limit the number of hidden nodes or connections
  - limit the training time, using a validation set
  - weight decay
adjust learning rate (and momentum) to suit the particular task
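
Some of these tips translate directly into short preprocessing steps. The sketch below is illustrative only, operating on a small hypothetical list-of-rows dataset (None marking a missing value): it replaces missing values with the attribute mean, re-scales each attribute to the range 0 to 1, and initialises weights to very small random values.

    import random

    # hypothetical raw data: rows of attribute values, None = missing
    raw = [[2.0, 120.0, 70.0], [5.0, None, 80.0], [9.0, 155.0, None]]

    # attribute means, computed over the non-missing entries
    means = [sum(v for v in col if v is not None) / sum(1 for v in col if v is not None)
             for col in zip(*raw)]

    # replace missing values with the mean value for that attribute
    filled = [[means[j] if v is None else v for j, v in enumerate(row)] for row in raw]

    # re-scale each attribute to the range 0 to 1 (assumes no constant column)
    lo = [min(col) for col in zip(*filled)]
    hi = [max(col) for col in zip(*filled)]
    scaled = [[(v - lo[j]) / (hi[j] - lo[j]) for j, v in enumerate(row)] for row in filled]

    # initialise weights to very small random values
    weights = [random.uniform(-0.01, 0.01) for _ in range(9)]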

Overfitting in Neural Networks
[plot: error versus number of weight updates (example 1), showing training set error and validation set error]
Note: the x-axis could also be the number of hidden nodes or connections

Overfitting in Neural Networks
[plot: error versus number of weight updates (example 2), showing training set error and validation set error]

ALVINN (Pomerleau 1991, 1993)

ALVINN
[network diagram with steering outputs: Sharp Left, Straight Ahead, Sharp Right]
30 Output Units
4 Hidden Units
30 x 32 Sensor Input Retina

ALVINN
Autonomous Land Vehicle In a Neural Network
later version included a sonar range finder
8 x 32 range finder input retina
29 hidden units
45 output units
Supervised Learning, from human actions (Behavioral Cloning)
additional transformed training items to cover emergency situations
drove autonomously from coast to coast

Summary
Neural networks are biologically inspired
Multi-layer neural networks can learn non-linearly separable functions
Backpropagation is effective and widely used
