Agenda. What is Machine Learning? Learning. Type of Learning: Supervised, Unsupervised and Semi-supervised. Classification


1 Artificial Intelligence and its Applications
Lecture 6: Supervised Learning
Professor Daniel Yeung danyeung@ieee.org
Dr. Patrick Chan patrickchan@ieee.org
South China University of Technology, China

Agenda
Learning
Type of Learning: Supervised, Unsupervised and Semi-supervised
Classification
Definitions
Bayes, Decision Tree
K-NN, NN
SVM, MCS

What is Machine Learning?
"Changes in the system that are adaptive in the sense that they enable the system to do the same task (or tasks drawn from a population of similar tasks) more effectively the next time." (Herbert Alexander Simon, 1983; Turing Award 1975, Nobel Prize in Economics 1978)
"Improving some task T based on experience E with respect to performance measure P." (Tom Michael Mitchell, 1997; Chair of the Machine Learning Department at Carnegie Mellon University)

Machine Learning: An Example (Salmon / Sea Bass)
A fish packing plant wants to automate the process of sorting incoming fish (salmon / sea bass) on a belt according to species.

2 Procedure
Sensing: measuring device, e.g. a camera. Digitize the object into a format which can be handled by machines. What can cause problems during sensing? E.g. lighting conditions, position of the fish on the conveyor belt, camera noise, etc.
Preprocessing: refine the data. Object isolation, noise reduction, etc.
Feature extraction / selection: what kind of information can distinguish one species of fish from the other? E.g. length, width, weight, number and shape of fins, tail shape, etc. Experts (fishermen) may help: a salmon is usually shorter than a sea bass, so length is chosen as a feature (decision criterion).
Model learning: classification, regression, etc.
Evaluation: cross-validation, bootstrap, etc.

Pipeline: Object -> Sensing -> Image -> Preprocessing -> Refined Image -> Feature Extraction -> Input Features -> Model Learning -> Model -> Evaluation -> Output

3 Feature Extraction
Length: a threshold l* = 5 is selected on the histograms of the length feature for the two types of fish in the training samples. Although a sea bass is longer than a salmon in general, there are many exceptions. The experts may be wrong!
How about other features, e.g. lightness? A threshold of 5.5 is selected on the histograms of the lightness feature for the two types of fish in the training samples. Using lightness as a feature is much better than using length.
How about using two features, lightness and width? Many techniques (classifiers) are available (discussed in detail later). Is a linear classifier too simple? It gives a large error on the training set.
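The single-feature threshold idea above can be sketched as follows. This is a minimal illustration with made-up length values; the helper name and data are assumptions, not taken from the slides.

```python
def best_threshold(lengths, labels):
    """Pick the cut t minimizing training error for the rule:
    length <= t -> 'salmon', otherwise 'seabass'."""
    best_t, best_err = None, float("inf")
    for t in sorted(set(lengths)):
        err = sum(1 for x, y in zip(lengths, labels)
                  if ("salmon" if x <= t else "seabass") != y)
        if err < best_err:
            best_t, best_err = t, err
    return best_t, best_err
```

In practice the threshold would be read off the training histograms exactly as the slide describes; the exhaustive scan here is just the brute-force equivalent.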

4 Model Learning
A more complex classifier can classify the training samples perfectly. But will it be fitted too closely to the training samples (overfitting)?
A classifier of moderate complexity looks more reasonable: not too complex, and good at classifying both the training samples and unseen samples.
Too simple: performance on the training samples is not good. Too complex: cannot be generalized to unseen samples. There is a tradeoff between accuracy on the training samples and complexity.
Ultimate objective: performance on unseen samples (generalization ability). It cannot be calculated, only estimated.

Evaluation
Evaluate whether a trained model generalizes the knowledge from training samples to future unseen samples.
Measurement: using training samples / test samples; repeat experiments.

5 Evaluation
Error:
Classification error: class output.
Mean square error: real-number output.
Cost: the costs of misclassifying different classes may be different.
Case 1: the company's view. Salmon is more expensive than sea bass; selling salmon at the price of sea bass is a loss. If a fish is a salmon but is classified as sea bass: HIGH cost. If a fish is a sea bass but is classified as salmon: LOW cost.
Case 2: the customer's view. Customers who buy salmon will be very upset if they get sea bass; customers who buy sea bass will not be upset if they get the more expensive salmon. If a fish is a salmon but is classified as sea bass: LOW cost. If a fish is a sea bass but is classified as salmon: HIGH cost.
How would these cost considerations affect our decision? Case 1: more sea bass are mistaken as salmon. Case 2: more salmon are mistaken as sea bass.
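The effect of asymmetric costs on the decision can be made concrete with a minimum-expected-cost rule. A minimal sketch, assuming hypothetical posterior probabilities and cost values (none of these numbers come from the slides):

```python
def min_cost_class(posteriors, cost):
    """Pick the class whose expected misclassification cost is lowest.

    posteriors[j]: P(true class j | x)
    cost[i][j]:    cost of deciding class i when the truth is class j
    """
    n = len(posteriors)
    expected = [sum(cost[i][j] * posteriors[j] for j in range(n))
                for i in range(n)]
    return min(range(n), key=lambda i: expected[i])
```

With class 0 = salmon and class 1 = sea bass, a company-view cost matrix that heavily penalizes calling a salmon "sea bass" shifts decisions toward salmon, exactly the "more sea bass mistaken as salmon" behavior described above.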

6 Pattern Classification Problem
Artificial dataset: O marks a training sample in Class 1, X a training sample in Class 2. Will the classifier recognize unseen samples (the red points)?
Train a classifier to approximate the unknown input-output system using the training dataset. For neural networks, this is usually done by minimizing the MSE between network outputs and desired outputs on the training dataset, the training error R_emp, i.e. the average error over the finite training dataset:

R_emp = (1/l) Σ_{b=1..l} ( f(X^(b)) - F(X^(b)) )^2    (1)

where l, f and F denote the number of training samples, the classifier output and the desired output, respectively.
Instead of minimizing the training error R_emp only, the ultimate goal of classifier training is to correctly predict the class/category of future unseen samples. R_true is the expected error over all samples:

R_true = ∫ ( f(X) - F(X) )^2 p(X) dX    (2)

and R_gen is the generalization error for unseen samples:

R_gen = R_true - R_emp    (3)

R_gen is not computable; it can only be estimated, by empirical methods or analytical models.

References:
V. Vapnik, Statistical Learning Theory, Wiley, 1998.
R.O. Duda, P.E. Hart and D.G. Stork, Pattern Classification, Wiley, 2001.
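Equation (1) above translates directly into code. A minimal sketch (the function name is illustrative):

```python
def empirical_error(f, X, F):
    """R_emp of equation (1): the mean squared error of classifier f
    over the l training samples X with desired outputs F."""
    return sum((f(x) - Fx) ** 2 for x, Fx in zip(X, F)) / len(X)
```

R_true and R_gen, by contrast, cannot be computed this way, since they involve the unknown distribution p(X).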

7 Generalization Error Models
R_gen is not computable, only estimated:
Empirical models: K-fold Cross-Validation (CV), Leave-One-Out Cross-Validation (LOOCV).
Analytical models: Information Criteria; VC-dimension based.

On the artificial dataset (O: training sample in Class 1, X: training sample in Class 2), a model trained by minimizing R_emp only easily over-fits and has high R_gen: it fits every training sample but misclassifies the unseen testing samples F(x).
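K-fold cross-validation, the first empirical estimator listed above, can be sketched as follows; `fit` and `err` are caller-supplied callables, an assumption made for illustration.

```python
def kfold_estimate(fit, err, X, y, k):
    """Estimate generalization error by K-fold CV: hold out each fold,
    fit on the remaining samples, and average the held-out errors."""
    n = len(X)
    folds = [set(range(i, n, k)) for i in range(k)]
    scores = []
    for fold in folds:
        tr = [i for i in range(n) if i not in fold]
        model = fit([X[i] for i in tr], [y[i] for i in tr])
        te = sorted(fold)
        scores.append(err(model, [X[i] for i in te], [y[i] for i in te]))
    return sum(scores) / k
```

Setting k equal to the number of samples gives leave-one-out cross-validation (LOOCV).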

8 On the artificial dataset, a model trained by minimizing R_emp only easily over-fits and has high R_gen. A model trained by minimizing the complexity (h) only easily under-fits, over-minimizing the complexity.

9 A model trained by minimizing both R_emp and h balances fit and complexity (classifier f(x) on the artificial dataset).
The central question in pattern recognition: find the best f to approximate F.

Type of Learning
Supervised learning: a teacher tells you whether the answer is correct.
Reinforcement learning: no teacher, but there is feedback; you are told the reward of an action, but not which action is correct.
Unsupervised learning: no teacher.

10 K-Nearest Neighbor (K-NN)
A new pattern is classified by a majority vote of its neighbors: the pattern is assigned to the class most common amongst its K nearest training samples.
Algorithm, given an unseen sample (the green star):
1. Calculate the distance between the unseen sample and each training sample.
2. Select the K nearest training samples.
3. Take a majority vote among these K samples.
Example: K = 1: Triangle; K = 3: Circle; K = 5: Triangle.
Noisy data: on a simple linearly separable dataset, the unseen sample in question should obviously be identified as a Triangle. However, if there is a noise sample next to this unseen sample, a 1-NN classifier will classify it wrongly as a Circle. A larger K can solve this problem.
Is a larger K always better? No: this unseen sample is obviously a Circle, and a 5-NN classifier classifies it correctly, but a 9-NN classifier will classify it as a Triangle.
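The three steps above can be sketched as a small K-NN classifier. The toy points and labels below are made up for illustration; they are not the figures from the slides.

```python
from collections import Counter

def knn_predict(train_pts, train_labels, query, k):
    """Classify query by majority vote among its k nearest training points
    (squared Euclidean distance)."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(p, query)), lab)
        for p, lab in zip(train_pts, train_labels)
    )
    votes = Counter(lab for _, lab in dists[:k])
    return votes.most_common(1)[0][0]
```

Adding a single mislabeled point next to a query flips the 1-NN decision but not the 3-NN decision, which is exactly the noise-robustness argument for a larger K.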

11 K-Nearest Neighbor (K-NN)
Advantages: very simple; all computations are deferred until classification; no training is needed.
Disadvantages: difficult to determine K; affected by noisy training data; classification is time consuming, since the distance between the unseen sample and each training sample must be calculated.

Decision Tree (DT)
One of the most widely used and practical methods for inductive inference, e.g. C4.5. Approximates discrete-valued functions (including disjunctions).
Do we go to play tennis today?
If Outlook is Sunny AND Humidity is Normal: Yes
If Outlook is Overcast: Yes
If Outlook is Rain AND Wind is Weak: Yes
Any other situation: No
Each node thresholds a single feature, so the decision regions are axis-aligned (x1 > a, x2 > b, ...).

12 Decision Tree (DT)
Internal nodes can be multivariate: more than one feature is used (e.g. a·x1 + b·x2 + c > 0), in which case the shape of the decision region is irregular.
How to determine which feature should be used at a node? Information Gain:

Gain(X, A) = H(X) - H(X|A)

the reduction in entropy (reduced uncertainty) due to sorting on feature A, where H(X) is the current entropy and H(X|A) is the entropy after using feature A. Entropy is a measure of uncertainty (smaller value: less uncertainty, more information gain):

H(X) = - Σ_{i=1..n} p(x_i) log p(x_i)

where X is a random variable with n outcomes {x_i : i = 1, ..., n} and p(x_i) is the probability mass function of outcome x_i.

Learning (LOOP): select the best feature A according to information gain; for each value of A, create a new descendant of the node; sort the training samples to the leaf nodes. STOP when the training samples are perfectly classified.
Rule extraction from trees: a decision tree can be used for feature extraction (e.g. seeing which features are useful). Interpretability: easy to understand, and a compact and fast classification method.
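The entropy and information-gain formulas above can be sketched directly (using log base 2, a common convention the slides do not spell out):

```python
from math import log2

def entropy(labels):
    """H(X) = -sum p(x_i) log2 p(x_i), estimated from label frequencies."""
    n = len(labels)
    counts = {}
    for lab in labels:
        counts[lab] = counts.get(lab, 0) + 1
    return -sum((c / n) * log2(c / n) for c in counts.values())

def information_gain(labels, feature_values):
    """Gain(X, A) = H(X) - H(X|A): entropy reduction from splitting on A."""
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [lab for lab, fv in zip(labels, feature_values) if fv == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond
```

A feature that perfectly separates the classes has gain equal to the full entropy; a feature independent of the class has gain 0, which is why the LOOP step picks the highest-gain feature.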

13 Neural Network (NN)
A very simplified model of the brain, composed of neurons which cooperate together. Like the human brain, an artificial NN transforms inputs into outputs to the best of its ability: it is basically a function approximator.
How does a neuron work? The output of a neuron is a function of the weighted sum of the inputs plus a bias (an optional unit which always emits a value of 1 or -1, acting as a threshold):

Ө = f(w1·I1 + w2·I2 + ... + wd·Id + bias)

where I1, ..., Id are the inputs, w1, ..., wd are the synapse weights, and f is the activation function.

14 Artificial Neural Network (ANN): Activation Function
The function f is the activation function. Examples:
Linear function: f(x) = x. The output is the same as the input. Differentiable.
Sign function: f(x) = 1 if x > 0, -1 if x < 0. Decision making. Not differentiable.
Sigmoid function: f(x) = 1 / (1 + e^(-x)). Smooth, continuous, and monotonically increasing (differentiable).

XOR Example
The first hidden unit is an OR gate: y1 = sgn(x1 + x2 + 0.5).
The second hidden unit is an AND gate: y2 = sgn(x1 + x2 - 1.5).
The final output unit, with weights 0.7 and -0.4 on y1 and y2, is an AND NOT gate: y = y1 AND NOT y2.
For x1 = x2 = 1: y1 = sgn(2.5) = 1, y2 = sgn(0.5) = 1, so y = -1.
For x1 = x2 = -1: y1 = sgn(-1.5) = -1, y2 = sgn(-3.5) = -1, so y = -1.
Overall, y = (x1 OR x2) AND NOT (x1 AND x2) = x1 XOR x2.
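The XOR network above can be sketched as code. The hidden-unit biases (+0.5 and -1.5) and the output weights (0.7, -0.4) follow the slide; the output unit's bias of -0.4 is an assumption here, chosen so that the gate behaves as y1 AND NOT y2, since that digit did not survive transcription.

```python
def sgn(x):
    return 1 if x > 0 else -1

def neuron(inputs, weights, bias):
    """Output = sgn(weighted sum of inputs plus bias)."""
    return sgn(sum(w * i for w, i in zip(weights, inputs)) + bias)

def xor_net(x1, x2):
    y1 = neuron([x1, x2], [1, 1], 0.5)    # OR gate (from the slide)
    y2 = neuron([x1, x2], [1, 1], -1.5)   # AND gate (from the slide)
    # y1 AND NOT y2; the -0.4 bias is an assumed value, not from the slide
    return neuron([y1, y2], [0.7, -0.4], -0.4)
```

With inputs encoded as +1/-1, the network outputs +1 exactly when the inputs differ, i.e. XOR.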

15 Artificial Neural Network (ANN): Architecture
A simple three-layer neural network:
Input layer: input units, e.g. x1 = length of a fish, x2 = lightness of a fish (salmon / sea bass).
Hidden layer: 3 hidden units; each hidden neuron is a combination of length and lightness (a discriminant).
Output layer: output units giving the final output g.
w_ij: weight assigning the importance of each input to a neuron.

Learning
Forward computation:
net_j = Σ_{m=1..d} w_jm · x_m,  y_j = f(net_j)   (hidden units)
net'_k = Σ_{i=1..n} w'_ki · y_i,  g_k = f(net'_k)   (output units)
The weights are determined by training: reduce the error between the desired outputs and the NN outputs on the training samples. The back-propagation algorithm is the most widely used method for determining the weights; it is a natural extension of the LMS algorithm. Pros: a simple and general method. Cons: slow, and can be trapped at local minima.
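The two-layer forward computation above maps directly to code. A minimal sketch (weight-matrix layout is an assumption: one row of weights per neuron):

```python
def forward(x, W_hidden, W_out, f):
    """Forward pass of a two-layer network:
    net_j = sum_m w_jm x_m, y_j = f(net_j)   (hidden layer)
    net'_k = sum_i w'_ki y_i, g_k = f(net'_k) (output layer)."""
    y = [f(sum(w * xi for w, xi in zip(row, x))) for row in W_hidden]
    return [f(sum(w * yi for w, yi in zip(row, y))) for row in W_out]
```

With an identity activation this reduces to two matrix-vector products; replacing f with a sigmoid gives the differentiable network that back-propagation requires.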

16 ANN Learning: BP Algorithm
The calculation of the derivative flows backwards through the network; hence it is called back-propagation. These derivatives point in the direction of the maximum increase of the error function (find out where the maximum error is being made and go back to try to decrease this error). A small step (learning rate) in the opposite direction results in the maximum decrease of the (local) error function:

w' = w - α · ∂E/∂w

where α is the learning rate and E the error function.

Support Vector Machine (SVM)
Which one is the best linear separator?
A clever sheep dog is herding its sheep: it runs between the sheep and tries to separate the black sheep from the white sheep. The sheep dog keeps running. The sheep start to grow wool, and the dog feels that the gap between the black sheep and the white sheep is getting narrower.
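The update rule w' = w - α · ∂E/∂w can be sketched for the simplest case, a single linear neuron with squared error (the LMS setting the slide mentions as the ancestor of back-propagation); the function name is illustrative.

```python
def gradient_step(w, x, target, alpha):
    """One LMS-style update on a linear neuron:
    output = sum(w_i * x_i), E = 0.5 * (output - target)^2,
    returns w' = w - alpha * dE/dw."""
    out = sum(wi * xi for wi, xi in zip(w, x))
    grad = [(out - target) * xi for xi in x]   # dE/dw_i
    return [wi - alpha * gi for wi, gi in zip(w, grad)]
```

Back-propagation applies the same step to every weight in a multi-layer network, with the chain rule carrying the derivative backwards through the layers.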

17 Support Vector Machine (SVM)
The wool becomes bigger and bigger; finally, only one path is left: the largest margin, supported by the "sheep vectors". Serendipitously, the sheep invented both large margin classification and sheep vectors. (From: Learning with Kernels, Schölkopf & Smola)
Which one is the best linear separator? Similar to the sheep story, the "wool" (margin) grows around each candidate separator; the one with the largest margin should be the best classifier.

18 Support Vector Machine (SVM)
Equivalently, we can directly seek the classifier with the largest margin and obtain a similar result. The maximum margin depends ONLY on a few samples, called Support Vectors.

SVM: Linearly Separable
The problem can be formulated as a quadratic optimization problem and solved for w and b:

minimize_{w,b}  (1/2) ||w||^2
subject to  y_i (w^T x_i + b) >= 1,  i = 1, ..., n

where y ∈ {-1, +1}. The decision boundary is y(x) = w^T x + b = 0, with margin hyperplanes y(x) = 1 and y(x) = -1; the margin width is 2 / ||w||.

SVM: Non-Linearly Separable
A slack variable ξ_i is added as a punishment, to allow a sample to lie inside or on the wrong side of the margin: ξ_i > 1 means an error, 0 < ξ_i <= 1 means inside the margin but correct, and the ξ of all other samples is 0. The problem becomes:

minimize_{w,b}  (1/2) ||w||^2 + C Σ_{i=1..N} ξ_i
subject to  y_i (w^T x_i + b) >= 1 - ξ_i,  ξ_i >= 0,  i = 1, ..., N

where C is a tradeoff parameter between error and margin.
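The margin width 2/||w|| and the slack values follow directly from the constraints above. A minimal sketch that evaluates them for a given (w, b), without solving the QP itself (the helper name and toy data are assumptions):

```python
from math import sqrt

def margin_and_slacks(w, b, X, y):
    """Margin width 2/||w|| and slacks xi_i = max(0, 1 - y_i (w.x_i + b))."""
    width = 2 / sqrt(sum(wi * wi for wi in w))
    slacks = [max(0.0, 1 - yi * (sum(wi * xi for wi, xi in zip(w, x)) + b))
              for x, yi in zip(X, y)]
    return width, slacks
```

Samples with slack 0 satisfy the hard-margin constraint; a slack between 0 and 1 flags a point inside the margin, and a slack above 1 flags a misclassification, matching the cases listed above.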

19 SVM: Kernel
A kernel is a function which corresponds to mapping the input into a high-dimensional feature space via Φ. Construct a linear SVM in the feature space: similar to the linearly separable case, but change all inner products to kernel functions:

minimize_{w,b}  (1/2) ||w||^2
subject to  y_i (w^T φ(x_i) + b) >= 1,  i = 1, ..., N

Multiple Classifier System (MCS)
Practically, we have no idea which kind of classifier, with which parameters, is best for a problem. A number of different kinds of classifiers with different parameters are trained, and the best classifier is selected according to some criteria. However, selecting the best classifier is not necessarily the ideal choice: different classifiers may contain different valuable information; potentially valuable information may be lost by discarding the results of less successful classifiers; and selecting a wrong classifier will definitely lead to an erroneous classification result.
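The kernel trick rests on a kernel being an inner product in some feature space. A minimal sketch verifying this for the quadratic polynomial kernel K(a, b) = (a·b + 1)^2 in 2-D; this particular kernel is a standard textbook example, chosen here for illustration rather than taken from the slides.

```python
from math import sqrt

def poly_kernel(a, b):
    """K(a, b) = (a . b + 1)^2, computed without any feature mapping."""
    return (sum(ai * bi for ai, bi in zip(a, b)) + 1) ** 2

def phi(x):
    """Explicit feature map for the 2-D quadratic kernel above:
    phi(x) = (x1^2, x2^2, sqrt(2) x1 x2, sqrt(2) x1, sqrt(2) x2, 1)."""
    x1, x2 = x
    return [x1 * x1, x2 * x2, sqrt(2) * x1 * x2,
            sqrt(2) * x1, sqrt(2) * x2, 1.0]
```

K(a, b) equals the inner product of phi(a) and phi(b), so the SVM can work in the 6-dimensional feature space while only ever evaluating the cheap 2-dimensional kernel.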

20 Multiple Classifier System (MCS)
A single trained classifier may not be adequate to handle today's increasingly complex problems; combining classifiers may be a better choice.
Single: sample X is fed into one classifier f, which gives the result.
MCS: sample X is fed into each individual (base) classifier f_1, f_2, ..., f_L; each makes its own decision; the final decision is made by a fusion method combining all individual decisions (the fusion method may depend on X).

Three factors affect the performance (accuracy) of an MCS:
Accuracy of the base classifiers: how good are the base classifiers?
Diversity among the individual classifiers: how different are the decisions reached by the classifiers?
Fusion method: how to combine the classifiers; a method to arrive at a group (MCS) decision.

Two categories according to classifier output type:
Label output: the class which the sample belongs to, e.g. "x is Class 1".
Continuous-valued output: a value (probability) for each class, e.g. "x is Class 1 with 0.7".

21 MCS: Fusion
Label output (democracy methods):
Unanimity (all agree): many unknown cases.
Simple majority (50% + 1): many unknown cases.
Majority vote (most votes), also called plurality: usually can make a decision.

Continuous-valued output (average):

μ_j(x) = Σ_{i=1..L} w_i · d_{i,j}(x)

where w_i is the weight of the i-th classifier and d_{i,j} is the output of the i-th classifier on class j. Simple average: the weight w is the same for every classifier. Weighted average: a more accurate classifier has a larger weight.

Must an MCS be better than a single classifier? In all cases: NO! But in many cases it is true.
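The two fusion rules above, plurality voting for label outputs and the (weighted) average μ_j(x) for continuous outputs, can be sketched as follows (function names are illustrative):

```python
from collections import Counter

def majority_vote(labels):
    """Plurality fusion of label outputs from L base classifiers."""
    return Counter(labels).most_common(1)[0][0]

def weighted_average(outputs, weights=None):
    """mu_j(x) = sum_i w_i d_{i,j}(x): fused per-class support.

    outputs: one list of per-class scores per base classifier.
    weights: defaults to the simple average (equal weights summing to 1).
    """
    L = len(outputs)
    if weights is None:
        weights = [1.0 / L] * L
    n_classes = len(outputs[0])
    return [sum(w * d[j] for w, d in zip(weights, outputs))
            for j in range(n_classes)]
```

Taking the argmax of the fused supports turns the continuous-valued rule back into a single class decision.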


More information

Linear Classification

Linear Classification Linear Classificatin CS 54: Machine Learning Slides adapted frm Lee Cper, Jydeep Ghsh, and Sham Kakade Review: Linear Regressin CS 54 [Spring 07] - H Regressin Given an input vectr x T = (x, x,, xp), we

More information

T Algorithmic methods for data mining. Slide set 6: dimensionality reduction

T Algorithmic methods for data mining. Slide set 6: dimensionality reduction T-61.5060 Algrithmic methds fr data mining Slide set 6: dimensinality reductin reading assignment LRU bk: 11.1 11.3 PCA tutrial in mycurses (ptinal) ptinal: An Elementary Prf f a Therem f Jhnsn and Lindenstrauss,

More information

Image Processing 1 (IP1) Bildverarbeitung 1

Image Processing 1 (IP1) Bildverarbeitung 1 MIN-Fakultät Fachbereich Infrmatik Arbeitsbereich SAV/BV (KOGS) Image Prcessing 1 (IP1) Bildverarbeitung 1 Lecture 15 Pa;ern Recgni=n Winter Semester 2014/15 Dr. Benjamin Seppke Prf. Siegfried S=ehl What

More information

Five Whys How To Do It Better

Five Whys How To Do It Better Five Whys Definitin. As explained in the previus article, we define rt cause as simply the uncvering f hw the current prblem came int being. Fr a simple causal chain, it is the entire chain. Fr a cmplex

More information

STATS216v Introduction to Statistical Learning Stanford University, Summer Practice Final (Solutions) Duration: 3 hours

STATS216v Introduction to Statistical Learning Stanford University, Summer Practice Final (Solutions) Duration: 3 hours STATS216v Intrductin t Statistical Learning Stanfrd University, Summer 2016 Practice Final (Slutins) Duratin: 3 hurs Instructins: (This is a practice final and will nt be graded.) Remember the university

More information

NUMBERS, MATHEMATICS AND EQUATIONS

NUMBERS, MATHEMATICS AND EQUATIONS AUSTRALIAN CURRICULUM PHYSICS GETTING STARTED WITH PHYSICS NUMBERS, MATHEMATICS AND EQUATIONS An integral part t the understanding f ur physical wrld is the use f mathematical mdels which can be used t

More information

Feedforward Neural Networks

Feedforward Neural Networks Feedfrward Neural Netwrks Yagmur Gizem Cinar, Eric Gaussier AMA, LIG, Univ. Grenble Alpes 17 March 2017 Yagmur Gizem Cinar, Eric Gaussier Multilayer Perceptrns (MLP) 17 March 2017 1 / 42 Reference Bk Deep

More information

MATCHING TECHNIQUES. Technical Track Session VI. Emanuela Galasso. The World Bank

MATCHING TECHNIQUES. Technical Track Session VI. Emanuela Galasso. The World Bank MATCHING TECHNIQUES Technical Track Sessin VI Emanuela Galass The Wrld Bank These slides were develped by Christel Vermeersch and mdified by Emanuela Galass fr the purpse f this wrkshp When can we use

More information

Turing Machines. Human-aware Robotics. 2017/10/17 & 19 Chapter 3.2 & 3.3 in Sipser Ø Announcement:

Turing Machines. Human-aware Robotics. 2017/10/17 & 19 Chapter 3.2 & 3.3 in Sipser Ø Announcement: Turing Machines Human-aware Rbtics 2017/10/17 & 19 Chapter 3.2 & 3.3 in Sipser Ø Annuncement: q q q q Slides fr this lecture are here: http://www.public.asu.edu/~yzhan442/teaching/cse355/lectures/tm-ii.pdf

More information

Lab 1 The Scientific Method

Lab 1 The Scientific Method INTRODUCTION The fllwing labratry exercise is designed t give yu, the student, an pprtunity t explre unknwn systems, r universes, and hypthesize pssible rules which may gvern the behavir within them. Scientific

More information

Churn Prediction using Dynamic RFM-Augmented node2vec

Churn Prediction using Dynamic RFM-Augmented node2vec Churn Predictin using Dynamic RFM-Augmented nde2vec Sandra Mitrvić, Jchen de Weerdt, Bart Baesens & Wilfried Lemahieu Department f Decisin Sciences and Infrmatin Management, KU Leuven 18 September 2017,

More information

Biplots in Practice MICHAEL GREENACRE. Professor of Statistics at the Pompeu Fabra University. Chapter 13 Offprint

Biplots in Practice MICHAEL GREENACRE. Professor of Statistics at the Pompeu Fabra University. Chapter 13 Offprint Biplts in Practice MICHAEL GREENACRE Prfessr f Statistics at the Pmpeu Fabra University Chapter 13 Offprint CASE STUDY BIOMEDICINE Cmparing Cancer Types Accrding t Gene Epressin Arrays First published:

More information

A New Evaluation Measure. J. Joiner and L. Werner. The problems of evaluation and the needed criteria of evaluation

A New Evaluation Measure. J. Joiner and L. Werner. The problems of evaluation and the needed criteria of evaluation III-l III. A New Evaluatin Measure J. Jiner and L. Werner Abstract The prblems f evaluatin and the needed criteria f evaluatin measures in the SMART system f infrmatin retrieval are reviewed and discussed.

More information

Distributions, spatial statistics and a Bayesian perspective

Distributions, spatial statistics and a Bayesian perspective Distributins, spatial statistics and a Bayesian perspective Dug Nychka Natinal Center fr Atmspheric Research Distributins and densities Cnditinal distributins and Bayes Thm Bivariate nrmal Spatial statistics

More information

Fall 2013 Physics 172 Recitation 3 Momentum and Springs

Fall 2013 Physics 172 Recitation 3 Momentum and Springs Fall 03 Physics 7 Recitatin 3 Mmentum and Springs Purpse: The purpse f this recitatin is t give yu experience wrking with mmentum and the mmentum update frmula. Readings: Chapter.3-.5 Learning Objectives:.3.

More information

Bootstrap Method > # Purpose: understand how bootstrap method works > obs=c(11.96, 5.03, 67.40, 16.07, 31.50, 7.73, 11.10, 22.38) > n=length(obs) >

Bootstrap Method > # Purpose: understand how bootstrap method works > obs=c(11.96, 5.03, 67.40, 16.07, 31.50, 7.73, 11.10, 22.38) > n=length(obs) > Btstrap Methd > # Purpse: understand hw btstrap methd wrks > bs=c(11.96, 5.03, 67.40, 16.07, 31.50, 7.73, 11.10, 22.38) > n=length(bs) > mean(bs) [1] 21.64625 > # estimate f lambda > lambda = 1/mean(bs);

More information

NAME: Prof. Ruiz. 1. [5 points] What is the difference between simple random sampling and stratified random sampling?

NAME: Prof. Ruiz. 1. [5 points] What is the difference between simple random sampling and stratified random sampling? CS4445 ata Mining and Kwledge iscery in atabases. B Term 2014 Exam 1 Nember 24, 2014 Prf. Carlina Ruiz epartment f Cmputer Science Wrcester Plytechnic Institute NAME: Prf. Ruiz Prblem I: Prblem II: Prblem

More information

CN700 Additive Models and Trees Chapter 9: Hastie et al. (2001)

CN700 Additive Models and Trees Chapter 9: Hastie et al. (2001) CN700 Additive Mdels and Trees Chapter 9: Hastie et al. (2001) Madhusudana Shashanka Department f Cgnitive and Neural Systems Bstn University CN700 - Additive Mdels and Trees March 02, 2004 p.1/34 Overview

More information

Professional Development. Implementing the NGSS: High School Physics

Professional Development. Implementing the NGSS: High School Physics Prfessinal Develpment Implementing the NGSS: High Schl Physics This is a dem. The 30-min vide webinar is available in the full PD. Get it here. Tday s Learning Objectives NGSS key cncepts why this is different

More information

Activity Guide Loops and Random Numbers

Activity Guide Loops and Random Numbers Unit 3 Lessn 7 Name(s) Perid Date Activity Guide Lps and Randm Numbers CS Cntent Lps are a relatively straightfrward idea in prgramming - yu want a certain chunk f cde t run repeatedly - but it takes a

More information

Hypothesis Tests for One Population Mean

Hypothesis Tests for One Population Mean Hypthesis Tests fr One Ppulatin Mean Chapter 9 Ala Abdelbaki Objective Objective: T estimate the value f ne ppulatin mean Inferential statistics using statistics in rder t estimate parameters We will be

More information

INSTRUMENTAL VARIABLES

INSTRUMENTAL VARIABLES INSTRUMENTAL VARIABLES Technical Track Sessin IV Sergi Urzua University f Maryland Instrumental Variables and IE Tw main uses f IV in impact evaluatin: 1. Crrect fr difference between assignment f treatment

More information

The standards are taught in the following sequence.

The standards are taught in the following sequence. B L U E V A L L E Y D I S T R I C T C U R R I C U L U M MATHEMATICS Third Grade In grade 3, instructinal time shuld fcus n fur critical areas: (1) develping understanding f multiplicatin and divisin and

More information

Lecture 02 CSE 40547/60547 Computing at the Nanoscale

Lecture 02 CSE 40547/60547 Computing at the Nanoscale PN Junctin Ntes: Lecture 02 CSE 40547/60547 Cmputing at the Nanscale Letʼs start with a (very) shrt review f semi-cnducting materials: - N-type material: Obtained by adding impurity with 5 valence elements

More information

Computational modeling techniques

Computational modeling techniques Cmputatinal mdeling techniques Lecture 4: Mdel checing fr ODE mdels In Petre Department f IT, Åb Aademi http://www.users.ab.fi/ipetre/cmpmd/ Cntent Stichimetric matrix Calculating the mass cnservatin relatins

More information

Lesson Plan. Recode: They will do a graphic organizer to sequence the steps of scientific method.

Lesson Plan. Recode: They will do a graphic organizer to sequence the steps of scientific method. Lessn Plan Reach: Ask the students if they ever ppped a bag f micrwave ppcrn and nticed hw many kernels were unppped at the bttm f the bag which made yu wnder if ther brands pp better than the ne yu are

More information

Cells though to send feedback signals from the medulla back to the lamina o L: Lamina Monopolar cells

Cells though to send feedback signals from the medulla back to the lamina o L: Lamina Monopolar cells Classificatin Rules (and Exceptins) Name: Cell type fllwed by either a clumn ID (determined by the visual lcatin f the cell) r a numeric identifier t separate ut different examples f a given cell type

More information

Time, Synchronization, and Wireless Sensor Networks

Time, Synchronization, and Wireless Sensor Networks Time, Synchrnizatin, and Wireless Sensr Netwrks Part II Ted Herman University f Iwa Ted Herman/March 2005 1 Presentatin: Part II metrics and techniques single-hp beacns reginal time znes ruting-structure

More information

CS 109 Lecture 23 May 18th, 2016

CS 109 Lecture 23 May 18th, 2016 CS 109 Lecture 23 May 18th, 2016 New Datasets Heart Ancestry Netflix Our Path Parameter Estimatin Machine Learning: Frmally Many different frms f Machine Learning We fcus n the prblem f predictin Want

More information

NUROP CONGRESS PAPER CHINESE PINYIN TO CHINESE CHARACTER CONVERSION

NUROP CONGRESS PAPER CHINESE PINYIN TO CHINESE CHARACTER CONVERSION NUROP Chinese Pinyin T Chinese Character Cnversin NUROP CONGRESS PAPER CHINESE PINYIN TO CHINESE CHARACTER CONVERSION CHIA LI SHI 1 AND LUA KIM TENG 2 Schl f Cmputing, Natinal University f Singapre 3 Science

More information

Particle Size Distributions from SANS Data Using the Maximum Entropy Method. By J. A. POTTON, G. J. DANIELL AND B. D. RAINFORD

Particle Size Distributions from SANS Data Using the Maximum Entropy Method. By J. A. POTTON, G. J. DANIELL AND B. D. RAINFORD 3 J. Appl. Cryst. (1988). 21,3-8 Particle Size Distributins frm SANS Data Using the Maximum Entrpy Methd By J. A. PTTN, G. J. DANIELL AND B. D. RAINFRD Physics Department, The University, Suthamptn S9

More information

Checking the resolved resonance region in EXFOR database

Checking the resolved resonance region in EXFOR database Checking the reslved resnance regin in EXFOR database Gttfried Bertn Sciété de Calcul Mathématique (SCM) Oscar Cabells OECD/NEA Data Bank JEFF Meetings - Sessin JEFF Experiments Nvember 0-4, 017 Bulgne-Billancurt,

More information

CHAPTER 3 INEQUALITIES. Copyright -The Institute of Chartered Accountants of India

CHAPTER 3 INEQUALITIES. Copyright -The Institute of Chartered Accountants of India CHAPTER 3 INEQUALITIES Cpyright -The Institute f Chartered Accuntants f India INEQUALITIES LEARNING OBJECTIVES One f the widely used decisin making prblems, nwadays, is t decide n the ptimal mix f scarce

More information

Smoothing, penalized least squares and splines

Smoothing, penalized least squares and splines Smthing, penalized least squares and splines Duglas Nychka, www.image.ucar.edu/~nychka Lcally weighted averages Penalized least squares smthers Prperties f smthers Splines and Reprducing Kernels The interplatin

More information

LHS Mathematics Department Honors Pre-Calculus Final Exam 2002 Answers

LHS Mathematics Department Honors Pre-Calculus Final Exam 2002 Answers LHS Mathematics Department Hnrs Pre-alculus Final Eam nswers Part Shrt Prblems The table at the right gives the ppulatin f Massachusetts ver the past several decades Using an epnential mdel, predict the

More information

AP Statistics Notes Unit Two: The Normal Distributions

AP Statistics Notes Unit Two: The Normal Distributions AP Statistics Ntes Unit Tw: The Nrmal Distributins Syllabus Objectives: 1.5 The student will summarize distributins f data measuring the psitin using quartiles, percentiles, and standardized scres (z-scres).

More information

Aircraft Performance - Drag

Aircraft Performance - Drag Aircraft Perfrmance - Drag Classificatin f Drag Ntes: Drag Frce and Drag Cefficient Drag is the enemy f flight and its cst. One f the primary functins f aerdynamicists and aircraft designers is t reduce

More information

, which yields. where z1. and z2

, which yields. where z1. and z2 The Gaussian r Nrmal PDF, Page 1 The Gaussian r Nrmal Prbability Density Functin Authr: Jhn M Cimbala, Penn State University Latest revisin: 11 September 13 The Gaussian r Nrmal Prbability Density Functin

More information

Sequential Allocation with Minimal Switching

Sequential Allocation with Minimal Switching In Cmputing Science and Statistics 28 (1996), pp. 567 572 Sequential Allcatin with Minimal Switching Quentin F. Stut 1 Janis Hardwick 1 EECS Dept., University f Michigan Statistics Dept., Purdue University

More information

Name: Block: Date: Science 10: The Great Geyser Experiment A controlled experiment

Name: Block: Date: Science 10: The Great Geyser Experiment A controlled experiment Science 10: The Great Geyser Experiment A cntrlled experiment Yu will prduce a GEYSER by drpping Ments int a bttle f diet pp Sme questins t think abut are: What are yu ging t test? What are yu ging t measure?

More information

SPH3U1 Lesson 06 Kinematics

SPH3U1 Lesson 06 Kinematics PROJECTILE MOTION LEARNING GOALS Students will: Describe the mtin f an bject thrwn at arbitrary angles thrugh the air. Describe the hrizntal and vertical mtins f a prjectile. Slve prjectile mtin prblems.

More information

MATCHING TECHNIQUES Technical Track Session VI Céline Ferré The World Bank

MATCHING TECHNIQUES Technical Track Session VI Céline Ferré The World Bank MATCHING TECHNIQUES Technical Track Sessin VI Céline Ferré The Wrld Bank When can we use matching? What if the assignment t the treatment is nt dne randmly r based n an eligibility index, but n the basis

More information

Lecture 13: Electrochemical Equilibria

Lecture 13: Electrochemical Equilibria 3.012 Fundamentals f Materials Science Fall 2005 Lecture 13: 10.21.05 Electrchemical Equilibria Tday: LAST TIME...2 An example calculatin...3 THE ELECTROCHEMICAL POTENTIAL...4 Electrstatic energy cntributins

More information

SUPPLEMENTARY MATERIAL GaGa: a simple and flexible hierarchical model for microarray data analysis

SUPPLEMENTARY MATERIAL GaGa: a simple and flexible hierarchical model for microarray data analysis SUPPLEMENTARY MATERIAL GaGa: a simple and flexible hierarchical mdel fr micrarray data analysis David Rssell Department f Bistatistics M.D. Andersn Cancer Center, Hustn, TX 77030, USA rsselldavid@gmail.cm

More information

ENSC Discrete Time Systems. Project Outline. Semester

ENSC Discrete Time Systems. Project Outline. Semester ENSC 49 - iscrete Time Systems Prject Outline Semester 006-1. Objectives The gal f the prject is t design a channel fading simulatr. Upn successful cmpletin f the prject, yu will reinfrce yur understanding

More information

Department of Economics, University of California, Davis Ecn 200C Micro Theory Professor Giacomo Bonanno. Insurance Markets

Department of Economics, University of California, Davis Ecn 200C Micro Theory Professor Giacomo Bonanno. Insurance Markets Department f Ecnmics, University f alifrnia, Davis Ecn 200 Micr Thery Prfessr Giacm Bnann Insurance Markets nsider an individual wh has an initial wealth f. ith sme prbability p he faces a lss f x (0

More information

NGSS High School Physics Domain Model

NGSS High School Physics Domain Model NGSS High Schl Physics Dmain Mdel Mtin and Stability: Frces and Interactins HS-PS2-1: Students will be able t analyze data t supprt the claim that Newtn s secnd law f mtin describes the mathematical relatinship

More information

ENG2410 Digital Design Sequential Circuits: Part A

ENG2410 Digital Design Sequential Circuits: Part A ENG2410 Digital Design Sequential Circuits: Part A Fall 2017 S. Areibi Schl f Engineering University f Guelph Week #6 Tpics Sequential Circuit Definitins Latches Flip-Flps Delays in Sequential Circuits

More information

Coalition Formation and Data Envelopment Analysis

Coalition Formation and Data Envelopment Analysis Jurnal f CENTRU Cathedra Vlume 4, Issue 2, 20 26-223 JCC Jurnal f CENTRU Cathedra Calitin Frmatin and Data Envelpment Analysis Rlf Färe Oregn State University, Crvallis, OR, USA Shawna Grsspf Oregn State

More information

5 th grade Common Core Standards

5 th grade Common Core Standards 5 th grade Cmmn Cre Standards In Grade 5, instructinal time shuld fcus n three critical areas: (1) develping fluency with additin and subtractin f fractins, and develping understanding f the multiplicatin

More information

CONSTRUCTING STATECHART DIAGRAMS

CONSTRUCTING STATECHART DIAGRAMS CONSTRUCTING STATECHART DIAGRAMS The fllwing checklist shws the necessary steps fr cnstructing the statechart diagrams f a class. Subsequently, we will explain the individual steps further. Checklist 4.6

More information

Exam #1. A. Answer any 1 of the following 2 questions. CEE 371 October 8, Please grade the following questions: 1 or 2

Exam #1. A. Answer any 1 of the following 2 questions. CEE 371 October 8, Please grade the following questions: 1 or 2 CEE 371 Octber 8, 2009 Exam #1 Clsed Bk, ne sheet f ntes allwed Please answer ne questin frm the first tw, ne frm the secnd tw and ne frm the last three. The ttal ptential number f pints is 100. Shw all

More information

Computational modeling techniques

Computational modeling techniques Cmputatinal mdeling techniques Lecture 11: Mdeling with systems f ODEs In Petre Department f IT, Ab Akademi http://www.users.ab.fi/ipetre/cmpmd/ Mdeling with differential equatins Mdeling strategy Fcus

More information

A Matrix Representation of Panel Data

A Matrix Representation of Panel Data web Extensin 6 Appendix 6.A A Matrix Representatin f Panel Data Panel data mdels cme in tw brad varieties, distinct intercept DGPs and errr cmpnent DGPs. his appendix presents matrix algebra representatins

More information