2018 CS420, Machine Learning, Lecture 5. Tree Models. Weinan Zhang Shanghai Jiao Tong University
2 ML Task: Function Approximation
Problem setting:
- Instance feature space $\mathcal{X}$; instance label space $\mathcal{Y}$
- Unknown underlying function (target) $f: \mathcal{X} \to \mathcal{Y}$
- Set of function hypotheses $H = \{h \mid h: \mathcal{X} \to \mathcal{Y}\}$
- Input: training data generated from the unknown $f$: $\{(x^{(i)}, y^{(i)})\} = \{(x^{(1)}, y^{(1)}), \ldots, (x^{(n)}, y^{(n)})\}$
- Output: a hypothesis $h \in H$ that best approximates $f$
Optimize in functional space, not just parameter space.
3 Optimize in Functional Space
Tree models: intermediate nodes split the data; leaf nodes predict the label.
Continuous data example (features $x_1, x_2$ split by thresholds $a_1, a_2, a_3$):
- Root node: is $x_1 < a_1$?
- Yes → intermediate node: is $x_2 < a_2$? Yes → leaf $y = -1$; No → leaf $y = 1$
- No → intermediate node: is $x_2 < a_3$? Yes → leaf $y = 1$; No → leaf $y = -1$
(The accompanying figure shows the resulting axis-parallel partition of the $(x_1, x_2)$ plane into Class 1 and Class 2 regions.)
4 Optimize in Functional Space
Tree models: intermediate nodes split the data; leaf nodes predict the label.
Discrete/categorical data example:
- Root node: Outlook
- Sunny → intermediate node Humidity: High → leaf $y = -1$; Normal → leaf $y = 1$
- Overcast → leaf $y = 1$
- Rain → intermediate node Wind: Strong → leaf $y = -1$; Weak → leaf $y = 1$
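As a minimal sketch (not from the lecture — the nested-dict layout and the `predict` helper are illustrative assumptions), the discrete tree above can be represented and evaluated like this:

```python
# A minimal sketch of the discrete tree above as nested dicts.
# Feature and branch names follow the slide; the representation is illustrative.
tree = {
    "feature": "Outlook",
    "branches": {
        "Sunny": {"feature": "Humidity",
                  "branches": {"High": -1, "Normal": +1}},
        "Overcast": +1,
        "Rain": {"feature": "Wind",
                 "branches": {"Strong": -1, "Weak": +1}},
    },
}

def predict(node, x):
    """Walk from the root to a leaf; leaves store the labels y = +/-1."""
    while isinstance(node, dict):
        node = node["branches"][x[node["feature"]]]
    return node

print(predict(tree, {"Outlook": "Sunny", "Humidity": "High", "Wind": "Weak"}))  # -1
```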
5 Decision Tree Learning
Problem setting:
- Instance feature space $\mathcal{X}$; instance label space $\mathcal{Y}$
- Unknown underlying function (target) $f: \mathcal{X} \to \mathcal{Y}$
- Set of function hypotheses $H = \{h \mid h: \mathcal{X} \to \mathcal{Y}\}$
- Input: training data generated from the unknown $f$: $\{(x^{(i)}, y^{(i)})\} = \{(x^{(1)}, y^{(1)}), \ldots, (x^{(n)}, y^{(n)})\}$
- Output: a hypothesis $h \in H$ that best approximates $f$
Here each hypothesis $h$ is a decision tree.
6 Decision Tree Decision Boundary
Decision trees divide the feature space into axis-parallel (hyper-)rectangles. Each rectangular region is labeled with one label or a probability distribution over labels. Slide credit: Eric Eaton
7 History of Decision-Tree Research
- Hunt and colleagues used exhaustive-search decision-tree methods (CLS) to model human concept learning in the 1960s.
- In the late 1970s, Quinlan developed ID3 with the information-gain heuristic to learn expert systems from examples.
- Simultaneously, Breiman, Friedman and colleagues developed CART (Classification and Regression Trees), similar to ID3.
- In the 1980s a variety of improvements were introduced to handle noise, continuous features, and missing features, along with improved splitting criteria. Various expert-system development tools resulted.
- Quinlan's updated decision-tree package (C4.5) was released in 1993.
- Modern toolkits ship these algorithms: Weka (Java) includes ID3 and C4.5 (as J48), and sklearn (Python) provides CART-style trees.
Slide credit: Raymond J. Mooney
8 Decision Trees
Tree models: intermediate nodes split the data; leaf nodes predict the label.
Key questions for decision trees:
- How to select node splitting conditions?
- How to make predictions?
- How to decide the tree structure?
9 Node Splitting
Which node splitting condition should we choose, e.g. Outlook (Sunny / Overcast / Rain) or Temperature (Hot / Mild / Cool)? Choose the feature with the higher classification capacity; quantitatively, the one with the higher information gain.
10 Fundamentals of Information Theory
Entropy (more specifically, Shannon entropy) is the expected value (average) of the information contained in each message. Suppose $X$ is a random variable with $n$ discrete values and $P(X = x_i) = p_i$; then its entropy is
$$H(X) = -\sum_{i=1}^n p_i \log p_i$$
It is easy to verify that
$$H(X) = -\sum_{i=1}^n p_i \log p_i \le -\sum_{i=1}^n \frac{1}{n} \log \frac{1}{n} = \log n$$
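A small sketch of the entropy formula above (the helper name and the choice of log base 2 are my assumptions; the slide leaves the base unspecified):

```python
import math
from collections import Counter

def entropy(labels):
    """H(X) = -sum_i p_i log2 p_i, estimated from a list of observed values."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# The uniform distribution over n values attains the upper bound log2(n):
print(entropy(["a", "b", "c", "d"]))  # 2.0 = log2(4)
print(entropy(["a", "a", "a", "b"]))  # ~0.811 < 1 = log2(2)
```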
11 Illustration of Entropy
Entropy of a binary distribution:
$$H(X) = -p_1 \log p_1 - (1 - p_1) \log(1 - p_1)$$
12 Cross Entropy
Cross entropy measures the difference between two random variable distributions:
$$H(X, Y) = -\sum_{i=1}^n P(X = i) \log P(Y = i)$$
Continuous formulation:
$$H(p, q) = -\int p(x) \log q(x) \, dx$$
Compared to KL divergence:
$$D_{KL}(p \| q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx = H(p, q) - H(p)$$
13 KL-Divergence
Kullback–Leibler divergence (also called relative entropy) is a measure of how one probability distribution diverges from a second, expected probability distribution:
$$D_{KL}(p \| q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx = H(p, q) - H(p)$$
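A discrete sketch of cross entropy and KL divergence as defined above (the function names are illustrative):

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i log q_i (discrete form of the slide's integral)."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """D_KL(p || q) = H(p, q) - H(p); note H(p, p) = H(p)."""
    return cross_entropy(p, q) - cross_entropy(p, p)

p, q = [0.5, 0.5], [0.9, 0.1]
print(kl_divergence(p, q))  # > 0; zero iff p == q
```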
14 Review: Cross Entropy in Logistic Regression
Logistic regression is a binary classification model:
$$p_\theta(y = 1 \mid x) = \sigma(\theta^\top x) = \frac{e^{\theta^\top x}}{1 + e^{\theta^\top x}}, \qquad p_\theta(y = 0 \mid x) = \frac{1}{1 + e^{\theta^\top x}}$$
Cross-entropy loss function:
$$L(y, x; p_\theta) = -y \log \sigma(\theta^\top x) - (1 - y) \log(1 - \sigma(\theta^\top x))$$
Gradient (with $z = \theta^\top x$ and $\sigma'(z) = \sigma(z)(1 - \sigma(z))$):
$$\frac{\partial L(y, x; p_\theta)}{\partial \theta} = -y \frac{1}{\sigma(\theta^\top x)} \sigma(z)(1 - \sigma(z)) x + (1 - y) \frac{1}{1 - \sigma(\theta^\top x)} \sigma(z)(1 - \sigma(z)) x = (\sigma(\theta^\top x) - y) x$$
Update rule: $\theta \leftarrow \theta + (y - \sigma(\theta^\top x)) x$
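A sketch of the resulting update rule (the learning-rate parameter `lr` is my assumption; the slide writes the update without one):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sgd_step(theta, x, y, lr=0.1):
    """One update theta <- theta + lr * (y - sigma(theta^T x)) * x,
    i.e. gradient descent on the cross-entropy loss above."""
    return theta + lr * (y - sigmoid(theta @ x)) * x

# Toy usage: repeatedly fitting a single positive example drives the
# predicted probability toward 1.
theta = np.zeros(2)
for _ in range(100):
    theta = sgd_step(theta, np.array([1.0, 2.0]), 1)
print(sigmoid(theta @ np.array([1.0, 2.0])))  # approaches 1
```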
15 Conditional Entropy
Entropy: $H(X) = -\sum_{i=1}^n P(X = i) \log P(X = i)$
Specific conditional entropy of $X$ given $Y = v$:
$$H(X \mid Y = v) = -\sum_{i=1}^n P(X = i \mid Y = v) \log P(X = i \mid Y = v)$$
Conditional entropy of $X$ given $Y$:
$$H(X \mid Y) = \sum_{v \in \text{values}(Y)} P(Y = v) H(X \mid Y = v)$$
Information gain (mutual information) of $X$ given $Y$:
$$I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X, Y)$$
Here $H(X, Y)$ is the entropy of the joint distribution $(X, Y)$, not the cross entropy.
16 Information Gain
Information gain (mutual information) of $X$ given $Y$:
$$\begin{aligned}
I(X; Y) &= H(X) - H(X \mid Y) \\
&= -\sum_v P(X = v) \log P(X = v) + \sum_u P(Y = u) \sum_v P(X = v \mid Y = u) \log P(X = v \mid Y = u) \\
&= -\sum_v P(X = v) \log P(X = v) + \sum_{u,v} P(X = v, Y = u) \log P(X = v \mid Y = u) \\
&= -\sum_v P(X = v) \log P(X = v) + \sum_{u,v} P(X = v, Y = u) \left[\log P(X = v, Y = u) - \log P(Y = u)\right] \\
&= -\sum_v P(X = v) \log P(X = v) - \sum_v P(Y = v) \log P(Y = v) + \sum_{u,v} P(X = v, Y = u) \log P(X = v, Y = u) \\
&= H(X) + H(Y) - H(X, Y)
\end{aligned}$$
Again, $H(X, Y)$ is the joint entropy of $(X, Y)$, not the cross entropy.
17 Node Splitting
Information gain uses
$$H(X \mid Y = v) = -\sum_{i=1}^n P(X = i \mid Y = v) \log P(X = i \mid Y = v), \qquad H(X \mid Y) = \sum_{v \in \text{values}(Y)} P(Y = v) H(X \mid Y = v)$$
Splitting on Outlook (Sunny / Overcast / Rain):
- $H(X \mid Y = S) = -\frac{3}{5}\log\frac{3}{5} - \frac{2}{5}\log\frac{2}{5} = 0.9710$
- $H(X \mid Y = O) = -\frac{4}{4}\log\frac{4}{4} = 0$
- $H(X \mid Y = R) = -\frac{4}{5}\log\frac{4}{5} - \frac{1}{5}\log\frac{1}{5} = 0.7219$
- $H(X \mid Y) = \frac{5}{14} \cdot 0.9710 + \frac{4}{14} \cdot 0 + \frac{5}{14} \cdot 0.7219 = 0.6046$
- $I(X; Y) = H(X) - H(X \mid Y) = 1 - 0.6046 = 0.3954$
Splitting on Temperature (Hot / Mild / Cool):
- $H(X \mid Y = H) = -\frac{2}{4}\log\frac{2}{4} - \frac{2}{4}\log\frac{2}{4} = 1$
- $H(X \mid Y = M) = -\frac{1}{4}\log\frac{1}{4} - \frac{3}{4}\log\frac{3}{4} = 0.8113$
- $H(X \mid Y = C) = -\frac{4}{6}\log\frac{4}{6} - \frac{2}{6}\log\frac{2}{6} = 0.9183$
- $H(X \mid Y) = \frac{4}{14} \cdot 1 + \frac{4}{14} \cdot 0.8113 + \frac{6}{14} \cdot 0.9183 = 0.9111$
- $I(X; Y) = H(X) - H(X \mid Y) = 1 - 0.9111 = 0.0889$
So Outlook is the better split.
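A quick check of the slide's arithmetic, assuming $H(X) = 1$ (balanced classes) as the slide does; the per-branch (positive, negative) counts are read off the slide:

```python
import math

def H(counts):
    """Entropy of a class-count tuple."""
    n = sum(counts)
    return -sum(c / n * math.log2(c / n) for c in counts if c > 0)

# Per-branch (positive, negative) class counts from the slide.
outlook = {"Sunny": (3, 2), "Overcast": (4, 0), "Rain": (4, 1)}
temperature = {"Hot": (2, 2), "Mild": (1, 3), "Cool": (4, 2)}

def cond_entropy(branches):
    """H(X|Y): branch-size-weighted average of branch entropies."""
    n = sum(sum(c) for c in branches.values())
    return sum(sum(c) / n * H(c) for c in branches.values())

for name, b in [("Outlook", outlook), ("Temperature", temperature)]:
    print(name, 1 - cond_entropy(b))  # information gains: 0.3954 vs 0.0889
```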
18 Information Gain Ratio
The ratio between the information gain and the split entropy:
$$I_R(X; Y) = \frac{I(X; Y)}{H_Y(X)} = \frac{H(X) - H(X \mid Y)}{H_Y(X)}$$
where the entropy of the split (by the values of $Y$) is
$$H_Y(X) = -\sum_{v \in \text{values}(Y)} \frac{|X_{y=v}|}{|X|} \log \frac{|X_{y=v}|}{|X|}$$
and $|X_{y=v}|$ is the number of observations with feature value $y = v$.
19 Node Splitting
Information gain ratio: $I_R(X; Y) = \frac{I(X; Y)}{H_Y(X)} = \frac{H(X) - H(X \mid Y)}{H_Y(X)}$
Outlook (Sunny / Overcast / Rain):
- $I(X; Y) = H(X) - H(X \mid Y) = 1 - 0.6046 = 0.3954$
- $H_Y(X) = -\frac{5}{14}\log\frac{5}{14} - \frac{4}{14}\log\frac{4}{14} - \frac{5}{14}\log\frac{5}{14} = 1.5774$
- $I_R(X; Y) = \frac{0.3954}{1.5774} = 0.2507$
Temperature (Hot / Mild / Cool):
- $I(X; Y) = H(X) - H(X \mid Y) = 1 - 0.9111 = 0.0889$
- $H_Y(X) = -\frac{4}{14}\log\frac{4}{14} - \frac{4}{14}\log\frac{4}{14} - \frac{6}{14}\log\frac{6}{14} = 1.5567$
- $I_R(X; Y) = \frac{0.0889}{1.5567} = 0.0571$
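The gain-ratio denominators can be checked the same way; `split_entropy` is an illustrative helper, and the branch sizes (5/4/5 and 4/4/6 out of 14) come from the slide:

```python
import math

def split_entropy(sizes):
    """H_Y(X): entropy of the split proportions themselves."""
    n = sum(sizes)
    return -sum(s / n * math.log2(s / n) for s in sizes)

print(0.3954 / split_entropy([5, 4, 5]))  # ~0.2507 for Outlook
print(0.0889 / split_entropy([4, 4, 6]))  # ~0.0571 for Temperature
```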
20 Decision Tree Building: ID3 Algorithm
Algorithm framework (a code sketch follows below):
- Start from the root node with all the data.
- For each node, calculate the information gain of all possible features.
- Choose the feature with the highest information gain.
- Split the data of the node according to that feature.
- Recurse on each child node, until there is no information gain for the node, or there is no feature left to select.
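A compact sketch of this framework, assuming instances are dicts of categorical features; all names and data structures are illustrative, not from the lecture:

```python
import math
from collections import Counter

def entropy(ys):
    n = len(ys)
    return -sum(c / n * math.log2(c / n) for c in Counter(ys).values())

def info_gain(rows, ys, f):
    """I(X; Y) = H(X) - H(X|Y) for splitting on feature f."""
    n = len(ys)
    split = {}
    for row, y in zip(rows, ys):
        split.setdefault(row[f], []).append(y)
    return entropy(ys) - sum(len(s) / n * entropy(s) for s in split.values())

def id3(rows, ys, features):
    """Recursive ID3 sketch: one branch per category of the chosen feature."""
    if len(set(ys)) == 1 or not features:        # pure node, or no feature left
        return Counter(ys).most_common(1)[0][0]
    gains = {f: info_gain(rows, ys, f) for f in features}
    best = max(gains, key=gains.get)
    if gains[best] <= 0:                         # no information gain: stop
        return Counter(ys).most_common(1)[0][0]
    node = {"feature": best, "branches": {}}
    for v in set(r[best] for r in rows):
        sub = [(r, y) for r, y in zip(rows, ys) if r[best] == v]
        node["branches"][v] = id3([r for r, _ in sub], [y for _, y in sub],
                                  [f for f in features if f != best])
    return node
```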
21 Decision Tree Building: ID3 Algorithm
An example decision tree from ID3: root Outlook (Sunny / Overcast / Rain), with a child Temperature node (Hot / Mild / Cool). Each root-to-leaf path involves a feature at most once.
22 Decision Tree Building: ID3 Algorithm
An example decision tree from ID3: root Outlook (Sunny / Overcast / Rain), with child nodes Temperature (Hot / Mild / Cool) and Wind (Strong / Weak). How about this tree, which yields a perfect partition of the training data?
23 Overfitting
A tree model can approximate any finite dataset by just growing a leaf node for each instance, as in the Outlook / Temperature / Wind tree above when grown until every training example is isolated.
24 Decision Tree Training Objective
Cost function of a tree $T$ over the training data:
$$C(T) = \sum_{t=1}^{|T|} N_t H_t(T)$$
where, for leaf node $t$:
- $H_t(T)$ is the empirical entropy: $H_t(T) = -\sum_k \frac{N_{tk}}{N_t} \log \frac{N_{tk}}{N_t}$
- $N_t$ is the instance number, and $N_{tk}$ is the instance number of class $k$
Training objective: find the tree that minimizes the cost:
$$\min_T C(T) = \sum_{t=1}^{|T|} N_t H_t(T)$$
25 Decision Tree Regularization
Cost function over the training data:
$$C(T) = \sum_{t=1}^{|T|} N_t H_t(T) + \lambda |T|$$
where $|T|$ is the number of leaf nodes of the tree $T$ and $\lambda$ is the regularization hyperparameter.
26 Decision Tree Building: ID3 Algorithm
In the example tree (root Outlook, children Temperature and Wind), whether to split a node further? Calculate the difference in the cost function
$$C(T) = \sum_{t=1}^{|T|} N_t H_t(T) + \lambda |T|$$
before and after the split (a numeric sketch follows below).
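A numeric sketch of that pruning test; the class counts and $\lambda = 2$ are made-up illustration values:

```python
import math

def leaf_entropy(class_counts):
    """Empirical entropy H_t(T) of one leaf's class counts."""
    n = sum(class_counts)
    return -sum(c / n * math.log2(c / n) for c in class_counts if c > 0)

def tree_cost(leaves, lam):
    """C(T) = sum_t N_t H_t(T) + lambda * |T|; one class-count tuple per leaf."""
    return sum(sum(c) * leaf_entropy(c) for c in leaves) + lam * len(leaves)

# Keep a split only if it lowers the cost:
before = tree_cost([(6, 4)], lam=2.0)           # unsplit node, one leaf
after = tree_cost([(5, 1), (1, 3)], lam=2.0)    # node split into two leaves
print(after < before)                            # True: this split pays off
```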
27 Summary of ID3
- A classic and straightforward algorithm for training decision trees.
- Works on discrete/categorical data, with one branch for each value/category of the feature.
- The C4.5 algorithm is similar to, and more advanced than, ID3: it splits nodes according to the information gain ratio. The branching factor still depends on the number of distinct categorical values of the feature, which might lead to a very broad tree.
28 CART Algorithm
Classification and Regression Tree (CART), proposed by Leo Breiman et al. in 1984:
- Binary splitting (yes or no for the splitting condition)
- Can work on continuous/numeric features
- Can repeatedly use the same feature (with different splitting thresholds)
Example structure: Condition 1 (Yes → Condition 2, No → Prediction 3); Condition 2 (Yes → Prediction 1, No → Prediction 2).
29 CART Algorithm
- Regression tree: outputs a predicted value. Example: predict the user's rating of a movie (Age > 20? Yes → Gender = Male? Yes → 4.8, No → 4.1; No → 2.8).
- Classification tree: outputs a predicted class. Example: predict whether the user likes a movie (Age > 20? Yes → Gender = Male? Yes → like, No → dislike; No → dislike).
30 Regression Tree
Let the training dataset with continuous targets $y$ be $D = \{(x_1, y_1), (x_2, y_2), \ldots, (x_N, y_N)\}$. Suppose a regression tree has divided the space into $M$ regions $R_1, R_2, \ldots, R_M$, with $c_m$ as the prediction for region $R_m$:
$$f(x) = \sum_{m=1}^M c_m I(x \in R_m)$$
With the loss function $\frac{1}{2}(y_i - f(x_i))^2$ for each $(x_i, y_i)$, it is easy to see that the optimal prediction for region $m$ is the average of its targets:
$$\hat{c}_m = \text{avg}(y_i \mid x_i \in R_m)$$
31 Regression Tree
How to find the optimal splitting regions and conditions? A split is defined by a threshold value $s$ on variable $j$, leading to two regions:
$$R_1(j, s) = \{x \mid x^{(j)} \le s\}, \qquad R_2(j, s) = \{x \mid x^{(j)} > s\}$$
Training selects the best split for the current node:
$$\min_{j,s} \Big[ \min_{c_1} \sum_{x_i \in R_1(j,s)} (y_i - c_1)^2 + \min_{c_2} \sum_{x_i \in R_2(j,s)} (y_i - c_2)^2 \Big], \qquad \hat{c}_m = \text{avg}(y_i \mid x_i \in R_m)$$
32 Regression Tree Algorithm
INPUT: training data $D$; OUTPUT: regression tree $f(x)$.
Repeat until a stop condition is satisfied:
- Find the optimal split $(j, s)$: $\min_{j,s} \big[ \min_{c_1} \sum_{x_i \in R_1(j,s)} (y_i - c_1)^2 + \min_{c_2} \sum_{x_i \in R_2(j,s)} (y_i - c_2)^2 \big]$
- Calculate the prediction values of the new regions $R_1, R_2$: $\hat{c}_m = \text{avg}(y_i \mid x_i \in R_m)$
Return the regression tree $f(x) = \sum_{m=1}^M \hat{c}_m I(x \in R_m)$.
33 Regression Tree Algorithm
How to efficiently find the optimal split $(j, s)$? Sort the data in ascending order of the feature-$j$ value; each gap between consecutive sorted values $y_1, y_2, \ldots, y_{12}$ is a candidate threshold $s$. For the split between positions 6 and 7:
$$\text{loss} = \sum_{i=1}^{6} (y_i - c_1)^2 + \sum_{i=7}^{12} (y_i - c_2)^2 = \sum_{i=1}^{12} y_i^2 - \frac{1}{6}\Big(\sum_{i=1}^{6} y_i\Big)^2 - \frac{1}{6}\Big(\sum_{i=7}^{12} y_i\Big)^2 = -\frac{1}{6}\Big(\sum_{i=1}^{6} y_i\Big)^2 - \frac{1}{6}\Big(\sum_{i=7}^{12} y_i\Big)^2 + C$$
where $C = \sum_{i=1}^{12} y_i^2$ is constant across splits, and the two region sums can be updated online.
34 Regression Tree Algorithm
With the data sorted by feature $j$, the loss for the threshold between positions 6 and 7 is
$$\text{loss}_{6,7} = -\frac{1}{6}\Big(\sum_{i=1}^{6} y_i\Big)^2 - \frac{1}{6}\Big(\sum_{i=7}^{12} y_i\Big)^2 + C$$
35 Regression Tree Algorithm
Moving the threshold one position only changes the two region sums:
$$\text{loss}_{6,7} = -\frac{1}{6}\Big(\sum_{i=1}^{6} y_i\Big)^2 - \frac{1}{6}\Big(\sum_{i=7}^{12} y_i\Big)^2 + C, \qquad \text{loss}_{7,8} = -\frac{1}{7}\Big(\sum_{i=1}^{7} y_i\Big)^2 - \frac{1}{5}\Big(\sum_{i=8}^{12} y_i\Big)^2 + C$$
Maintain $\text{Sum}(R_1) = \sum_{i=1}^{k} y_i$ and $\text{Sum}(R_2) = \sum_{i=k+1}^{n} y_i$ and update them online in $O(1)$ time, giving $O(n)$ in total for checking one feature (see the sketch below).
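A sketch of this $O(n)$ scan for one feature after an $O(n \log n)$ sort; function and variable names are illustrative:

```python
def best_split_one_feature(xs, ys):
    """Scan all thresholds on one feature in O(n) after sorting,
    maintaining the two region sums incrementally as on the slide."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    xs = [xs[i] for i in order]; ys = [ys[i] for i in order]
    total = sum(ys)
    left, n = 0.0, len(ys)
    best = (float("inf"), None)
    for k in range(1, n):                 # split between positions k-1 and k
        left += ys[k - 1]                 # O(1) online update of Sum(R1)
        if xs[k - 1] == xs[k]:            # cannot split between equal values
            continue
        # loss = -Sum(R1)^2/|R1| - Sum(R2)^2/|R2| + C  (constant C dropped)
        loss = -(left ** 2) / k - ((total - left) ** 2) / (n - k)
        if loss < best[0]:
            best = (loss, (xs[k - 1] + xs[k]) / 2)
    return best[1]

print(best_split_one_feature([1, 2, 3, 10, 11, 12], [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]))
# -> 6.5, separating the low-target cluster from the high-target cluster
```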
36 Classification Tree
The training dataset with categorical targets $y$: $D = \{(x_1, y_1), (x_2, y_2), \ldots, (x_N, y_N)\}$. Suppose a classification tree has divided the space into $M$ regions $R_1, R_2, \ldots, R_M$, with $c_m$ as the prediction for region $R_m$:
$$f(x) = \sum_{m=1}^M c_m I(x \in R_m)$$
Here the leaf-node prediction $c_m$ is the category distribution $\hat{c}_m = \{P(y_k \mid x_i \in R_m)\}_{k=1 \ldots K}$, solved by counting categories:
$$P(y_k \mid x_i \in R_m) = \frac{C_m^k}{C_m} = \frac{\#\text{instances in leaf } m \text{ with category } k}{\#\text{instances in leaf } m}$$
37 Classification Tree
How to find the optimal splitting regions and conditions?
- For a continuous feature $j$, a split is defined by a threshold value $s$, yielding $R_1(j, s) = \{x \mid x^{(j)} \le s\}$ and $R_2(j, s) = \{x \mid x^{(j)} > s\}$.
- For a categorical feature $j$, select a category $a$, yielding $R_1(j, a) = \{x \mid x^{(j)} = a\}$ and $R_2(j, a) = \{x \mid x^{(j)} \ne a\}$.
How to select the split? Minimize the Gini impurity.
38 Gini Impurity
In a classification problem with $K$ classes, let $p_k$ be the probability of an instance having class $k$. The Gini impurity index is
$$\text{Gini}(p) = \sum_{k=1}^K p_k (1 - p_k) = 1 - \sum_{k=1}^K p_k^2$$
Given the training dataset $D$, the empirical Gini impurity is
$$\text{Gini}(D) = 1 - \sum_{k=1}^K \Big( \frac{|D_k|}{|D|} \Big)^2$$
where $|D_k|$ is the number of instances in $D$ with category $k$ and $|D|$ is the total number of instances.
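A one-function sketch of the empirical Gini impurity above (the helper name is illustrative):

```python
from collections import Counter

def gini(labels):
    """Gini(D) = 1 - sum_k (|D_k| / |D|)^2."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

print(gini([1, 1, 0, 0]))  # 0.5, the maximum for two classes
print(gini([1, 1, 1, 1]))  # 0.0 for a pure node
```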
39 Gini Impurity
For a binary classification problem, let $p$ be the probability of an instance having class 1. The Gini impurity is $\text{Gini}(p) = 2p(1 - p)$ and the entropy is $H(p) = -p \log p - (1 - p) \log(1 - p)$. Gini impurity and entropy are quite similar as measures of classification error rate.
40 Gini Impurity
With a categorical feature $j$ and one of its categories $a$, the two split regions are
$$R_1(j, a) = \{x \mid x^{(j)} = a\}, \qquad R_2(j, a) = \{x \mid x^{(j)} \ne a\}$$
with corresponding data subsets $D_1^j = \{(x, y) \mid x^{(j)} = a\}$ and $D_2^j = \{(x, y) \mid x^{(j)} \ne a\}$. The Gini impurity of feature $j$ with selected category $a$ is
$$\text{Gini}(D^j; j = a) = \frac{|D_1^j|}{|D^j|} \text{Gini}(D_1^j) + \frac{|D_2^j|}{|D^j|} \text{Gini}(D_2^j)$$
41 Classification Tree Algorithm
INPUT: training data $D$; OUTPUT: classification tree $f(x)$.
Repeat until a stop condition is satisfied (the node instance number is small, the Gini impurity is small, or there are no more features):
- Find the optimal split $(j, a)$: $\min_{j,a} \text{Gini}(D^j; j = a)$
- Calculate the prediction distributions of the new regions $R_1, R_2$: $\hat{c}_m = \{P(y_k \mid x_i \in R_m)\}_{k=1 \ldots K}$
Return the classification tree $f(x) = \sum_{m=1}^M \hat{c}_m I(x \in R_m)$. A split-selection sketch follows below.
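A sketch of the $\min_{j,a} \text{Gini}(D^j; j = a)$ search for categorical features; all helper names are illustrative:

```python
from collections import Counter

def gini_counts(counts):
    """Gini(D) = 1 - sum_k (|D_k| / |D|)^2 from a Counter of class labels."""
    n = sum(counts.values())
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def weighted_gini(xs, ys, feature, a):
    """Gini(D^j; j=a): weighted impurity of the split x_j == a vs x_j != a."""
    d1 = Counter(y for x, y in zip(xs, ys) if x[feature] == a)
    d2 = Counter(y for x, y in zip(xs, ys) if x[feature] != a)
    n1, n2 = sum(d1.values()), sum(d2.values())
    if n1 == 0 or n2 == 0:
        return float("inf")               # degenerate split: one side is empty
    n = n1 + n2
    return (n1 / n) * gini_counts(d1) + (n2 / n) * gini_counts(d2)

def best_cart_split(xs, ys, features):
    """Scan every (feature, category) pair and return the argmin-Gini split."""
    candidates = [(f, a) for f in features for a in set(x[f] for x in xs)]
    return min(candidates, key=lambda fa: weighted_gini(xs, ys, *fa))
```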
42 Classification Tree Output
Class label output: output the class with the highest conditional probability,
$$f(x) = \arg\max_{y_k} \sum_{m=1}^M I(x \in R_m) P(y_k \mid x_i \in R_m)$$
Probabilistic distribution output:
$$f(x) = \sum_{m=1}^M \hat{c}_m I(x \in R_m), \qquad \hat{c}_m = \{P(y_k \mid x_i \in R_m)\}_{k=1 \ldots K}$$
43 Converting a Tree to Rules
For example, the regression tree from slide 29 that predicts the user's rating of a movie (Age > 20? Yes → Gender = Male? Yes → 4.8, No → 4.1; No → 2.8) converts to:

IF Age > 20:
    IF Gender == Male: return 4.8
    ELSE: return 4.1
ELSE: return 2.8

Decision tree models are easy to visualize, explain and debug.
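For comparison, scikit-learn (whose trees are CART) ships a similar conversion; this usage sketch assumes the standard `export_text` API on a small fitted tree:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=2).fit(X, y)
# export_text prints the learned tree as nested IF/ELSE rules like the slide's.
print(export_text(clf, feature_names=load_iris().feature_names))
```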
44 Learning Model Comparison [Table 10.3 from Hastie et al. Elements of Statistical Learning, 2nd Edition]