Chapter ML:III. Decision Trees (Stein/Lettmann). Contents: Decision Trees Basics, Impurity Functions, Decision Tree Algorithms, Decision Tree Pruning.



Splitting

Let t be a leaf node of an incomplete decision tree, and let D(t) be the subset of the example set D that is represented by t. [Illustration]

Possible criteria for a splitting of X(t):

1. Size of D(t). D(t) will not be partitioned further if the number of examples, |D(t)|, is below a certain threshold.
2. Purity of D(t). D(t) will not be partitioned further if all examples in D(t) are members of the same class.
3. Ockham's Razor. D(t) will not be partitioned further if the resulting decision tree is not improved significantly by the splitting.



Splitting (continued)

Let D be a set of examples over a feature space X and a set of classes C = {c_1, c_2, c_3, c_4}.

[Figure: distribution of D over the four classes c_1, ..., c_4 for two possible splittings (a) and (b) of X]

Splitting (a) minimizes the impurity of the subsets of D in the leaf nodes and should be preferred over splitting (b). This argument presumes that the misclassification costs are independent of the classes.

The impurity is a function defined on P(D), the set of all subsets of an example set D.

Definition 4 (Impurity Function ι)

Let k ∈ N. An impurity function ι : [0,1]^k → R is a partial function defined on the standard (k−1)-simplex, denoted Δ^{k−1}, for which the following properties hold:

(a) ι becomes minimum at the points (1, 0, ..., 0), (0, 1, ..., 0), ..., (0, ..., 0, 1).
(b) ι is symmetric with regard to its arguments p_1, ..., p_k.
(c) ι becomes maximum at the point (1/k, ..., 1/k).


Definition 5 (Impurity of an Example Set ι(D))

Let D be a set of examples, let C = {c_1, ..., c_k} be a set of classes, and let c : X → C be the ideal classifier for X. Moreover, let ι : [0,1]^k → R be an impurity function. Then the impurity of D, denoted as ι(D), is defined as follows:

    ι(D) = ι( |{(x, c(x)) ∈ D : c(x) = c_1}| / |D|, ..., |{(x, c(x)) ∈ D : c(x) = c_k}| / |D| )

Definition 6 (Impurity Reduction Δι)

Let D_1, ..., D_s be a partitioning of an example set D, which is induced by a splitting of a feature space X. Then the resulting impurity reduction, denoted as Δι(D, {D_1, ..., D_s}), is defined as follows:

    Δι(D, {D_1, ..., D_s}) = ι(D) − Σ_{j=1}^{s} (|D_j| / |D|) · ι(D_j)

Remarks:

The standard (k−1)-simplex comprises all k-tuples with non-negative elements that sum to 1:

    Δ^{k−1} = { (p_1, ..., p_k) ∈ R^k : Σ_{i=1}^{k} p_i = 1 and p_i ≥ 0 for all i }

Observe the different domains of the impurity function ι in Definitions 4 and 5, namely [0,1]^k and P(D). The domains correspond to each other: a set of examples defines via its class portions an element of [0,1]^k, and vice versa.

The properties in the definition of ι suggest minimizing the external path length of T with respect to D in order to minimize the overall impurity characteristics of T.

Within the DT-construct algorithm, usually a greedy strategy (local optimization) is employed to minimize the overall impurity characteristics of a decision tree T.


Impurity Functions Based on the Misclassification Rate

Definition for two classes:

    ι_misclass(p_1, p_2) = 1 − max{p_1, p_2} = { p_1 if 0 ≤ p_1 ≤ 1/2, 1 − p_1 otherwise }

    ι_misclass(D) = 1 − max{ |{(x, c(x)) ∈ D : c(x) = c_1}| / |D|, |{(x, c(x)) ∈ D : c(x) = c_2}| / |D| }

[Graph of the function ι_misclass(p_1, 1 − p_1); see also the corresponding entropy and Gini graphs]

Impurity Functions Based on the Misclassification Rate (continued)

Definition for k classes:

    ι_misclass(p_1, ..., p_k) = 1 − max_{i=1,...,k} p_i

    ι_misclass(D) = 1 − max_{c ∈ C} |{(x, c(x)) ∈ D : c(x) = c}| / |D|


Impurity Functions Based on the Misclassification Rate (continued)

Problems:

Δι_misclass = 0 may hold for all possible splittings.

The impurity function that is induced by the misclassification rate underestimates pure nodes, as illustrated in splitting (b):

[Figure: two splittings (a) and (b) of a two-class node, classes c_1 and c_2]

    Δι_misclass = ι_misclass(D) − ( (|D_1| / |D|) · ι_misclass(D_1) + (|D_2| / |D|) · ι_misclass(D_2) )

    left splitting (a):  Δι_misclass = 1/2 − 1/4 = 1/4
    right splitting (b): Δι_misclass = 1/2 − 1/4 = 1/4


Definition 7 (Strict Impurity Function)

Let ι : [0,1]^k → R be an impurity function and let p, p′ ∈ Δ^{k−1}. Then ι is called strict if it is strictly concave, i.e., if Property (c) is strengthened to:

(c′) ι(λp + (1 − λ)p′) > λ·ι(p) + (1 − λ)·ι(p′), for 0 < λ < 1 and p ≠ p′

Lemma 8

Let ι be a strict impurity function and let D_1, ..., D_s be a partitioning of an example set D, which is induced by a splitting of a feature space X. Then the following inequality holds:

    Δι(D, {D_1, ..., D_s}) ≥ 0

Equality holds iff for all i ∈ {1, ..., k} and j ∈ {1, ..., s}:

    |{(x, c(x)) ∈ D : c(x) = c_i}| / |D| = |{(x, c(x)) ∈ D_j : c(x) = c_i}| / |D_j|

Remarks:

Strict concavity entails Property (c) of the impurity function definition.

For two classes, strict concavity implies ι(p_1, 1 − p_1) > 0 for 0 < p_1 < 1.

If ι is a twice differentiable function, a negative definite Hessian of ι entails strict concavity.

With properly chosen coefficients, polynomials of second degree fulfill the properties (a) and (b) of the impurity function definition as well as strict concavity. See the impurity functions based on the Gini index in this regard.

The impurity function that is induced by the misclassification rate is concave, but it is not strictly concave.

The proof of Lemma 8 exploits the strict concavity of ι.

Impurity Functions Based on Entropy

Definition 9 (Entropy)

Let A denote an event and let P(A) denote the occurrence probability of A. Then the entropy (self-information, information content) of A is defined as −log₂(P(A)).

Let A be an experiment with the exclusive outcomes (events) A_1, ..., A_k. Then the mean information content of A, denoted as H(A), is called Shannon entropy or entropy of experiment A and is defined as follows:

    H(A) = −Σ_{i=1}^{k} P(A_i) · log₂(P(A_i))

Remarks:

The smaller the occurrence probability of an event, the larger is its entropy. An event that is certain has zero entropy.

The Shannon entropy combines the entropies of an experiment's outcomes, using the outcome probabilities as weights.

In the entropy definition we stipulate the identity 0 · log₂(0) = 0.

Impurity Functions Based on Entropy (continued)

Definition 10 (Conditional Entropy, Information Gain)

Let A be an experiment with the exclusive outcomes (events) A_1, ..., A_k, and let B be another experiment with the outcomes B_1, ..., B_s. Then the conditional entropy of the combined experiment (A, B) is defined as follows:

    H(A | B) = Σ_{j=1}^{s} P(B_j) · H(A | B_j),  where  H(A | B_j) = −Σ_{i=1}^{k} P(A_i | B_j) · log₂(P(A_i | B_j))

The information gain due to experiment B is defined as follows:

    H(A) − H(A | B) = H(A) − Σ_{j=1}^{s} P(B_j) · H(A | B_j)



Remarks [Bayes for classification]:

Information gain is defined as a reduction in entropy.

In the context of decision trees, experiment A corresponds to classifying a feature vector x with regard to the target concept. A possible question whose answer informs us about which event A_i occurred is the following: Does x belong to class c_i?

Likewise, experiment B corresponds to evaluating feature B of the feature vector x. A possible question whose answer informs us about which event B_j occurred is the following: Does x have value b_j for feature B?

Rationale: Typically, the events "target concept class" and "feature value" are statistically dependent. Hence, the entropy of the event c(x) becomes smaller if we learn the value of some feature of x (recall that the class of x is unknown). We experience an information gain with regard to the outcome of experiment A, which is rooted in our information about the outcome of experiment B.

The information gain is never negative; it is zero if the involved events are independent, i.e., P(A_i) = P(A_i | B_j) for all i ∈ {1, ..., k} and j ∈ {1, ..., s}, which leads to a split of the kind characterized as the special case in Lemma 8.

Remarks (continued):

Since H(A) is constant, the feature that provides the maximum information gain (i.e., the maximally informative feature) is found by minimizing H(A | B). The expanded form of H(A | B) reads as follows:

    H(A | B) = −Σ_{j=1}^{s} P(B_j) · Σ_{i=1}^{k} P(A_i | B_j) · log₂(P(A_i | B_j))


Impurity Functions Based on Entropy (continued)

Definition for two classes:

    ι_entropy(p_1, p_2) = −(p_1 · log₂(p_1) + p_2 · log₂(p_2))

    ι_entropy(D) = −( (|{(x, c(x)) ∈ D : c(x) = c_1}| / |D|) · log₂(|{(x, c(x)) ∈ D : c(x) = c_1}| / |D|)
                    + (|{(x, c(x)) ∈ D : c(x) = c_2}| / |D|) · log₂(|{(x, c(x)) ∈ D : c(x) = c_2}| / |D|) )

[Graph of the function ι_entropy(p_1, 1 − p_1); see also the corresponding misclassification and Gini graphs]

Impurity Functions Based on Entropy (continued)

[Graph of the function ι_entropy(p_1, p_2, 1 − p_1 − p_2) as a surface over p_1 and p_2]

Impurity Functions Based on Entropy (continued)

Definition for k classes:

    ι_entropy(p_1, ..., p_k) = −Σ_{i=1}^{k} p_i · log₂(p_i)

    ι_entropy(D) = −Σ_{i=1}^{k} (|{(x, c(x)) ∈ D : c(x) = c_i}| / |D|) · log₂(|{(x, c(x)) ∈ D : c(x) = c_i}| / |D|)


Impurity Functions Based on Entropy (continued)

Δι_entropy corresponds to the information gain H(A) − H(A | B):

    Δι_entropy = ι_entropy(D) − Σ_{j=1}^{s} (|D_j| / |D|) · ι_entropy(D_j)

where ι_entropy(D) equals H(A) and the weighted sum equals H(A | B).

Legend:

A_i, i = 1, ..., k, denotes the event that x ∈ X(t) belongs to class c_i. The experiment A corresponds to the classification c : X(t) → C.

B_j, j = 1, ..., s, denotes the event that x ∈ X(t) has value b_j for feature B. The experiment B corresponds to evaluating feature B and entails the following splitting:

    X(t) = X(t_1) ∪ ... ∪ X(t_s) = {x ∈ X(t) : x|_B = b_1} ∪ ... ∪ {x ∈ X(t) : x|_B = b_s}

ι_entropy(D) = ι_entropy(P(A_1), ..., P(A_k)) = −Σ_{i=1}^{k} P(A_i) · log₂(P(A_i)) = H(A)

(|D_j| / |D|) · ι_entropy(D_j) = P(B_j) · ι_entropy(P(A_1 | B_j), ..., P(A_k | B_j)), j = 1, ..., s

P(A_i), P(B_j), and P(A_i | B_j) are estimated as relative frequencies based on D.


Impurity Functions Based on the Gini Index

Definition for two classes:

    ι_Gini(p_1, p_2) = 1 − (p_1² + p_2²) = 2 · p_1 · p_2

    ι_Gini(D) = 2 · (|{(x, c(x)) ∈ D : c(x) = c_1}| / |D|) · (|{(x, c(x)) ∈ D : c(x) = c_2}| / |D|)

[Graph of the function ι_Gini(p_1, 1 − p_1); see also the corresponding misclassification and entropy graphs]

Impurity Functions Based on the Gini Index (continued)

Definition for k classes:

    ι_Gini(p_1, ..., p_k) = 1 − Σ_{i=1}^{k} p_i²

    ι_Gini(D) = 1 − Σ_{i=1}^{k} (|{(x, c(x)) ∈ D : c(x) = c_i}| / |D|)²
