Administrative notes Labs this week: project time. Remember, you need to pass the project in order to pass the course! (See the course syllabus.) Clicker grades should be online now.
Administrative notes March 3: Data mining reading quiz; March 14: Midterm 2; March 17: In the News call #3; March 30: Project deliverables and individual report due
Data mining: finding patterns in data Part 1: Building decision tree classifiers from data
Learning goals [CT Building Block] Students will be able to build a simple decision tree [CT Building Block] Students will be able to describe what considerations are important in building a decision tree
Why data mining? The world is awash with digital data; trillions of gigabytes and growing How many bytes in a gigabyte? Clicker question A. 1 000 000 B. 1 000 000 000 C. 1 000 000 000 000
Why data mining? The world is awash with digital data; trillions of gigabytes and growing A trillion gigabytes is a zettabyte, or 1 000 000 000 000 000 000 000 bytes
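As a quick sanity check of the unit arithmetic (a sketch, not from the slides):

```python
# A gigabyte is 10**9 bytes, so a trillion gigabytes is
# 10**12 * 10**9 = 10**21 bytes -- one zettabyte.
print(10**12 * 10**9 == 10**21)  # True
```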
Why data mining? More and more, businesses and institutions are using data mining to make decisions, classifications, diagnoses, and recommendations that affect our lives
Data mining for classification Recall our loan application example
Data mining for classification In the loan strategy example, we focused on the fairness of different classifiers, but we didn't focus much on how to build a classifier. Today you'll learn how to build decision tree classifiers for simple data mining scenarios.
A rooted tree in computer science Before we get to decision trees, we'll define what a tree is.
A rooted tree in computer science A collection of nodes such that: one node is the designated root; a node can have zero or more children (a node with zero children is a leaf); all non-root nodes have a single parent; edges denote parent-child relationships; and nodes and/or edges may be labeled with data.
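One common way to represent such a tree in code (an illustration, not from the slides) is a node object holding its data and a list of children:

```python
# A minimal rooted tree: each node stores a data label and its children.
class Node:
    def __init__(self, data, children=None):
        self.data = data
        self.children = children or []  # zero children means this is a leaf

# "root" is the designated root; it has one child "a",
# which in turn has two leaf children "b" and "c".
root = Node("root", [Node("a", [Node("b"), Node("c")])])
```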
A rooted tree in computer science Often but not always drawn with root on top
Are these rooted trees? Clicker question (two diagrams, 1 and 2) A. 1 but not 2 B. 2 but not 1 C. Both 1 and 2 D. Neither 1 nor 2 http://jerome.boulinguez.free.fr/english/file/hotpotatoes/familytree.htm
Is this a rooted tree? Clicker question A. Yes B. No C. I'm not sure (No: node 2 has two parents, and there's no unique root) http://jerome.boulinguez.free.fr/english/file/hotpotatoes/familytree.htm
Decision trees: trees whose node labels are attributes, edge labels are conditions https://gbr.pepperdine.edu/2010/08/how-gerber-used-adecision-tree-in-strategic-decision-making/
Decision trees: trees whose node labels are attributes, edge labels are conditions A decision tree for the max profit loan strategy:

colour?
├─ blue → credit rating? (> 61 → approve; < 61 → deny)
└─ orange → credit rating? (> 50 → approve; < 50 → deny)

(Note that some worthy applicants are denied loans, while other unworthy ones get loans.)
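Read as code, this tree is just a pair of nested decisions. A minimal sketch (assuming the slide's thresholds; the slide leaves scores exactly at the boundary unspecified):

```python
def loan_decision(colour, credit_rating):
    # Root node: split on colour
    if colour == "blue":
        # Blue branch: approve above a credit rating of 61
        return "approve" if credit_rating > 61 else "deny"
    else:  # orange
        # Orange branch: approve above a credit rating of 50
        return "approve" if credit_rating > 50 else "deny"
```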
Exercise: Construct the decision tree for the Group Unaware loan strategy
Building decision trees from training data Should you get an ice cream? You might start out with the following data. The column headers (Weather, Wallet) are attributes; the values they take on (Great, Okay, Nasty; Empty, Full) are conditions.

Weather  Wallet  Ice Cream?
Great    Empty   no
Nasty    Empty   no
Great    Full    yes
Okay     Full    yes
Nasty    Full    no

You might build a decision tree that looks like this:

Wallet?
├─ Empty → no
└─ Full → Weather?
         ├─ Nasty → no
         ├─ Okay → yes
         └─ Great → yes
Shall we play a game? Suppose we want to help a soccer league decide whether or not to cancel games. We have some data. Our goal is a decision tree to help officials make decisions. Assume that decisions are the same given the same information.

Outlook   Temperature  Humidity  Windy  Play?
sunny     hot          high      false  no
sunny     hot          high      true   no
overcast  hot          high      false  yes
rain      mild         high      false  yes
rain      cool         normal    false  yes
rain      cool         normal    true   no
overcast  cool         normal    true   yes
sunny     mild         high      false  no
sunny     cool         normal    false  yes
rain      mild         normal    false  yes
sunny     mild         normal    true   yes
overcast  mild         high      true   yes
overcast  hot          normal    false  yes
rain      mild         high      true   no

Example adapted from http://www.kdnuggets.com/data_mining_course/index.html#materials
Create a decision tree Group exercise Create a decision tree that decides whether the game should be played or not, using the data table above. The leaf nodes should be whether or not to play; the non-leaf nodes should be attributes (e.g., Outlook, Windy); the edges should be conditions (e.g., sunny, hot, normal).
Some example potential starts to the decision tree:
- Split on Outlook? first (branches: overcast, rainy, sunny), with Temperature?, Humidity?, or Windy? as possible next splits under each branch
- Split on Windy? first (branches: true, false), with Humidity? as a possible next split under each branch
How did you split up your tree and why? Student responses Looked at each condition of every attribute, case by case, until we got to the end, working left to right through the columns. There are ways that the tree can be made smaller, by finding patterns. E.g., if it's overcast, the answer is always yes.
Here's that example again Create a decision tree that decides whether the game should be played or not (same data table and requirements as above).
Deciding which nodes go where: A decision tree construction algorithm Top-down tree construction: at the start, all examples are at the root. Partition the examples recursively by choosing one attribute each time. In deciding which attribute to split on, one common method is to try to reduce entropy, i.e., each time you split, you should make the resulting groups more homogeneous. The more you reduce entropy, the higher the information gain.
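In code, this top-down procedure might look like the following minimal sketch (an illustration, not the slides' code). Rows are tuples whose last element is the yes/no answer, attributes are column indexes, and the simplistic entropy counts every example that sits in a mixed group:

```python
def group_by(rows, attr):
    """Partition rows by their value for the given attribute (column index)."""
    groups = {}
    for row in rows:
        groups.setdefault(row[attr], []).append(row)
    return groups

def mixed_count(rows, attr):
    """The slides' simple entropy: total size of groups that mix yes and no."""
    return sum(len(g) for g in group_by(rows, attr).values()
               if len({r[-1] for r in g}) > 1)

def build_tree(rows, attrs):
    labels = [row[-1] for row in rows]
    if len(set(labels)) == 1:      # group is homogeneous: make a leaf
        return labels[0]
    if not attrs:                  # no attributes left: take the majority answer
        return max(set(labels), key=labels.count)
    best = min(attrs, key=lambda a: mixed_count(rows, a))
    rest = [a for a in attrs if a != best]
    return {(best, value): build_tree(group, rest)
            for value, group in group_by(rows, best).items()}
```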
Let's go back to our example Intuitively, our goal is to have as few mixed yes and no answers together in groups as possible. So in the initial case, we have a single group of 14 mixed yes and no answers (the full data table above).
What happens if we split on Temperature? Temperature: hot → 4 mixed, mild → 6 mixed, cool → 4 mixed. Overall entropy = 4 + 6 + 4 = 14
What's the entropy if you split on Outlook? Group exercise A. 0 B. 5 C. 10 D. 14 E. None of the above (see the data table above)
What's the entropy if you split on Outlook? Group exercise results Outlook: sunny → 5 mixed, overcast → 0 mixed, rainy → 5 mixed. Overall entropy = 5 + 0 + 5 = 10
What if you split on Windy? Windy: false → 8 mixed, true → 6 mixed. Overall entropy = 8 + 6 = 14
What if you split on Humidity? Humidity: high → 7 mixed, normal → 7 mixed. Overall entropy = 7 + 7 = 14
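These numbers are small enough to check by hand, but here's a sketch that reproduces them (the yes/no labels are reconstructed from the slides' own counts):

```python
DATA = [  # (outlook, temperature, humidity, windy, play?)
    ("sunny", "hot", "high", False, "no"),
    ("sunny", "hot", "high", True, "no"),
    ("overcast", "hot", "high", False, "yes"),
    ("rain", "mild", "high", False, "yes"),
    ("rain", "cool", "normal", False, "yes"),
    ("rain", "cool", "normal", True, "no"),
    ("overcast", "cool", "normal", True, "yes"),
    ("sunny", "mild", "high", False, "no"),
    ("sunny", "cool", "normal", False, "yes"),
    ("rain", "mild", "normal", False, "yes"),
    ("sunny", "mild", "normal", True, "yes"),
    ("overcast", "mild", "high", True, "yes"),
    ("overcast", "hot", "normal", False, "yes"),
    ("rain", "mild", "high", True, "no"),
]

def simple_entropy(rows, i):
    """Sum of group sizes over groups that still mix yes and no answers."""
    groups = {}
    for row in rows:
        groups.setdefault(row[i], []).append(row[-1])
    return sum(len(labels) for labels in groups.values()
               if len(set(labels)) > 1)

for name, i in [("Outlook", 0), ("Temperature", 1),
                ("Humidity", 2), ("Windy", 3)]:
    print(name, simple_entropy(DATA, i))
# Outlook 10, Temperature 14, Humidity 14, Windy 14
```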
The best option to split on is Outlook: it does the best job of reducing entropy.
This example suggests why a more complex entropy definition might be better: splitting on Windy (false: 8 mixed, true: 6 mixed) and splitting on Humidity (high: 7 mixed, normal: 7 mixed) both give entropy 14, yet Humidity is the better split, because its normal group is nearly pure while Windy's true group is an even mix.
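A sketch of the standard comparison (Shannon entropy, weighted by group size; this goes beyond what the slides compute):

```python
from math import log2

def shannon(yes, no):
    """Entropy of a group with the given yes/no counts (0 for a pure group)."""
    total = yes + no
    return -sum((k / total) * log2(k / total) for k in (yes, no) if k)

def split_entropy(groups):
    """Weighted average entropy over the branches of a split."""
    n = sum(yes + no for yes, no in groups)
    return sum((yes + no) / n * shannon(yes, no) for yes, no in groups)

# Counts from the data table above:
print(split_entropy([(6, 2), (3, 3)]))  # Windy (false, true):     ~0.89
print(split_entropy([(3, 4), (6, 1)]))  # Humidity (high, normal): ~0.79
# Humidity scores lower (better): its normal group is nearly pure.
```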
Great! Now we do the same thing again Here's what we have so far: Outlook (branches: sunny, overcast, rainy). For each branch, we have to decide which attribute to split on next: Temperature, Windy, or Humidity.
Great! Now we do the same thing again Clicker question What's the best attribute to split on for Outlook = sunny? A. Temperature (hot / mild / cool) B. Windy (false / true) C. Humidity (high / normal)
We don't need to split for Outlook = overcast: the answer was yes each time, so we're done there.
What's the best attribute to split on for Outlook = rain? Clicker question A. Temperature (mild / cool; hot is N/A, since no rain examples have it) B. Windy (false / true) C. Humidity (high / normal)
This was, of course, a simple example In this example: the algorithm found the tree with the smallest number of nodes; we were given the attributes and conditions; and a simplistic notion of entropy worked (a more sophisticated notion of entropy is typically used to determine which attribute to split on).
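For reference, the sophisticated notion usually meant here is Shannon entropy together with information gain; in standard notation (not defined on the slides):

```latex
H(S) = -\sum_i p_i \log_2 p_i,
\qquad
\mathrm{Gain}(S, A) = H(S) - \sum_{v \in \mathrm{values}(A)} \frac{|S_v|}{|S|}\, H(S_v)
```

where p_i is the fraction of examples in S with the i-th answer and S_v is the subset of S where attribute A has value v; the attribute with the highest gain is chosen.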
This was, of course, a simple example In more complex examples, like the loan application example: we may not know which conditions or attributes are best to use; the final decision may not be correct in every case (e.g., given two loan applicants with the same colour and credit rating, one may be creditworthy while the other is not); and even if the final decision is always correct, the tree may not be of minimum size.
Coding up a decision tree classifier Here's the tree we built:

Outlook?
├─ sunny → Humidity? (high → no; normal → yes)
├─ overcast → yes
└─ rainy → Windy? (false → yes; true → no)
Coding up a decision tree classifier Can you see the relationship between the hierarchical tree structure and the hierarchical nesting of if statements?
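The slide's code isn't preserved in this transcript, but a minimal Python sketch of the same idea, mirroring the sunny and overcast branches of the tree, might look like this:

```python
def play(outlook, humidity, windy):
    # Each level of the tree becomes one level of if-statement nesting
    if outlook == "sunny":
        if humidity == "high":
            return "no"
        else:  # humidity == "normal"
            return "yes"
    elif outlook == "overcast":
        return "yes"  # overcast games are always played
    # the rainy branch is the exercise on the next slide
```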
Coding up a decision tree classifier Can you extend the code to handle the rainy case?
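One possible extension (a sketch consistent with the tree above):

```python
def play(outlook, humidity, windy):
    if outlook == "sunny":
        return "no" if humidity == "high" else "yes"
    elif outlook == "overcast":
        return "yes"
    else:  # rainy: split on Windy
        return "no" if windy else "yes"

print(play("rainy", "normal", False))  # yes
print(play("sunny", "high", False))    # no
```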
Learning goals [CT Building Block] Students will be able to build a simple decision tree [CT Building Block] Students will be able to describe what considerations are important in building a decision tree