Data Mining. CS57300 Purdue University. Bruno Ribeiro. February 8, 2018


1 Data Mining CS57300 Purdue University Bruno Ribeiro February 8, 2018

2 Decision trees

3 Why Trees?
- Interpretable/intuitive; popular in medical applications because they mimic the way a doctor thinks
- Model discrete outcomes nicely
- Can be very powerful; can be as complex as you need them
- C4.5 and CART: judging from top-10 entries on Kaggle, decision trees are very effective and popular

4 Sure, But Why Trees?
- Easy-to-understand knowledge representation
- Can handle mixed variables
- Recursive, divide-and-conquer learning method
- Efficient inference
- Example: deciding whether to play outside (tree figure omitted)

5 Divide-and-conquer Classification
Consider input tuples (x_i, y_i) for the i-th observation. A tree of axis-aligned threshold tests (e.g., x_1 > θ_1, x_2 ≤ θ_2, x_2 > θ_3, x_1 ≤ θ_4) recursively partitions the feature space into rectangular regions A, B, C, D, E (figure omitted).

6 Tree learning
Finding the best tree is intractable: one would have to consider all 2^m combinations, where m is the number of features. Instead, the tree is usually grown greedily, splitting on one attribute at a time; to determine which attribute to split on, look at node impurity.
Top-down recursive divide-and-conquer algorithm:
- Start with all examples at the root
- Select the best attribute/feature
- Recurse and repeat
Other issues:
- How to construct features
- When to stop growing
- Pruning irrelevant parts of the tree
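A minimal sketch of this greedy procedure in Python (illustrative only, not CART or C4.5 themselves; it assumes binary attributes, and score_split is a placeholder for any of the impurity-based scores introduced below):

```python
# Illustrative sketch: greedy top-down tree growth on binary attributes.
# `score_split(X, y, f)` stands in for information gain, Gini gain, chi-square, etc.
from collections import Counter

def majority_label(y):
    return Counter(y).most_common(1)[0][0]

def grow_tree(X, y, features, score_split, depth=0, max_depth=3):
    # Stop when the node is pure, no features remain, or the depth limit is reached
    if len(set(y)) == 1 or not features or depth == max_depth:
        return {"leaf": majority_label(y)}
    # Greedy step: pick the single best attribute for this node
    best = max(features, key=lambda f: score_split(X, y, f))
    left = [i for i, x in enumerate(X) if x[best] == 0]
    right = [i for i, x in enumerate(X) if x[best] == 1]
    if not left or not right:                 # degenerate split -> make a leaf
        return {"leaf": majority_label(y)}
    rest = [f for f in features if f != best]
    grow = lambda idx: grow_tree([X[i] for i in idx], [y[i] for i in idx],
                                 rest, score_split, depth + 1, max_depth)
    return {"feature": best, "left": grow(left), "right": grow(right)}
```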

7 Worked example (table figure omitted): a small set of instances with class label Fraud and attributes Age, Degree, StartYr, and Series7. Score each attribute split for these instances (Age, Degree, StartYr, Series7) and choose the split on Series7; then, within each resulting branch, score the remaining attributes (Age, Degree, StartYr) and choose the split on Age > 28.

8 Overview (with two features X_1, X_2 and a 1D target Y)
FIGURE 9.2. Partitions and CART. The top right panel shows a partition of a two-dimensional feature space by recursive binary splitting (split points t_1, ..., t_4 defining regions R_1, ..., R_5), as used in CART, applied to some fake data. The top left panel shows a general partition that cannot be obtained from recursive binary splitting. The bottom left panel shows the tree corresponding to the partition in the top right panel, and a perspective plot of the prediction surface appears in the bottom right panel.

9 Tree models
Most well-known systems:
- CART: Breiman, Friedman, Olshen and Stone
- ID3, C4.5: Quinlan
How do they differ?
- Split scoring function
- Stopping criterion
- Pruning mechanism
- Predictions in leaf nodes

10 Scoring functions: Local split value

11 Choosing an attribute/feature
Idea: a good feature splits the examples into subsets that distinguish among the class labels as much as possible, ideally into pure sets of "all positive" or "all negative". (Restaurant example figure omitted, comparing a split on Patrons? {None, Some, Full} with a split on Type? {French, Italian, Thai, Burger}.)
Bias-variance tradeoff: choosing the most discriminating attribute first may not give the best tree (bias), but it can make the tree small (low variance).

12 Association between attribute and class label
Data: 14 examples with attribute Income ∈ {High, Med, Low} and class label ∈ {buy, no buy} (raw listing omitted).
Contingency table (attribute value × class label value):

Income | Buy | No buy
High   |  2  |  2
Med    |  4  |  2
Low    |  3  |  1

13 Mathematically Defining a Good Split
We start with information theory: how uncertain will the answer be if we split the tree this way?
Say we need to decide between k options. The uncertainty in the answer Y ∈ {1, ..., k} when the probabilities are (p_1, ..., p_k) can be quantified via entropy:
H(p_1, ..., p_k) = -\sum_i p_i \log_2 p_i
Convenient notation: B(p) = H(p, 1 - p), the number of bits necessary to encode a Boolean variable with success probability p.
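A small helper (a direct transcription of the formula, assuming nothing beyond the standard library) for entropy and the Boolean shorthand B(p):

```python
# Entropy in bits; 0 log 0 is treated as 0.
import math

def entropy(probs):
    """H(p_1, ..., p_k) = -sum_i p_i log2 p_i."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def B(p):
    """Boolean shorthand: B(p) = H(p, 1 - p)."""
    return entropy([p, 1 - p])

print(B(0.5))             # 1.0 bit for a fair coin
print(round(B(9/14), 3))  # 0.94, the value reused in the information-gain example below
```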

14 Amount of Information in the Tree
Suppose we have p positive and n negative examples at the root: B(p/(p + n)) bits are needed to classify a new example.
Information is always conserved: if the encoding of the information in the leaves is lossless, then the tree has a lossless encoding. The entropy of the leaves (in bits) plus the information carried in the tree structure (in bits) equals the total information in the data.
Let branch i of a split have p_i positive and n_i negative examples: B(p_i/(p_i + n_i)) bits are needed to classify a new example in that branch. The expected number of bits per example over all branches is
\sum_i \frac{p_i + n_i}{p + n} B\left(\frac{p_i}{p_i + n_i}\right)
Choose the next attribute to split so as to minimize this remaining information needed, which maximizes the information in the tree (as information is conserved).

15 <latexit sha1_base64="xbivkjhhtbzalqgmioraqk8ih6y=">aaahunic3vvnb9naen22tcmhqathlluqsag1krpxkvspiesllkuifcm2qvv6kqyy9lq747bpyn+jxwmhdvanohjhhvuhadreqikvld/pzppmvb3thqkubj3v68li0o3lldrqzfqttdt37q5v3ptgvky5dlmssn8mmqepeuiiqakfuw0sdiuchqm3hf/wglqrknmp4xscma0s0recotmdre/7mcmhz9k+zqmpciqwcknxcnqautwnjgsdmgusrbfdrk8edimja3emctwutsyp1re8ljdzdb60k7bfqnvwtlh8w48uz2jiketmtk/tprhyplfwcxndzwykji/yahrrsuhnwmiwgt2ddj3thvnhtk+0exkke+t5kmwxmem4djffj+airzbe5utl2h8rwjgkgulcy0t9tfjutjcqrkidrzl2ghetxlmud5lmhj3qm1mkf6ns0ut1mqwxg4fwkg4szwkhoeu2keznlamauosa6be1q5acacxcckfopf8jacomrwmnkor28tp86m2g0v3kbsvwwlucsyr6bp23ftc1yaghubw+czwnamqz1p0isc+4iybdr9uobl3rafcibuc6itq76efqjaftdra9r3rnhza936btgo4drxypumfzgkyzfshlbigz2c4+jyio+fxot8+02e6k6ljn4jvezbgdqzxk6uhp/yvn+hmuvv3dnkurc0ldet2fe9jtfox0/fnctl1xknpnz/+ctpdnqdp2r0/p/6tpoumtz/qgfu508fukmqhsxufahofsxaknrfyzo1hs2on1noe/savdlygs2lxqyag4gjx0s5slddde++klma+6ndbllvfuydbes+q+wcupyepsjg3ynoyrfxjauostt+qz+ua+r3xz+vlbqc2voyslfec+mvm1tv8xnmos</latexit> <latexit sha1_base64="59igpv32zwfgr8k1ohldlaycw/g=">aaahmhic3vxnbhmxehabhhl+wjhycamcamqjtcsvukvkxjc4fefopeyq8nonirxvemv726bwvgfpa0d4kz4qv668alpzvwiatjeqwfkub2fm88x8tjxhkowxnne8conylcxa1avr9es3bt66vbxy54nrmebq5uoqvrsya1ik0lxcsthnnba4llatjl4v/p190eao5l0dpxdebjcivudmomlv+yefmzvktlp3ofuthfphhaf2cbqskztqifmw7y2ves1vsug8afdgjvrre29l8acfkz7fuauxzjhe20tt4ji2gkvi635migv8xabqi/zfahiwgwnc4asnndbqh9g+0vhllj1yt5ici40zxyfgfh2y077cejavl9n+88cjjm0sjlxm1m8ktyowateim+zwjhewrgwws/mqacytyjitpdjbkivnxp/jelmrakxiwhgwcjczbmff6kgyfvckudm9dmbiujctwbio6kz6lwbdxprooags3e3+vp94q6hertaogqou4pgl0spnv+njnqhdpcid8w1wnlojjqduf0l6bo8ynr1wo3d1rqnbiroadhnpdtczq5eetnlz9x7sjq3a9h6dtgk6cf4ij+jvqkgmzl6qcjougawxnxmuvhwunj4zbby7quxuixgnurmoikisnb897b/ktd+dsqolmhn1dc6ok6/nc0liwceo458scuo6v9elev7ntd3rlqk2f/2w/k+anpdu+uwpyogvg69s0mwqxtykb8iopyifna7yz55eswohf9pqf5ow4mtaeychfyozfq+wkn5pw1ace+3tq2eeddutfy3v7eo1rafvvfgi98h90irt8oxskddkm3qjjx/jj/kffk19rh3xvtw+l6gxfiroxtkzaj9+as+jwuw=</latexit> <latexit sha1_base64="59igpv32zwfgr8k1ohldlaycw/g=">aaahmhic3vxnbhmxehabhhl+wjhycamcamqjtcsvukvkxjc4fefopeyq8nonirxvemv726bwvgfpa0d4kz4qv668alpzvwiatjeqwfkub2fm88x8tjxhkowxnne8conylcxa1avr9es3bt66vbxy54nrmebq5uoqvrsya1ik0lxcsthnnba4llatjl4v/p190eao5l0dpxdebjcivudmomlv+yefmzvktlp3ofuthfphhaf2cbqskztqifmw7y2ves1vsug8afdgjvrre29l8acfkz7fuauxzjhe20tt4ji2gkvi635migv8xabqi/zfahiwgwnc4asnndbqh9g+0vhllj1yt5ici40zxyfgfh2y077cejavl9n+88cjjm0sjlxm1m8ktyowateim+zwjhewrgwws/mqacytyjitpdjbkivnxp/jelmrakxiwhgwcjczbmff6kgyfvckudm9dmbiujctwbio6kz6lwbdxprooags3e3+vp94q6hertaogqou4pgl0spnv+njnqhdpcid8w1wnlojjqduf0l6bo8ynr1wo3d1rqnbiroadhnpdtczq5eetnlz9x7sjq3a9h6dtgk6cf4ij+jvqkgmzl6qcjougawxnxmuvhwunj4zbby7quxuixgnurmoikisnb897b/ktd+dsqolmhn1dc6ok6/nc0liwceo458scuo6v9elev7ntd3rlqk2f/2w/k+anpdu+uwpyogvg69s0mwqxtykb8iopyifna7yz55eswohf9pqf5ow4mtaeychfyozfq+wkn5pw1ace+3tq2eeddutfy3v7eo1rafvvfgi98h90irt8oxskddkm3qjjx/jj/kffk19rh3xvtw+l6gxfiroxtkzaj9+as+jwuw=</latexit> <latexit 
sha1_base64="59igpv32zwfgr8k1ohldlaycw/g=">aaahmhic3vxnbhmxehabhhl+wjhycamcamqjtcsvukvkxjc4fefopeyq8nonirxvemv726bwvgfpa0d4kz4qv668alpzvwiatjeqwfkub2fm88x8tjxhkowxnne8conylcxa1avr9es3bt66vbxy54nrmebq5uoqvrsya1ik0lxcsthnnba4llatjl4v/p190eao5l0dpxdebjcivudmomlv+yefmzvktlp3ofuthfphhaf2cbqskztqifmw7y2ves1vsug8afdgjvrre29l8acfkz7fuauxzjhe20tt4ji2gkvi635migv8xabqi/zfahiwgwnc4asnndbqh9g+0vhllj1yt5ici40zxyfgfh2y077cejavl9n+88cjjm0sjlxm1m8ktyowateim+zwjhewrgwws/mqacytyjitpdjbkivnxp/jelmrakxiwhgwcjczbmff6kgyfvckudm9dmbiujctwbio6kz6lwbdxprooags3e3+vp94q6hertaogqou4pgl0spnv+njnqhdpcid8w1wnlojjqduf0l6bo8ynr1wo3d1rqnbiroadhnpdtczq5eetnlz9x7sjq3a9h6dtgk6cf4ij+jvqkgmzl6qcjougawxnxmuvhwunj4zbby7quxuixgnurmoikisnb897b/ktd+dsqolmhn1dc6ok6/nc0liwceo458scuo6v9elev7ntd3rlqk2f/2w/k+anpdu+uwpyogvg69s0mwqxtykb8iopyifna7yz55eswohf9pqf5ow4mtaeychfyozfq+wkn5pw1ace+3tq2eeddutfy3v7eo1rafvvfgi98h90irt8oxskddkm3qjjx/jj/kffk19rh3xvtw+l6gxfiroxtkzaj9+as+jwuw=</latexit> Information gain Information Gain (Gain) is the amount of information that the tree structure encodes H[X] is the entropy: expected number of bits to encode a randomly selected subset X A is the set of subsets of the data with a given split S is the entire data X Gain(S, A) =H[S] A A A S H[A] H[buys_computer] = -9/14 log 9/14-5/14 log 5/14 =

16 Gain(S, A) = H[S] - \sum_{A' \in A} \frac{|A'|}{|S|} H[A']
Example: splitting on Income (values High, Med, Low; see the contingency table above):
Entropy(Income=high) = -2/4 \log_2(2/4) - 2/4 \log_2(2/4) = 1
Entropy(Income=med) = -4/6 \log_2(4/6) - 2/6 \log_2(2/6) = 0.9183
Entropy(Income=low) = -3/4 \log_2(3/4) - 1/4 \log_2(1/4) = 0.8113
Gain(D, Income) = H[D] - (4/14 × 1 + 6/14 × 0.9183 + 4/14 × 0.8113) = 0.940 - 0.911 = 0.029
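The same numbers can be re-derived in a few lines from the contingency-table counts (a quick sanity check of the arithmetic above):

```python
# Re-deriving the Income example from the counts: High 2/2, Med 4/2, Low 3/1.
import math

def H2(pos, neg):
    n = pos + neg
    return -sum(c / n * math.log2(c / n) for c in (pos, neg) if c > 0)

root = H2(9, 5)                                              # ~0.940
remainder = 4/14 * H2(2, 2) + 6/14 * H2(4, 2) + 4/14 * H2(3, 1)
print(round(root - remainder, 3))                            # 0.029
```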

17 Gini gain
Similar to information gain, but uses the Gini index instead of entropy. Measures the decrease in the Gini index after the split:
Gain(S, A) = Gini(S) - \sum_{A' \in A} \frac{|A'|}{|S|} Gini(A')
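A matching Gini-gain scorer (a sketch; it assumes the usual definition Gini(S) = 1 - \sum_c p_c^2, which the slide does not spell out):

```python
# Gini gain: the Gini index replaces entropy in the gain formula.
from collections import Counter, defaultdict

def gini(labels):
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def gini_gain(rows, attr_index, label_index=-1):
    labels = [r[label_index] for r in rows]
    branches = defaultdict(list)
    for r in rows:
        branches[r[attr_index]].append(r[label_index])
    weighted = sum(len(b) / len(rows) * gini(b) for b in branches.values())
    return gini(labels) - weighted
```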

18 Comparing information gain to Gini gain (figure omitted: entropy, Gini index, and misclassification error plotted as functions of p, the fraction of target A sent into the branch that outputs B, together with the resulting information gain and Gini gain).

19 Comparing information gain to Gini gain (figure omitted: entropy, Gini index, and misclassification error plotted against p, the fraction of target A sent into the branch that outputs B).

20 How does the score function affect feature selection? (Figure omitted: entropy and Gini curves for two candidate splits.) The Gini score can produce a larger gain.

21 Chi-square score
Widely used to test independence between two categorical attributes (e.g., feature and class label).
Hypothesis H_0: the attributes are independent.
Consider a contingency table with k entries (k = rows × columns). The score takes the counts in the contingency table and calculates the normalized squared deviation of the observed values from the values expected under H_0:
X^2 = \sum_{i=1}^{k} \frac{(o_i - e_i)^2}{e_i}
If the counts are large (large number of examples), the sampling distribution of X^2 can be approximated by a chi-square distribution.

22 Contingency tables

Income | Buy | No buy
High   |  2  |  2
Med    |  4  |  2
Low    |  3  |  1

23 Calculating expected values for a cell
X^2 = \sum_{i=1}^{k} \frac{(o_i - e_i)^2}{e_i}
Contingency table of attribute A (rows 0, 1) versus class C (columns +, -), with N = a + b + c + d examples:

A \ C | +  | -
  0   | a  | b
  1   | c  | d

Observed count: o_(0,+) = a.
Expected count: e_(0,+) = p(A=0, C=+) · N = p(A=0) p(C=+ | A=0) · N = p(A=0) p(C=+) · N (assuming independence) = \left(\frac{a+b}{N}\right)\left(\frac{a+c}{N}\right) N.
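The expected count for that cell, written out as a tiny helper (hypothetical function name; the 2x2 table is [[a, b], [c, d]]):

```python
# Expected count under independence for the (A=0, C=+) cell of [[a, b], [c, d]].
def expected_0_plus(a, b, c, d):
    N = a + b + c + d
    return (a + b) / N * (a + c) / N * N    # p(A=0) * p(C=+) * N
```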

24 Example calculation

Observed:                    Expected:
Income | Buy | No buy        Income | Buy  | No buy
High   |  2  |  2            High   | 2.57 | 1.43
Med    |  4  |  2            Med    | 3.86 | 2.14
Low    |  3  |  1            Low    | 2.57 | 1.43

χ^2 = \sum_{i=1}^{k} \frac{(o_i - e_i)^2}{e_i}
    = \frac{(2-2.57)^2}{2.57} + \frac{(2-1.43)^2}{1.43} + \frac{(4-3.86)^2}{3.86} + \frac{(2-2.14)^2}{2.14} + \frac{(3-2.57)^2}{2.57} + \frac{(1-1.43)^2}{1.43}
    = 0.57
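A small function that reproduces this calculation for an arbitrary contingency table (expected counts are computed from the row and column marginals under H_0):

```python
# Chi-square statistic for a contingency table (rows = attribute values, cols = class labels).
def chi_square(table):
    row_totals = [sum(r) for r in table]
    col_totals = [sum(c) for c in zip(*table)]
    N = sum(row_totals)
    x2 = 0.0
    for i, row in enumerate(table):
        for j, o in enumerate(row):
            e = row_totals[i] * col_totals[j] / N   # expected count under independence
            x2 += (o - e) ** 2 / e
    return x2

# Income vs. buy table from the slides:
print(round(chi_square([[2, 2], [4, 2], [3, 1]]), 2))   # 0.57
```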

25 Tree learning
Top-down recursive divide-and-conquer algorithm:
- Start with all examples at the root
- Select the best attribute/feature
- Partition the examples by the selected attribute
- Recurse and repeat
Other issues:
- How to construct features
- When to stop growing
- Pruning irrelevant parts of the tree

26 Controlling Variance One major problem with trees is their high variance. Often a small change in the data can result in a very different series of splits, making interpretation somewhat precarious. The major reason for this instability is the hierarchical nature of the process: the effect of an error in the top split is propagated down to all of the splits below it.

27 Overfitting
Consider a distribution D of data representing a population and a sample D_S drawn from D, which is used as training data.
Given a model space M, a score function S, and a learning algorithm that returns a model m ∈ M, the algorithm overfits the training data D_S if:
∃ m' ∈ M such that S(m, D_S) > S(m', D_S) but S(m, D) < S(m', D)
In other words, there is another model (m') that is better on the entire distribution, and if we had learned from the full distribution we would have selected it instead.

28 Example learning problem
Task: devise a rule to classify items based on the attribute X.
Knowledge representation: if-then rules. Example rule: if x > 25 then + else -.
What is the model space? All possible thresholds.
What score function? Prediction error rate.
(Figure omitted: + and - examples plotted along the X axis.)

29 Approaches to avoid overfitting
- Regularization (priors)
- Hold out an evaluation set, used to adjust the structure of the learned model (e.g., pruning in decision trees)
- Statistical tests during learning, to only include structure with significant associations (e.g., pre-pruning in decision trees)
- Penalty term in the classifier scoring function (i.e., change the score function to prefer simpler models)

30 How to avoid overfitting in decision trees
Postpruning: use a separate set of examples to evaluate the utility of pruning nodes from the tree (after the tree is fully grown).
Prepruning: apply a statistical test to decide whether to expand a node, or use an explicit measure of complexity to penalize large trees (e.g., Minimum Description Length).

31 Algorithm comparison
CART
- Evaluation criterion: Gini index
- Search algorithm: simple-to-complex, hill-climbing search
- Stopping criterion: when leaves are pure
- Pruning mechanism: cross-validation to select a Gini threshold
C4.5
- Evaluation criterion: information gain
- Search algorithm: simple-to-complex, hill-climbing search
- Stopping criterion: when leaves are pure
- Pruning mechanism: reduced error pruning

32 CART: Finding a Good Gini Threshold
Background: k-fold cross-validation.
- Randomly partition the training data into k folds
- For i = 1 to k: learn the model on D minus the i-th fold; evaluate the model on the i-th fold
- Average the results from all k trials
(Figure omitted: a dataset with columns Y, X1, X2 split into six train/test pairs Train1/Test1, ..., Train6/Test6.)
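A bare-bones k-fold index generator illustrating the partitioning step (a sketch; scikit-learn's KFold provides the same functionality):

```python
# Randomly assign each example to one of k folds; yield (train, test) index lists.
import random

def kfold_indices(n, k, seed=0):
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

for train, test in kfold_indices(n=12, k=6):
    print(len(train), len(test))     # 10 train / 2 test indices per fold
```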

33 Choosing a Gini threshold with cross-validation
For i in 1..k:
  For t in a threshold set (e.g., [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]):
    Learn a decision tree on Train_i with Gini gain threshold t (i.e., stop growing when the max Gini gain is less than t)
    Evaluate the learned tree on Test_i (e.g., with accuracy)
  Set t_max,i to be the t with the best performance on Test_i
Set t_max to the average of t_max,i over the k trials
Relearn the tree on all the data using t_max as the Gini gain threshold
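A sketch of this search using scikit-learn, where min_impurity_decrease stands in for the Gini-gain stopping threshold (an approximation, since scikit-learn thresholds a weighted impurity decrease rather than the raw gain); the dataset and threshold grid are placeholders:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import KFold
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)             # placeholder dataset
thresholds = [0.0, 0.001, 0.005, 0.01, 0.02, 0.05]      # placeholder threshold set

best_per_fold = []
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    # Accuracy on this fold's test split for each candidate threshold t
    accs = [DecisionTreeClassifier(criterion="gini", min_impurity_decrease=t, random_state=0)
            .fit(X[train], y[train]).score(X[test], y[test]) for t in thresholds]
    best_per_fold.append(thresholds[int(np.argmax(accs))])   # t_max,i for this fold

t_max = float(np.mean(best_per_fold))                        # average over the k trials
final_tree = DecisionTreeClassifier(criterion="gini",
                                    min_impurity_decrease=t_max).fit(X, y)
```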

34 C4.5: reduced error pruning
Use a pruning set to estimate accuracy of sub-trees and of individual nodes.
Let T_v be the sub-tree rooted at node v, and define the gain of pruning at v as the reduction in pruning-set error obtained by replacing T_v with a leaf.
Repeat: prune at the node with the largest gain, until only negative-gain nodes remain.
Bottom-up restriction: T_v can only be pruned if it does not contain a sub-tree with lower error than T_v itself.

35 Pre-pruning methods
Stop growing the tree at some point during top-down construction, when there is no longer sufficient data to make reliable decisions.
Approach: choose a threshold on the feature score and stop splitting if the best feature score is below the threshold.

36 Determine the chi-square threshold analytically
Stop growing when the chi-square feature score is not statistically significant. The chi-square statistic has a known sampling distribution, so the significance threshold can be looked up.
Degrees of freedom = (#rows - 1)(#cols - 1); for a 2×2 table, 3.84 is the 95% critical value.
X^2 = \sum_{i=1}^{k} \frac{(o_i - e_i)^2}{e_i}
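The critical value can be looked up directly (assuming SciPy is available):

```python
# Critical value of the chi-square distribution at the 95% level.
from scipy.stats import chi2

df = (2 - 1) * (2 - 1)                  # 2x2 table -> 1 degree of freedom
print(round(chi2.ppf(0.95, df), 2))     # 3.84; keep the split only if X^2 exceeds this
```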
