Data Mining — Classification: Alternative Techniques
Lecture Notes for Chapter 5, Introduction to Data Mining, by Tan, Steinbach, Kumar
© Tan, Steinbach, Kumar — Introduction to Data Mining, 4/18/2004

Rule-Based Classifier

Classify records by using a collection of "if…then…" rules.

Rule: (Condition) → y
– where Condition is a conjunction of attribute tests and y is the class label
– LHS: rule antecedent or condition
– RHS: rule consequent
– Examples of classification rules:
  (Blood Type = Warm) ∧ (Lay Eggs = Yes) → Birds
  (Taxable Income < 50K) ∧ (Refund = Yes) → Evade = No

Rule-Based Classifier (Example)

| Name          | Blood Type | Give Birth | Can Fly | Live in Water | Class      |
|---------------|------------|------------|---------|---------------|------------|
| human         | warm       | yes        | no      | no            | mammals    |
| python        | cold       | no         | no      | no            | reptiles   |
| salmon        | cold       | no         | no      | yes           | fishes     |
| whale         | warm       | yes        | no      | yes           | mammals    |
| frog          | cold       | no         | no      | sometimes     | amphibians |
| komodo        | cold       | no         | no      | no            | reptiles   |
| bat           | warm       | yes        | yes     | no            | mammals    |
| pigeon        | warm       | no         | yes     | no            | birds      |
| cat           | warm       | yes        | no      | no            | mammals    |
| leopard shark | cold       | yes        | no      | yes           | fishes     |
| turtle        | cold       | no         | no      | sometimes     | reptiles   |
| penguin       | warm       | no         | no      | sometimes     | birds      |
| porcupine     | warm       | yes        | no      | no            | mammals    |
| eel           | cold       | no         | no      | yes           | fishes     |
| salamander    | cold       | no         | no      | sometimes     | amphibians |
| gila monster  | cold       | no         | no      | no            | reptiles   |
| platypus      | warm       | no         | no      | no            | mammals    |
| owl           | warm       | no         | yes     | no            | birds      |
| dolphin       | warm       | yes        | no      | yes           | mammals    |
| eagle         | warm       | no         | yes     | no            | birds      |

R1: (Give Birth = no) ∧ (Can Fly = yes) → Birds
R2: (Give Birth = no) ∧ (Live in Water = yes) → Fishes
R3: (Give Birth = yes) ∧ (Blood Type = warm) → Mammals
R4: (Give Birth = no) ∧ (Can Fly = no) → Reptiles
R5: (Live in Water = sometimes) → Amphibians

Application of a Rule-Based Classifier

A rule r covers an instance x if the attributes of the instance satisfy the condition of the rule. Using rules R1–R5 above:

| Name         | Blood Type | Give Birth | Can Fly | Live in Water | Class |
|--------------|------------|------------|---------|---------------|-------|
| hawk         | warm       | no         | yes     | no            | ?     |
| grizzly bear | warm       | yes        | no      | no            | ?     |

Rule R1 covers the hawk ⇒ Bird
Rule R3 covers the grizzly bear ⇒ Mammal

Rule Coverage and Accuracy

Coverage of a rule:
– Fraction of records that satisfy the antecedent of the rule
Accuracy of a rule:
– Fraction of the records that satisfy the antecedent that also satisfy the consequent

| Tid | Refund | Marital Status | Taxable Income | Class |
|-----|--------|----------------|----------------|-------|
| 1   | Yes    | Single         | 125K           | No    |
| 2   | No     | Married        | 100K           | No    |
| 3   | No     | Single         | 70K            | No    |
| 4   | Yes    | Married        | 120K           | No    |
| 5   | No     | Divorced       | 95K            | Yes   |
| 6   | No     | Married        | 60K            | No    |
| 7   | Yes    | Divorced       | 220K           | No    |
| 8   | No     | Single         | 85K            | Yes   |
| 9   | No     | Married        | 75K            | No    |
| 10  | No     | Single         | 90K            | Yes   |

Rule: (Status = Single) → No
Coverage = 40% (4 of the 10 records are Single); Accuracy = 50% (2 of those 4 have class No)

How Does a Rule-Based Classifier Work?

Using rules R1–R5 above:

| Name          | Blood Type | Give Birth | Can Fly | Live in Water | Class |
|---------------|------------|------------|---------|---------------|-------|
| lemur         | warm       | yes        | no      | no            | ?     |
| turtle        | cold       | no         | no      | sometimes     | ?     |
| dogfish shark | cold       | yes        | no      | yes           | ?     |

A lemur triggers rule R3, so it is classified as a mammal.
A turtle triggers both R4 and R5.
A dogfish shark triggers none of the rules.
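To make rule coverage concrete, here is a minimal Python sketch (not part of the original slides; the rule encoding and the name matching_rules are ours) that represents R1–R5 as condition/label pairs and lists, for each test record above, the rules that cover it.

```python
# Minimal sketch (illustrative, not from the slides): rules R1-R5 encoded as
# (name, condition, label) triples. A rule "covers" a record when the record's
# attribute values satisfy the rule's condition.

RULES = [
    ("R1", lambda r: r["give_birth"] == "no" and r["can_fly"] == "yes", "Birds"),
    ("R2", lambda r: r["give_birth"] == "no" and r["live_in_water"] == "yes", "Fishes"),
    ("R3", lambda r: r["give_birth"] == "yes" and r["blood_type"] == "warm", "Mammals"),
    ("R4", lambda r: r["give_birth"] == "no" and r["can_fly"] == "no", "Reptiles"),
    ("R5", lambda r: r["live_in_water"] == "sometimes", "Amphibians"),
]

def matching_rules(record):
    """Return the (rule name, class label) pairs of all rules covering the record."""
    return [(name, label) for name, cond, label in RULES if cond(record)]

test_records = {
    "hawk":          {"blood_type": "warm", "give_birth": "no",  "can_fly": "yes", "live_in_water": "no"},
    "grizzly bear":  {"blood_type": "warm", "give_birth": "yes", "can_fly": "no",  "live_in_water": "no"},
    "lemur":         {"blood_type": "warm", "give_birth": "yes", "can_fly": "no",  "live_in_water": "no"},
    "turtle":        {"blood_type": "cold", "give_birth": "no",  "can_fly": "no",  "live_in_water": "sometimes"},
    "dogfish shark": {"blood_type": "cold", "give_birth": "yes", "can_fly": "no",  "live_in_water": "yes"},
}

for name, rec in test_records.items():
    # hawk -> [R1]; grizzly bear and lemur -> [R3]; turtle -> [R4, R5];
    # dogfish shark -> [] (no rule fires, so this rule set is not exhaustive).
    print(name, matching_rules(rec))
```

The turtle and dogfish shark outputs show exactly the two problems discussed on the next slide: a record may be covered by more than one rule, or by none at all.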
Characteristics of a Rule-Based Classifier

Mutually exclusive rules
– The classifier contains mutually exclusive rules if the rules are independent of each other
– Every record is covered by at most one rule
Exhaustive rules
– The classifier has exhaustive coverage if it accounts for every possible combination of attribute values
– Every record is covered by at least one rule

From Decision Trees to Rules

[Figure in the original slide: a decision tree that splits on Refund (Yes/No), then Marital Status ({Single, Divorced} vs. {Married}), then Taxable Income (< 80K vs. > 80K).]

Classification rules extracted from the tree:
(Refund = Yes) ⇒ No
(Refund = No, Marital Status = {Single, Divorced}, Taxable Income < 80K) ⇒ No
(Refund = No, Marital Status = {Single, Divorced}, Taxable Income > 80K) ⇒ Yes
(Refund = No, Marital Status = {Married}) ⇒ No

The rules are mutually exclusive and exhaustive, and the rule set contains as much information as the tree.

Rules Can Be Simplified

Using the same tree and the ten-record tax data shown earlier (class label "Cheat"):

Initial rule: (Refund = No) ∧ (Status = Married) → No
Simplified rule: (Status = Married) → No

Effect of Rule Simplification

Rules are no longer mutually exclusive
– A record may trigger more than one rule
– Solution: use an ordered rule set, or an unordered rule set with a voting scheme
Rules are no longer exhaustive
– A record may not trigger any rule
– Solution: use a default class

Support Vector Machines

What if the problem is not linearly separable?
– Introduce slack variables ξ_i
– Need to minimize:
  L(w) = ||w||² / 2 + C (Σ_{i=1}^{N} ξ_i)^k
– Subject to:
  f(x_i) = 1 if w·x_i + b ≥ 1 − ξ_i
  f(x_i) = −1 if w·x_i + b ≤ −1 + ξ_i

Nonlinear Support Vector Machines

What if the decision boundary is not linear?
– Transform the data into a higher-dimensional space

Ensemble Methods

Construct a set of classifiers from the training data.
Predict the class label of previously unseen records by aggregating the predictions made by multiple classifiers.

General Idea

Step 1: Create multiple data sets D_1, D_2, …, D_{t−1}, D_t from the original training data D.
Step 2: Build a classifier C_1, C_2, …, C_t on each data set.
Step 3: Combine the classifiers into a single classifier C*.

Why Does It Work?

Suppose there are 25 base classifiers
– Each classifier has error rate ε = 0.35
– Assume the classifiers are independent
– The ensemble (majority vote) is wrong only if 13 or more of the base classifiers are wrong, so the probability that the ensemble makes a wrong prediction is:
  Σ_{i=13}^{25} (25 choose i) ε^i (1 − ε)^{25−i} ≈ 0.06
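The 0.06 figure can be checked directly. A short Python verification (illustrative, not from the slides):

```python
from math import comb

eps, n = 0.35, 25

# Probability that a majority (13 or more) of the 25 independent base
# classifiers are wrong, i.e. that the majority vote is wrong.
ensemble_error = sum(
    comb(n, i) * eps**i * (1 - eps)**(n - i)
    for i in range(13, n + 1)
)

print(round(ensemble_error, 3))  # ~0.06, versus 0.35 for a single classifier
```

The improvement relies on the independence assumption; identical or highly correlated base classifiers would gain nothing from voting.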
Examples of Ensemble Methods

How to generate an ensemble of classifiers?
– Bagging
– Boosting

Bagging

Sampling with replacement:

[Table in the original slide: three bagging rounds, each a bootstrap sample drawn with replacement from the ten original records (ids 1–10); some records appear several times in a sample while others are left out.]

– Build a classifier on each bootstrap sample
– The probability that a given record is never picked in a bootstrap sample of size n is (1 − 1/n)^n, so each record appears in a given sample with probability 1 − (1 − 1/n)^n ≈ 0.632 for large n

Boosting

An iterative procedure that adaptively changes the distribution of the training data by focusing more on previously misclassified records
– Initially, all N records are assigned equal weights
– Unlike bagging, the weights may change at the end of each boosting round

Records that are wrongly classified have their weights increased; records that are classified correctly have their weights decreased.

[Table in the original slide: three boosting rounds of sampled record ids. Record 4 is hard to classify, so its weight is increased and it is more likely to be chosen again in subsequent rounds.]

Example: AdaBoost

Base classifiers: C_1, C_2, …, C_T

Error rate of classifier C_i:
  ε_i = (1/N) Σ_{j=1}^{N} w_j δ(C_i(x_j) ≠ y_j)

Importance of a classifier:
  α_i = (1/2) ln((1 − ε_i) / ε_i)

Weight update after round j:
  w_i^(j+1) = (w_i^(j) / Z_j) × exp(−α_j)  if C_j(x_i) = y_i
  w_i^(j+1) = (w_i^(j) / Z_j) × exp(+α_j)  if C_j(x_i) ≠ y_i
where Z_j is the normalization factor.

– If any intermediate round produces an error rate higher than 50%, the weights are reverted to 1/N and the resampling procedure is repeated
– Classification:
  C*(x) = argmax_y Σ_{j=1}^{T} α_j δ(C_j(x) = y)

(A small numerical sketch of this weight update appears at the end of these notes.)

Illustrating AdaBoost

[Figures in the original slides: ten data points start with equal weights of 0.1. After boosting round 1 (B1) the weights are 0.0094 and 0.4623 with α = 1.9459; after round 2 (B2) they are 0.0009, 0.3037, and 0.0422 with α = 2.9323; after round 3 (B3) they are 0.0276, 0.1819, and 0.0038 with α = 3.8744. The overall classifier combines the three rounds.]

Building Classification Rules

Direct method: extract rules directly from the data (e.g., RIPPER, CN2, Holte's 1R); a minimal 1R sketch follows below.
Indirect method: extract rules from other classification models, such as the decision trees shown earlier.
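To make the direct approach concrete, here is a minimal, illustrative Python sketch of Holte's 1R on the ten-record tax data from earlier (our own simplification, not the slides' algorithm): it builds one rule per value of a single attribute, predicting the majority class for that value, and keeps the attribute whose rules make the fewest training errors. The continuous Taxable Income attribute is omitted because 1R would first need to discretize it; all function and variable names are ours.

```python
from collections import Counter, defaultdict

def one_rule(records, target):
    """Minimal 1R sketch: choose the single attribute whose one-level rule set
    (one rule per attribute value, predicting that value's majority class)
    makes the fewest errors on the training records."""
    best = None
    attributes = [a for a in records[0] if a != target]
    for attr in attributes:
        by_value = defaultdict(Counter)          # class counts per attribute value
        for r in records:
            by_value[r[attr]][r[target]] += 1
        rules = {v: counts.most_common(1)[0][0] for v, counts in by_value.items()}
        errors = sum(1 for r in records if rules[r[attr]] != r[target])
        if best is None or errors < best[2]:
            best = (attr, rules, errors)
    return best

# Ten-record tax data from the coverage/accuracy slide (income omitted).
data = [
    {"Refund": "Yes", "Status": "Single",   "Class": "No"},
    {"Refund": "No",  "Status": "Married",  "Class": "No"},
    {"Refund": "No",  "Status": "Single",   "Class": "No"},
    {"Refund": "Yes", "Status": "Married",  "Class": "No"},
    {"Refund": "No",  "Status": "Divorced", "Class": "Yes"},
    {"Refund": "No",  "Status": "Married",  "Class": "No"},
    {"Refund": "Yes", "Status": "Divorced", "Class": "No"},
    {"Refund": "No",  "Status": "Single",   "Class": "Yes"},
    {"Refund": "No",  "Status": "Married",  "Class": "No"},
    {"Refund": "No",  "Status": "Single",   "Class": "Yes"},
]

# Prints the chosen attribute, its value -> class rules, and the error count.
print(one_rule(data, "Class"))
```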
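Returning to the AdaBoost update described above, the following small Python sketch computes ε, α, and the new weights for one round. It is an illustration under the usual convention that the weights are kept normalized to sum to 1, not a transcription of the slides; the name adaboost_round and the example numbers are ours.

```python
import math

def adaboost_round(weights, hits):
    """One AdaBoost bookkeeping step: 'hits' marks which records the round's
    base classifier got right. Returns (epsilon, alpha, updated weights)."""
    eps = sum(w for w, ok in zip(weights, hits) if not ok)   # weighted error rate
    alpha = 0.5 * math.log((1 - eps) / eps)                  # classifier importance
    new_w = [w * math.exp(-alpha if ok else alpha)           # down-/up-weight records
             for w, ok in zip(weights, hits)]
    z = sum(new_w)                                           # normalization factor Z
    return eps, alpha, [w / z for w in new_w]

# Ten records with equal initial weights; the base classifier misses record 4.
weights = [0.1] * 10
hits = [True] * 10
hits[3] = False

eps, alpha, weights = adaboost_round(weights, hits)
print(round(eps, 2), round(alpha, 4), [round(w, 4) for w in weights])
# eps = 0.1 and alpha ≈ 1.0986; the misclassified record's weight rises to 0.5
# while each of the other nine falls to about 0.0556.
```

This mirrors the behavior pictured in the slides' illustration: the hard-to-classify point quickly dominates the weight distribution, so later rounds concentrate on it.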