Prof. Pier Luca Lanzi
Classification Rules
Data Mining and Text Mining (UIC 583 @ Politecnico di Milano)
The Weather Dataset
Outlook Temp Humidity Windy Play
Sunny Hot High False No
Sunny Hot High True No
Overcast Hot High False Yes
Rainy Mild High False Yes
Rainy Cool Normal False Yes
Rainy Cool Normal True No
Overcast Cool Normal True Yes
Sunny Mild High False No
Sunny Cool Normal False Yes
Rainy Mild Normal False Yes
Sunny Mild Normal True Yes
Overcast Mild High True Yes
Overcast Hot Normal False Yes
Rainy Mild High True No
A Rule Set to Classify the Data
• IF (humidity = high) and (outlook = sunny)
THEN play=no (3.0/0.0)
• IF (outlook = rainy) and (windy = TRUE)
THEN play=no (2.0/0.0)
• OTHERWISE play=yes (9.0/0.0)
Let’s Check the First Rule
Outlook Temp Humidity Windy Play
Sunny Hot High False No
Sunny Hot High True No
Overcast Hot High False Yes
Rainy Mild High False Yes
Rainy Cool Normal False Yes
Rainy Cool Normal True No
Overcast Cool Normal True Yes
Sunny Mild High False No
Sunny Cool Normal False Yes
Rainy Mild Normal False Yes
Sunny Mild Normal True Yes
Overcast Mild High True Yes
Overcast Hot Normal False Yes
Rainy Mild High True No
IF (humidity = high) and (outlook = sunny) THEN play=no (3.0/0.0)
Then, The Second Rule
Outlook Temp Humidity Windy Play
Sunny Hot High False No
Sunny Hot High True No
Overcast Hot High False Yes
Rainy Mild High False Yes
Rainy Cool Normal False Yes
Rainy Cool Normal True No
Overcast Cool Normal True Yes
Sunny Mild High False No
Sunny Cool Normal False Yes
Rainy Mild Normal False Yes
Sunny Mild Normal True Yes
Overcast Mild High True Yes
Overcast Hot Normal False Yes
Rainy Mild High True No
IF (outlook = rainy) and (windy = TRUE) THEN play=no (2.0/0.0)
Finally, the Third Rule
Outlook Temp Humidity Windy Play
Sunny Hot High False No
Sunny Hot High True No
Overcast Hot High False Yes
Rainy Mild High False Yes
Rainy Cool Normal False Yes
Rainy Cool Normal True No
Overcast Cool Normal True Yes
Sunny Mild High False No
Sunny Cool Normal False Yes
Rainy Mild Normal False Yes
Sunny Mild Normal True Yes
Overcast Mild High True Yes
Overcast Hot Normal False Yes
Rainy Mild High True No
OTHERWISE play=yes (9.0/0.0)
A Simpler Solution
• IF (outlook = sunny) THEN play IS no
ELSE IF (outlook = overcast) THEN play IS yes
ELSE IF (outlook = rainy) THEN play IS yes
(10/14 instances correct)
What is a Classification Rule?
What Is a Classification Rule? Why Rules?
Why Rules?
• They are IF-THEN rules
§The IF part states a condition over the data
§The THEN part includes a class label
• What types of conditions?
§Propositional, with attribute-value comparisons
§First order Horn clauses, with variables
• Why rules?
§Because they are one of the most expressive and most human-readable
representations for hypotheses
Coverage and Accuracy
• IF (humidity = high) and (outlook = sunny)
THEN play=no (3.0/0.0)
• ncovers = number of examples covered by the rule
• ncorrect = number of examples correctly classified by the rule
coverage(R) = ncovers / |training data set|
• accuracy(R) = ncorrect/ncovers
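
As a quick check of these definitions, here is a minimal Python sketch (an illustration, not part of the original slides; the tuple encoding of the weather dataset is an assumption) that computes coverage and accuracy for this rule:

    # hypothetical encoding of the weather dataset as (outlook, temp, humidity, windy, play)
    weather = [
        ("sunny", "hot", "high", False, "no"),
        ("sunny", "hot", "high", True, "no"),
        ("overcast", "hot", "high", False, "yes"),
        ("rainy", "mild", "high", False, "yes"),
        ("rainy", "cool", "normal", False, "yes"),
        ("rainy", "cool", "normal", True, "no"),
        ("overcast", "cool", "normal", True, "yes"),
        ("sunny", "mild", "high", False, "no"),
        ("sunny", "cool", "normal", False, "yes"),
        ("rainy", "mild", "normal", False, "yes"),
        ("sunny", "mild", "normal", True, "yes"),
        ("overcast", "mild", "high", True, "yes"),
        ("overcast", "hot", "normal", False, "yes"),
        ("rainy", "mild", "high", True, "no"),
    ]

    # rule: IF humidity = high AND outlook = sunny THEN play = no
    covered = [r for r in weather if r[2] == "high" and r[0] == "sunny"]
    correct = [r for r in covered if r[4] == "no"]

    print("coverage =", len(covered) / len(weather))   # 3/14, about 0.214
    print("accuracy =", len(correct) / len(covered))   # 3/3 = 1.0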
Conflict Resolution
• If more than one rule is triggered, we need conflict resolution
• Size ordering
§Assign the highest priority to the triggering rule with the
“toughest” requirement (i.e., the one with the most attribute tests)
• Class-based ordering
§Decreasing order of prevalence or misclassification
cost per class
• Rule-based ordering (decision list)
§Rules are organized into one long priority list, according to
some measure of rule quality or by experts
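
A decision list is easy to sketch in code. The snippet below is a hedged illustration (the dictionary encoding of rules is an assumption): it applies the rule set shown earlier in rule-based order, letting the first triggering rule win:

    rules = [
        ({"humidity": "high", "outlook": "sunny"}, "no"),
        ({"outlook": "rainy", "windy": True}, "no"),
        ({}, "yes"),   # OTHERWISE play = yes (empty condition always triggers)
    ]

    def classify(instance, rules):
        # the first triggering rule in the priority list wins
        for conditions, label in rules:
            if all(instance.get(a) == v for a, v in conditions.items()):
                return label

    print(classify({"outlook": "sunny", "temp": "hot",
                    "humidity": "high", "windy": False}, rules))   # -> no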
Two Approaches for Rule Learning
• Direct Methods
§Directly learn the rules from the training data
• Indirect Methods
§Learn decision tree, then convert to rules
§Learn neural networks, then extract rules
1R Classifier
Inferring Rudimentary Rules
• 1R Classifier learns a simple rule involving one attribute
§Assumes nominal attributes
§The rule tests all the values of one particular attribute
• Basic version
§One branch for each value
§Each branch assigns most frequent class
§Attribute performance is measured using the error rate
computed as the proportion of instances that don’t belong to
the majority class of their corresponding branch
§Choose attribute with lowest error rate
§“missing” is treated as a separate value
Pseudo-Code for OneRule

For each attribute,
    For each value of the attribute, make a rule as follows:
        count how often each class appears
        find the most frequent class
        make the rule assign that class to this attribute-value
    Calculate the error rate of the rules
Choose the rules with the smallest error rate
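
The pseudo-code translates almost directly into Python. The following sketch (an illustration, not the Weka implementation) applies 1R to the weather dataset and reproduces the choice of outlook with 4/14 errors shown on the next slide:

    from collections import Counter, defaultdict

    # weather rows as (outlook, temp, humidity, windy, play)
    rows = [tuple(line.split()) for line in """
    sunny hot high false no
    sunny hot high true no
    overcast hot high false yes
    rainy mild high false yes
    rainy cool normal false yes
    rainy cool normal true no
    overcast cool normal true yes
    sunny mild high false no
    sunny cool normal false yes
    rainy mild normal false yes
    sunny mild normal true yes
    overcast mild high true yes
    overcast hot normal false yes
    rainy mild high true no
    """.strip().splitlines()]

    attributes = ["outlook", "temp", "humidity", "windy"]

    def one_rule(rows):
        best = None
        for i, attribute in enumerate(attributes):
            counts = defaultdict(Counter)          # value -> class counts
            for r in rows:
                counts[r[i]][r[-1]] += 1
            # errors = instances outside the majority class of their branch
            errors = sum(sum(c.values()) - c.most_common(1)[0][1]
                         for c in counts.values())
            rule = {v: c.most_common(1)[0][0] for v, c in counts.items()}
            if best is None or errors < best[0]:
                best = (errors, attribute, rule)
        return best

    errors, attribute, rule = one_rule(rows)
    print(attribute, rule, f"{errors}/{len(rows)} errors")
    # -> outlook {'sunny': 'no', 'overcast': 'yes', 'rainy': 'yes'} 4/14 errors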
Evaluating the Weather Attributes

Attribute   Rules              Errors   Total errors
Outlook     Sunny => No        2/5      4/14
            Overcast => Yes    0/4
            Rainy => Yes       2/5
Temp        Hot => No          2/4      5/14*
            Mild => Yes        2/6
            Cool => Yes        1/4
Humidity    High => No         3/7      4/14
            Normal => Yes      1/7
Windy       False => Yes       2/8      5/14*
            True => No         3/6
* indicates a tie

Outlook Temp Humidity Windy Play
Sunny Hot High False No
Sunny Hot High True No
Overcast Hot High False Yes
Rainy Mild High False Yes
Rainy Cool Normal False Yes
Rainy Cool Normal True No
Overcast Cool Normal True Yes
Sunny Mild High False No
Sunny Cool Normal False Yes
Rainy Mild Normal False Yes
Sunny Mild Normal True Yes
Overcast Mild High True Yes
Overcast Hot Normal False Yes
Rainy Mild High True No
Contact Lens Data

Age             Spectacle prescription  Astigmatism  Tear production rate  Recommended lenses
Young           Myope                   No           Reduced               None
Young           Myope                   No           Normal                Soft
Young           Myope                   Yes          Reduced               None
Young           Myope                   Yes          Normal                Hard
Young           Hypermetrope            No           Reduced               None
Young           Hypermetrope            No           Normal                Soft
Young           Hypermetrope            Yes          Reduced               None
Young           Hypermetrope            Yes          Normal                Hard
Pre-presbyopic  Myope                   No           Reduced               None
Pre-presbyopic  Myope                   No           Normal                Soft
Pre-presbyopic  Myope                   Yes          Reduced               None
Pre-presbyopic  Myope                   Yes          Normal                Hard
Pre-presbyopic  Hypermetrope            No           Reduced               None
Pre-presbyopic  Hypermetrope            No           Normal                Soft
Pre-presbyopic  Hypermetrope            Yes          Reduced               None
Pre-presbyopic  Hypermetrope            Yes          Normal                None
Presbyopic      Myope                   No           Reduced               None
Presbyopic      Myope                   No           Normal                None
Presbyopic      Myope                   Yes          Reduced               None
Presbyopic      Myope                   Yes          Normal                Hard
Presbyopic      Hypermetrope            No           Reduced               None
Presbyopic      Hypermetrope            No           Normal                Soft
Presbyopic      Hypermetrope            Yes          Reduced               None
Presbyopic      Hypermetrope            Yes          Normal                None
1R Applied to Contact Lens Data
• IF tear-prod-rate=normal THEN soft
ELSE IF tear-prod-rate=reduced THEN none
(17/24 instances correct)
How can we deal with
numerical attributes using 1R?
We can discretize them before applying 1R
We can let 1R perform the discretization directly
1R and Numerical Attributes
• Applies simple supervised discretization
• Sort instances according to attribute’s values
• Place breakpoints where class changes (majority class)
• This procedure, however, is very sensitive to noise: a single example
with an incorrect class label may produce a separate interval, which is
likely to lead to overfitting.
• For example, in the case of the temperature:
64 65 68 69 70 71 72 72 75 75 80 81 83 85
Yes | No | Yes Yes Yes | No No Yes | Yes Yes | No | Yes Yes | No
1R and Numerical Attributes
• To limit overfitting, enforce a minimum number of majority-class
instances per interval.
• For instance, in the case of the temperature, if we set the
minimum number of majority-class instances to 3, we have

64 65 68 69 70 71 72 72 75 75 80 81 83 85
Yes | No | Yes Yes Yes | No No Yes | Yes Yes | No | Yes Yes | No

join the intervals until each contains at least 3 majority-class examples:

64 65 68 69 70 71 72 72 75 75 80 81 83 85
Yes No Yes Yes Yes | No No Yes Yes Yes | No Yes Yes No

join adjacent intervals with the same majority class (the first two both
predict Yes), which leaves a single breakpoint at 77.5:

64 65 68 69 70 71 72 72 75 75 80 81 83 85
Yes No Yes Yes Yes No No Yes Yes Yes | No Yes Yes No
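
The merging procedure can be sketched as follows (an illustrative reconstruction, not Weka's code; the pair encoding is an assumption). On the temperature data it yields the single breakpoint at 77.5 used on the next slide:

    from collections import Counter

    pairs = [(64, "yes"), (65, "no"), (68, "yes"), (69, "yes"), (70, "yes"),
             (71, "no"), (72, "no"), (72, "yes"), (75, "yes"), (75, "yes"),
             (80, "no"), (81, "yes"), (83, "yes"), (85, "no")]

    def majority(interval):
        # CPython's Counter breaks ties by insertion order, which here
        # matches the slide's tie-break for the last interval
        return Counter(c for _, c in interval).most_common(1)[0][0]

    def discretize(pairs, min_size=3):
        intervals, current, i = [], [], 0
        while i < len(pairs):
            current.append(pairs[i]); i += 1
            cls, count = Counter(c for _, c in current).most_common(1)[0]
            if count >= min_size:
                # extend the interval while the class does not change
                while i < len(pairs) and pairs[i][1] == cls:
                    current.append(pairs[i]); i += 1
                intervals.append(current); current = []
        if current:
            intervals.append(current)   # leftover examples form the last interval
        # join adjacent intervals with the same majority class
        merged = [intervals[0]]
        for interval in intervals[1:]:
            if majority(interval) == majority(merged[-1]):
                merged[-1] = merged[-1] + interval
            else:
                merged.append(interval)
        return merged

    intervals = discretize(pairs)
    cuts = [(intervals[k][-1][0] + intervals[k + 1][0][0]) / 2
            for k in range(len(intervals) - 1)]
    print(cuts)   # -> [77.5]: temperature <= 77.5 => yes, > 77.5 => no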
1R Applied to the Numerical Version
of the Weather Dataset

Attribute    Rules                     Errors   Total errors
Outlook      Sunny => No               2/5      4/14
             Overcast => Yes           0/4
             Rainy => Yes              2/5
Temperature  ≤ 77.5 => Yes             3/10     5/14
             > 77.5 => No*             2/4
Humidity     ≤ 82.5 => Yes             1/7      3/14
             > 82.5 and ≤ 95.5 => No   2/6
             > 95.5 => Yes             0/1
Windy        False => Yes              2/8      5/14
             True => No*               3/6
* indicates a tie
Sequential Covering
Sequential Covering Algorithms
• Consider the set E of positive and negative examples
• Repeat
§Learn one rule with high accuracy, any coverage
§Remove positive examples covered by this rule
• Until all the positive examples are covered
Basic Sequential Covering Algorithm

procedure Covering(Examples, Classifier)
input: a set of positive and negative examples for class c
    // rule set is initially empty
    Classifier = {}
    while PositiveExamples(Examples) != {}
        // find the best rule possible
        Rule = FindBestRule(Examples)
        // check if we need more rules
        if Stop(Examples, Rule, Classifier) break
        // remove covered examples and update the model
        Examples = Examples - Cover(Rule, Examples)
        Classifier = Classifier U {Rule}
    endwhile
    // post-process the rules (sort them, simplify them, etc.)
    Classifier = PostProcessing(Classifier)
output: Classifier
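
A near-literal Python transcription of this pseudo-code might look as follows (a sketch: FindBestRule, the stopping test, and the rule's covers method are left as parameters because the slide does not fix them):

    def covering(examples, is_positive, find_best_rule, stop=lambda *a: False):
        """Sequential covering: learn rules until the positive examples
        are covered or the stopping criterion fires."""
        classifier = []
        while any(is_positive(e) for e in examples):
            rule = find_best_rule(examples)
            if stop(examples, rule, classifier):
                break
            # remove the examples covered by the rule and update the model
            examples = [e for e in examples if not rule.covers(e)]
            classifier.append(rule)
        # post-process the rules (sort them, simplify them, etc.) if needed
        return classifier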
Finding the Best Rule Possible

(Figure: the search starts from the empty rule and grows it one condition at a time.)

IF THEN Play=yes
IF Wind=No THEN Play=yes
IF Humidity=Normal THEN Play=yes
IF Humidity=High THEN Play=yes
IF … THEN …
IF Wind=yes THEN Play=yes
IF Humidity=Normal AND Wind=yes THEN Play=yes
IF Humidity=Normal AND Wind=No THEN Play=yes
IF Humidity=Normal AND Outlook=Rainy THEN Play=yes

(rule qualities annotated in the original figure: P = 5/10 = 0.5, P = 6/8 = 0.75, P = 6/7 = 0.86, P = 3/7 = 0.43)
And Another Viewpoint

(Figure: sequential covering on a two-dimensional dataset: (i) the original data; (ii) step 1; (iii) step 2, after rule R1 is found and its examples removed; (iv) step 3, with rules R1 and R2.)
Learning Just One Rule

LearnOneRule(Attributes, Examples, k)
    init BH to the most general hypothesis
    init CH to {BH}
    while CH not empty do
        generate NCH, the next more specific candidate hypotheses of CH
        // check all the NCH for a hypothesis that
        // improves the performance of BH
        update BH
        update CH with the k best NCH
    endwhile
    return a rule “IF BH THEN prediction”
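
The next slides trace this procedure on the contact lens data. A minimal greedy sketch with k = 1 (no beam), scoring candidates by accuracy with coverage as the tie-breaker as in the slides, reproduces the rule for “hard” derived on the following slides; the data encoding and names are assumptions:

    attributes = ["age", "prescription", "astigmatism", "tear rate"]

    data = [line.split() for line in """
    young myope no reduced none
    young myope no normal soft
    young myope yes reduced none
    young myope yes normal hard
    young hypermetrope no reduced none
    young hypermetrope no normal soft
    young hypermetrope yes reduced none
    young hypermetrope yes normal hard
    pre-presbyopic myope no reduced none
    pre-presbyopic myope no normal soft
    pre-presbyopic myope yes reduced none
    pre-presbyopic myope yes normal hard
    pre-presbyopic hypermetrope no reduced none
    pre-presbyopic hypermetrope no normal soft
    pre-presbyopic hypermetrope yes reduced none
    pre-presbyopic hypermetrope yes normal none
    presbyopic myope no reduced none
    presbyopic myope no normal none
    presbyopic myope yes reduced none
    presbyopic myope yes normal hard
    presbyopic hypermetrope no reduced none
    presbyopic hypermetrope no normal soft
    presbyopic hypermetrope yes reduced none
    presbyopic hypermetrope yes normal none
    """.strip().splitlines()]

    def learn_one_rule(rows, target):
        rule, covered = {}, rows
        # specialize while the rule still covers negative examples
        while any(r[-1] != target for r in covered):
            best_key, best = None, None
            for i, attribute in enumerate(attributes):
                if attribute in rule:
                    continue
                for value in {r[i] for r in covered}:
                    subset = [r for r in covered if r[i] == value]
                    p = sum(r[-1] == target for r in subset)
                    key = (p / len(subset), p)   # accuracy, then coverage
                    if best_key is None or key > best_key:
                        best_key, best = key, (attribute, value, subset)
            if best is None:
                break                            # no attributes left to test
            attribute, value, covered = best
            rule[attribute] = value
        return rule

    print(learn_one_rule(data, "hard"))
    # -> {'astigmatism': 'yes', 'tear rate': 'normal', 'prescription': 'myope'}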
An Example Using Contact Lens Data

(the contact lens dataset from the earlier slide is shown again)
First Step: the Most General Rule

• Rule we seek:

If ?
then recommendation = hard

• Possible tests:

Age = Young                              2/8
Age = Pre-presbyopic                     1/8
Age = Presbyopic                         1/8
Spectacle prescription = Myope           3/12
Spectacle prescription = Hypermetrope    1/12
Astigmatism = no                         0/12
Astigmatism = yes                        4/12
Tear production rate = Reduced           0/12
Tear production rate = Normal            4/12
Adding the First Clause

• Rule with the best test added:

If astigmatism = yes
then recommendation = hard

• Instances covered by the modified rule:

Age             Spectacle prescription  Astigmatism  Tear production rate  Recommended lenses
Young           Myope                   Yes          Reduced               None
Young           Myope                   Yes          Normal                Hard
Young           Hypermetrope            Yes          Reduced               None
Young           Hypermetrope            Yes          Normal                Hard
Pre-presbyopic  Myope                   Yes          Reduced               None
Pre-presbyopic  Myope                   Yes          Normal                Hard
Pre-presbyopic  Hypermetrope            Yes          Reduced               None
Pre-presbyopic  Hypermetrope            Yes          Normal                None
Presbyopic      Myope                   Yes          Reduced               None
Presbyopic      Myope                   Yes          Normal                Hard
Presbyopic      Hypermetrope            Yes          Reduced               None
Presbyopic      Hypermetrope            Yes          Normal                None
Extending the First Rule

• Current state:

If astigmatism = yes
and ?
then recommendation = hard

• Possible tests:

Age = Young                              2/4
Age = Pre-presbyopic                     1/4
Age = Presbyopic                         1/4
Spectacle prescription = Myope           3/6
Spectacle prescription = Hypermetrope    1/6
Tear production rate = Reduced           0/6
Tear production rate = Normal            4/6
The Second Rule

• Rule with the best test added:

If astigmatism = yes
and tear production rate = normal
then recommendation = hard

• Instances covered by the modified rule:

Age             Spectacle prescription  Astigmatism  Tear production rate  Recommended lenses
Young           Myope                   Yes          Normal                Hard
Young           Hypermetrope            Yes          Normal                Hard
Pre-presbyopic  Myope                   Yes          Normal                Hard
Pre-presbyopic  Hypermetrope            Yes          Normal                None
Presbyopic      Myope                   Yes          Normal                Hard
Presbyopic      Hypermetrope            Yes          Normal                None
Adding the Third Clause

• Current state:

If astigmatism = yes
and tear production rate = normal
and ?
then recommendation = hard

• Possible tests:

Age = Young                              2/2
Age = Pre-presbyopic                     1/2
Age = Presbyopic                         1/2
Spectacle prescription = Myope           3/3
Spectacle prescription = Hypermetrope    1/3

• Tie in accuracy between the first and the fourth test;
we choose the one with greater coverage (Spectacle prescription = Myope)
The Final Result

• Final rule:

If astigmatism = yes
and tear production rate = normal
and spectacle prescription = myope
then recommendation = hard

• Second rule for recommending “hard lenses”,
built from the instances not covered by the first rule:

If age = young and astigmatism = yes
and tear production rate = normal
then recommendation = hard

• These two rules cover all the “hard lenses” instances
• The process is then repeated with the other two classes
Testing for the Best Rule

• Measure 1: Accuracy (p/t)
§t = total number of instances covered by the rule
§p = number of these that are positive
§Produces rules that stop covering negative instances
as quickly as possible
§May produce rules with very small coverage
(special cases or noise?)
• Measure 2: Information gain, p (log(p/t) – log(P/T))
§P and T are the positive and total counts before the new
condition was added
§Information gain emphasizes positive rather than negative
instances
• These measures interact with the pruning mechanism used
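
Both measures are one-liners in Python (a worked sketch assuming base-2 logarithms, which the slide does not fix; the numbers in the comments are the “hard” counts from the contact lens example, P/T = 4/24 before and p/t = 4/12 after adding astigmatism = yes):

    import math

    def accuracy(p, t):
        return p / t

    def information_gain(p, t, P, T):
        # p, t: positive and total counts after adding the new condition
        # P, T: positive and total counts before
        return p * (math.log2(p / t) - math.log2(P / T))

    print(accuracy(4, 12))                 # 0.333...
    print(information_gain(4, 12, 4, 24))  # 4 * (log2(1/3) - log2(1/6)) = 4.0 bits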
Eliminating Instances

• Why do we need to eliminate instances?
§Otherwise, the next rule would be identical to the previous rule
• Why do we remove positive instances?
§To ensure that the next rule is different
§To prevent overestimating the accuracy of the rule
§So that we have a more robust estimate of accuracy (why?)
• Why do we remove negative instances?
§To prevent underestimating the accuracy of the rule
§Compare rules R2 and R3 in the following diagram
Eliminating Instances

(Figure: a two-class scatter plot of “+” and “-” instances with three rule regions R1, R2, and R3; removing the instances covered by R1 changes the apparent accuracy of the nearby rules R2 and R3.)
Missing Values and Numeric
Attributes
• Missing values usually fail the test
• Covering algorithm must either
§Use other tests to separate out positive instances
§Leave them uncovered until later in the process
• In some cases it is better to treat “missing” as a separate value
• Numeric attributes are treated as in decision trees
Stopping Criterion and Rule Pruning
• The process usually stops when there is no significant
improvement by adding the new rule
• Rule pruning is similar to post-pruning of decision trees
• Reduced Error Pruning:
§ Remove one of the conjuncts in the rule
§ Compare error rate on validation set
§ If error improves, prune the conjunct
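
Reduced error pruning for a single rule can be sketched as follows (a hedged illustration: the dictionary encoding of the rule and the caller-supplied error function are assumptions; the error function would evaluate the rule on the validation set):

    def reduced_error_pruning(rule, validation, error):
        """Greedily drop conjuncts while the validation error improves."""
        improved = True
        while improved and len(rule) > 1:
            improved = False
            base = error(rule, validation)
            for condition in list(rule):
                candidate = {a: v for a, v in rule.items() if a != condition}
                if error(candidate, validation) < base:   # error improves: prune
                    rule, improved = candidate, True
                    break
        return rule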
run the KNIME workflows
on decision rules
Indirect Methods
Rules vs. Trees
• Rule sets can be more readable
• Decision trees suffer from replicated subtrees
• Rule sets are collections of local models; trees represent one model
over the whole domain
• The covering algorithm concentrates on one class at a time,
whereas a decision tree learner takes all classes into account
Indirect Methods
Rule Set
r1: (P=No,Q=No) ==> -
r2: (P=No,Q=Yes) ==> +
r3: (P=Yes,R=No) ==> +
r4: (P=Yes,R=Yes,Q=No) ==> -
r5: (P=Yes,R=Yes,Q=Yes) ==> +
(Decision tree for the same concept: the root tests P; if P = No, test Q (No => -, Yes => +); if P = Yes, test R (No => +; if R = Yes, test Q: No => -, Yes => +).)
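
The conversion itself is a straightforward enumeration of root-to-leaf paths. A small sketch for the tree on this slide (the nested-tuple encoding is an assumption for illustration) reproduces rules r1 to r5:

    # internal node: (attribute, {branch value: subtree}); leaf: class label
    tree = ("P", {
        "No":  ("Q", {"No": "-", "Yes": "+"}),
        "Yes": ("R", {"No": "+",
                      "Yes": ("Q", {"No": "-", "Yes": "+"})}),
    })

    def tree_to_rules(node, conditions=()):
        if isinstance(node, str):                 # leaf: one rule per path
            return [(conditions, node)]
        attribute, branches = node
        rules = []
        for value, child in branches.items():
            rules.extend(tree_to_rules(child, conditions + ((attribute, value),)))
        return rules

    for conditions, label in tree_to_rules(tree):
        print(",".join(f"{a}={v}" for a, v in conditions), "==>", label)
    # prints the five rules r1..r5 above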
run the KNIME workflows
on decision rules
Summary
Summary
• Advantages of Rule-Based Classifiers
§As highly expressive as decision trees
§Easy to interpret
§Easy to generate
§Can classify new instances rapidly
§Performance comparable to decision trees
• Two approaches: direct and indirect methods
Summary
• Direct methods typically apply the sequential covering approach
§Grow a single rule
§Remove the instances covered by the rule
§Prune the rule (if necessary)
§Add the rule to the current rule set
§Repeat
• Other approaches exist
§Specific-to-general exploration (RISE)
§Post-processing of neural networks,
association rules, decision trees, etc.
Homework
• Generate the rule set for the Weather dataset by repeatedly
applying the procedure to learn one rule until no improvement
can be produced or the covered examples are too few
• Check the problems provided in the previous exams and apply
both OneRule and Sequential Covering to generate the first rule.
Then, check the result with one of the implementations available
in Weka
• Apply one or more sequential covering algorithms using the
KNIME workflow. Compute the usual performance statistics.