Introduction to Machine Learning
                    Lecture 13
    Introduction to Association Rules

                  Albert Orriols i Puig
                 aorriols@salle.url.edu

        Artificial Intelligence – Machine Learning
            Enginyeria i Arquitectura La Salle
                   Universitat Ramon Llull
Recap of Lecture 5-12

                          LET’S START WITH DATA
                             CLASSIFICATION




                                                               Slide 2
Artificial Intelligence                     Machine Learning
Recap of Lecture 5-12
                  Data Set          Classification Model        How?




We have seen four different types of approaches to classification:
         • Decision trees (C4.5)
         • Instance-based algorithms (kNN & CBR)
         • Bayesian classifiers (Naïve Bayes)
         • Neural networks (Perceptron, Adaline, Madaline, SVM)

                                                                       Slide 3
Artificial Intelligence               Machine Learning
Today’s Agenda


        Introduction to Association Rules
        A Taxonomy of Association Rules
        Measures of Interest
        Apriori




                                                  Slide 4
Artificial Intelligence        Machine Learning
Introduction to AR
        Ideas come from market basket analysis (MBA)
                Let’s go shopping!

           Milk, eggs, sugar, bread          Milk, eggs, cereal, bread          Eggs, sugar




              Customer1

                                     Customer2               Customer3

                What do my customers buy? Which products are bought together?
                Aim: Find associations and correlations between the different
                items that customers place in their shopping baskets
                                                                          Slide 5
Artificial Intelligence                Machine Learning
Introduction to AR
        Formalizing the problem a little bit
                Transaction database T: a set of transactions T = {t1, t2, …, tn}
                Each transaction contains a set of items (an itemset)
                An itemset is a collection of items I = {i1, i2, …, im}


        General aim:
                Find frequent/interesting patterns, associations, correlations, or
                causal structures among sets of items or elements in
                databases or other information repositories.
                Express these relationships in terms of association rules
                          X ⇒ Y



                                                                              Slide 6
Artificial Intelligence                  Machine Learning
Example of AR

       TID        Items                                     Examples:
       T1         bread, jelly, peanut-butter
                                                                    bread ⇒ peanut-butter
       T2         bread, peanut-butter
                                                                   beer ⇒ bread
       T3         bread, milk, peanut-butter
       T4         beer, bread
       T5         beer, milk




        Frequent itemsets: Items that frequently appear together
                I = {bread, peanut-butter}
                I = {beer, bread}


                                                                                           Slide 7
Artificial Intelligence                         Machine Learning
What’s an Interesting Rule?
        Support count (σ)                                          TID   Items
                Frequency of occurrence of                         T1    bread, jelly, peanut-butter
                an itemset                                         T2    bread, peanut-butter
                          σ ({bread, peanut-butter}) = 3           T3    bread, milk, peanut-butter
                          σ ({beer, bread}) = 1                    T4    beer, bread
                                                                   T5    beer, milk
        Support
                Fraction of transactions that
                contain an itemset
                          s ({bread, peanut-butter}) = 3/5
                          s ({beer, bread}) = 1/5

        Frequent itemset
                An itemset whose support is greater than or equal to a
                minimum support threshold (minsup)
                                                                                         Slide 8
Artificial Intelligence                       Machine Learning
What’s an Interesting Rule?
        An association rule is an                            TID   Items
        implication of two itemsets                          T1    bread, jelly, peanut-butter
                X ⇒ Y                                        T2    bread, peanut-butter
                                                             T3    bread, milk, peanut-butter
                                                             T4    beer, bread
        Many measures of interest.                           T5    beer, milk
        The two most used are:
                Support (s)
                   The occurring frequency of the rule,
                   i.e., the fraction of transactions that
                   contain both X and Y
                           s = σ(X ∪ Y) / # of trans.
                Confidence (c)
                   The strength of the association,
                   i.e., how often items in Y appear in
                   transactions that contain X
                           c = σ(X ∪ Y) / σ(X)
                                                                                       Slide 9
Artificial Intelligence                   Machine Learning
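As a quick illustration of these two measures, here is a minimal Python sketch that computes σ, s, and c on the toy transaction database from the slides. The names transactions, support_count, support, and confidence are illustrative, not part of the lecture material.

# Minimal sketch: support count, support, and confidence of an association rule.
transactions = [
    {"bread", "jelly", "peanut-butter"},   # T1
    {"bread", "peanut-butter"},            # T2
    {"bread", "milk", "peanut-butter"},    # T3
    {"beer", "bread"},                     # T4
    {"beer", "milk"},                      # T5
]

def support_count(itemset, db):
    """sigma(X): number of transactions that contain every item of X."""
    return sum(1 for t in db if itemset <= t)

def support(itemset, db):
    """s(X): fraction of transactions that contain X."""
    return support_count(itemset, db) / len(db)

def confidence(X, Y, db):
    """c(X => Y) = sigma(X union Y) / sigma(X)."""
    return support_count(X | Y, db) / support_count(X, db)

print(support_count({"bread", "peanut-butter"}, transactions))   # 3
print(support({"bread", "peanut-butter"}, transactions))         # 0.6
print(confidence({"bread"}, {"peanut-butter"}, transactions))    # 0.75
print(confidence({"peanut-butter"}, {"bread"}, transactions))    # 1.0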
Interestingness of Rules
                      Rule                  s        c       TID   Items
bread ⇒ peanut-butter                      0.60     0.75     T1    bread, jelly, peanut-butter
peanut-butter ⇒ bread                      0.60     1.00     T2    bread, peanut-butter
beer ⇒ bread                               0.20     0.50     T3    bread, milk, peanut-butter
peanut-butter ⇒ jelly                      0.20     0.33     T4    beer, bread
jelly ⇒ peanut-butter                      0.20     1.00     T5    beer, milk
jelly ⇒ milk                               0.00     0.00



        Many other interestingness measures exist
                The methods presented herein are based on these two
                measures



                                                                                    Slide 10
Artificial Intelligence              Machine Learning
Types of AR
        Binary association rules:
                bread ⇒ peanut-butter


        Quantitative association rules:
                weight in [70kg – 90kg] ⇒ height in [170cm – 190cm]


        Fuzzy association rules:
                weight in TALL ⇒ height in TALL


        Let’s start from the beginning
                Binary association rules – Apriori

                                                                      Slide 11
Artificial Intelligence                 Machine Learning
Apriori
        This is the most influential AR miner
        It consists of two steps:
        1.       Generate all frequent itemsets whose support ≥ minsup
        2.       Use frequent itemsets to generate association rules

        So, let’s pay attention to the first step




                                                                         Slide 12
Artificial Intelligence                Machine Learning
Apriori
                                              null




               A                 B                C             D            E




AB              AC         AD        AE     BC          BD     BE     CD    CE        DE




ABC            ABD         ABE       ACD    ACE        ADE     BCD    BCE   BDE       CDE




            ABCD                 ABCE        ABDE              ACDE         BCDE




                                            ABCDE

                     Given d items, we have 2^d possible itemsets.
                           Do I have to generate them all?
                                                                                   Slide 13
 Artificial Intelligence                    Machine Learning
Apriori
        Let’s avoid expanding the whole graph
        Key idea:
                Downward closure property: any subset of a frequent itemset
                is also a frequent itemset


        Therefore, the algorithm iteratively does:
                Create itemsets
                Only continue exploration of those whose support ≥ minsup




                                                                        Slide 14
Artificial Intelligence               Machine Learning
Example Itemset Generation
                                             null
        Infrequent
          itemset

               A                 B               C             D            E




AB              AC         AD        AE    BC          BD     BE     CD    CE        DE




ABC            ABD         ABE       ACD   ACE        ADE     BCD    BCE   BDE       CDE




            ABCD                 ABCE       ABDE              ACDE         BCDE




                                            ABCDE

                     Given d items, we have 2^d possible itemsets.
                           Do I have to generate them all?
                                                                                  Slide 15
 Artificial Intelligence                   Machine Learning
Recovering the Example
                                                     TID    Items
                                                     T1     bread, jelly, peanut-butter
                                                     T2     bread, peanut-butter
                                                     T3     bread, milk, peanut-butter
Minimum support = 3                                  T4     beer, bread
                                                     T5     beer, milk

          1-itemsets                            2-itemsets
  Item                    count                 Item                       count
  bread                     4                   bread, peanut-b              3
  peanut-b                  3
  jelly                     1
  milk                      2
  beer                      2
Artificial Intelligence           Machine Learning
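The two counting passes above can be sketched as follows, reusing the transactions list from the earlier snippet and treating minsup = 3 as a support count, as on the slide; c1, f1, c2, and f2 are illustrative names.

from collections import Counter
from itertools import combinations

minsup = 3   # support count threshold, as on the slide

# first pass: count 1-itemsets and keep the frequent ones
c1 = Counter(item for t in transactions for item in t)
f1 = {frozenset([i]) for i, n in c1.items() if n >= minsup}
# bread (4) and peanut-butter (3) survive

# second pass: count candidate 2-itemsets built from frequent items only
c2 = Counter()
for t in transactions:
    for pair in combinations(sorted(t), 2):
        if all(frozenset([i]) in f1 for i in pair):
            c2[frozenset(pair)] += 1
f2 = {s for s, n in c2.items() if n >= minsup}
# only {bread, peanut-butter} (3) survives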
Apriori Algorithm
        k=1
        Generate frequent itemsets of length 1
        Repeat until no frequent itemsets are found
                k := k+1
                Generate candidate itemsets of size k from the frequent itemsets of size k-1
                Compute the support of each candidate by scanning DB




                                                                             Slide 17
Artificial Intelligence               Machine Learning
Apriori Algorithm
Algorithm Apriori(T)
    C1 ← init-pass(T);
    F1 ← {f | f ∈ C1, f.count/n ≥ minsup}; // n: no. of transactions in T
    for (k = 2; Fk-1 ≠ ∅; k++) do
        Ck ← candidate-gen(Fk-1);
        for each transaction t ∈ T do
           for each candidate c ∈ Ck do
                 if c is contained in t then
                    c.count++;
           end
        end
         Fk ← {c ∈ Ck | c.count/n ≥ minsup}
    end
return F ← Uk Fk;

                                                                  Slide 18
Artificial Intelligence         Machine Learning
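A compact Python rendering of this level-wise loop is sketched below, assuming a fractional minsup as in the pseudocode (f.count/n ≥ minsup). The helper candidate_gen is sketched after the next slide; function and variable names are illustrative.

from collections import Counter

def apriori(transactions, minsup):
    n = len(transactions)
    # first pass: frequent 1-itemsets
    counts = Counter(frozenset([i]) for t in transactions for i in t)
    F = [{c for c, cnt in counts.items() if cnt / n >= minsup}]
    k = 2
    while F[-1]:
        Ck = candidate_gen(F[-1], k)       # see the sketch after the next slide
        counts = Counter()
        for t in transactions:             # one database scan per level
            for c in Ck:
                if c <= t:                 # c is contained in t
                    counts[c] += 1
        F.append({c for c in Ck if counts[c] / n >= minsup})
        k += 1
    return set().union(*F)                 # union of the frequent itemsets of all sizes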
Apriori Algorithm
Function candidate-gen(Fk-1)
   Ck ← ∅;
   forall f1, f2 ∈ Fk-1
       with f1 = {i1, … , ik-2, ik-1}
       and f2 = {i1, … , ik-2, i’k-1}
       and ik-1 < i’k-1 do
      c ← {i1, …, ik-1, i’k-1};       // join f1 and f2
      Ck ← Ck ∪ {c};
      for each (k-1)-subset s of c do
       if (s ∉ Fk-1) then
           delete c from Ck;          // prune
      end
   end
   return Ck;

                                                          Slide 19
Artificial Intelligence         Machine Learning
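A Python sketch of this step is given below. Instead of the exact prefix-based join of the pseudocode, it joins any two frequent (k-1)-itemsets whose union has exactly k items; after the subset-based prune this yields the same candidates, although it may compare more pairs.

from itertools import combinations

def candidate_gen(F_prev, k):
    """F_prev: the frequent (k-1)-itemsets, as frozensets."""
    Ck = set()
    prev = list(F_prev)
    for i, f1 in enumerate(prev):
        for f2 in prev[i + 1:]:
            union = f1 | f2
            if len(union) != k:            # f1 and f2 must differ in exactly one item
                continue
            # prune: every (k-1)-subset of the candidate must itself be frequent
            if all(frozenset(s) in F_prev for s in combinations(union, k - 1)):
                Ck.add(union)
    return Ck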
Example of Apriori Run
Database TDB                            C1 (1st scan)              L1
Tid          Items                      Itemset      sup           Itemset      sup
10           A, C, D                    {A}           2            {A}           2
20           B, C, E                    {B}           3            {B}           3
30           A, B, C, E                 {C}           3            {C}           3
40           B, E                       {D}           1            {E}           3
                                        {E}           3

C2 (candidates from L1)                 C2 (2nd scan)              L2
Itemset                                 Itemset      sup           Itemset      sup
{A, B}                                  {A, B}        1            {A, C}        2
{A, C}                                  {A, C}        2            {B, C}        2
{A, E}                                  {A, E}        1            {B, E}        3
{B, C}                                  {B, C}        2            {C, E}        2
{B, E}                                  {B, E}        3
{C, E}                                  {C, E}        2

C3                                      L3 (3rd scan)
Itemset                                 Itemset          sup
{B, C, E}                               {B, C, E}         2
                                                                                                    Slide 20
  Artificial Intelligence                      Machine Learning
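Running the sketches above on this TDB example should reproduce the trace; the relative minsup of 0.5 (a count of 2 out of 4 transactions) is inferred from the trace, since {D} with support 1 is dropped while every itemset with support 2 is kept.

tdb = [
    {"A", "C", "D"},        # Tid 10
    {"B", "C", "E"},        # Tid 20
    {"A", "B", "C", "E"},   # Tid 30
    {"B", "E"},             # Tid 40
]
frequent = apriori(tdb, minsup=0.5)
# expected, matching the trace: {A}, {B}, {C}, {E},
# {A,C}, {B,C}, {B,E}, {C,E} and {B,C,E}
for itemset in sorted(frequent, key=lambda s: (len(s), sorted(s))):
    print(sorted(itemset))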
Apriori
        Remember that Apriori consists of two steps:
        1.       Generate all frequent itemsets whose support ≥ minsup
        2.       Use frequent itemsets to generate association rules

        We accomplished step 1, so we have all the frequent itemsets
        So, let’s pay attention to the second step




                                                                         Slide 21
Artificial Intelligence                Machine Learning
Rule Generation in Apriori
        Given a frequent itemset L
                Find all non-empty proper subsets F of L such that the association
                rule F ⇒ {L-F} satisfies the minimum confidence
                Create the rule F ⇒ {L-F}

        If L = {A, B, C}
                The candidate rules are: AB⇒C, AC⇒B, BC⇒A, A⇒BC,
                B⇒AC, C⇒AB
                In general, there are 2^k - 2 candidate rules, where k is the
                length of the itemset L




                                                                            Slide 22
Artificial Intelligence                Machine Learning
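A direct sketch of this enumeration, reusing support_count and the tdb list from the earlier snippets; minconf = 0.7 is an assumed value chosen only for illustration.

from itertools import combinations

def rules_from_itemset(L, db, minconf):
    """Enumerate every rule F => L-F with F a non-empty proper subset of L."""
    L = frozenset(L)
    sigma_L = support_count(L, db)
    rules = []
    for r in range(1, len(L)):                     # antecedent sizes 1 .. |L|-1
        for F in map(frozenset, combinations(L, r)):
            conf = sigma_L / support_count(F, db)
            if conf >= minconf:
                rules.append((set(F), set(L - F), conf))
    return rules

# e.g. for the frequent itemset {B, C, E} of the TDB example
for X, Y, conf in rules_from_itemset({"B", "C", "E"}, tdb, minconf=0.7):
    print(X, "=>", Y, round(conf, 2))              # BC => E and CE => B, both 1.0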
Can you Be More Efficient?
        Can we apply the same trick used with support?
                Confidence does not have the anti-monotone property
                That is, is c(AB⇒D) > c(A⇒D)?
                          We don’t know!

        But the confidence of rules generated from the same itemset
        does have an anti-monotone property
                L = {A, B, C, D}
                          c(ABC⇒D) ≥ c(AB⇒CD) ≥ c(A⇒BCD)
                We can apply this property to prune the rule generation




                                                                       Slide 23
Artificial Intelligence                  Machine Learning
Example of Efficient Rule Generation

                                              ABCD
   Low
confidence


             ABC⇒D                ABD⇒C                 ACD⇒B             BCD⇒A




AB⇒CD                     AC⇒BD      BC⇒AD              AD⇒BC         BD⇒AC           CD⇒AB




               A⇒BCD               B⇒ACD                      C⇒ABD           D⇒ABC




                                                                                        Slide 24
Artificial Intelligence                    Machine Learning
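The pruning idea in the figure can be sketched as follows, under the same assumptions as the previous snippets: consequents are grown level by level, and a consequent is extended only if its rule already reached minconf, so low-confidence branches of the lattice are never explored. This is a sketch of the idea rather than of a specific published procedure.

def rules_with_pruning(L, db, minconf):
    L = frozenset(L)
    sigma_L = support_count(L, db)
    rules = []
    consequents = [frozenset([i]) for i in L]      # level 1: single-item consequents
    while consequents:
        survivors = []
        for Y in consequents:
            X = L - Y
            if not X:
                continue                           # the antecedent must be non-empty
            conf = sigma_L / support_count(X, db)
            if conf >= minconf:
                rules.append((set(X), set(Y), conf))
                survivors.append(Y)
            # else: no rule whose consequent is a superset of Y can reach minconf
        # next level: merge surviving consequents into consequents one item larger
        consequents = list({y1 | y2 for y1 in survivors for y2 in survivors
                            if len(y1 | y2) == len(y1) + 1})
    return rules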
Challenges in AR Mining
        Challenges
                Apriori scans the database multiple times
                Most often, there is a high number of candidates
                Support counting for candidates can be time-expensive

        Several methods try to improve on these points by
                Reducing the number of scans of the database
                Shrinking the number of candidates
                Counting the support of candidates more efficiently




                                                                        Slide 25
Artificial Intelligence                Machine Learning
Next Class



        Advanced topics in association rule mining




                                                     Slide 26
Artificial Intelligence      Machine Learning
Introduction to Machine Learning
                    Lecture 13
    Introduction to Association Rules

                  Albert Orriols i Puig
                 aorriols@salle.url.edu

        Artificial Intelligence – Machine Learning
            Enginyeria i Arquitectura La Salle
                   Universitat Ramon Llull
