Machine Learning and Inductive Inference (Hendrik Blockeel, 2001-2002)
1  Introduction
Practical information about the course
What is machine learning?
Inductive inference: reasoning from a sample to the whole population. Observation: "these dogs are all brown"; hypothesis: "all dogs are brown".
What is it useful for?
Knowledge discovery
Example: given molecules that are active against some disease, find out what they have in common; this is probably the reason for their activity.
Learning to perform difficult tasks
Adaptive systems
Illustration: building a system that learns checkers
Overview of design choices: type of training experience (games against self, games against an expert, a table of good moves) → type of target function (e.g. Board → Move, or Board → value) → representation (e.g. a linear function of 6 features) → learning algorithm (e.g. gradient descent) → ready!
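The last two choices (a linear evaluation function over board features, trained by gradient descent) can be sketched as follows. This is an illustrative LMS-style sketch, not the slides' code: the two stand-in features and the synthetic training signal are my own assumptions.

```python
import random

def evaluate(weights, features):
    """Linear target-function representation: w0 + w1*f1 + ... + wn*fn."""
    return weights[0] + sum(w * f for w, f in zip(weights[1:], features))

def lms_update(weights, features, v_train, lr=0.05):
    """One gradient-descent (LMS) step: move the estimate toward v_train."""
    error = v_train - evaluate(weights, features)
    weights[0] += lr * error
    for i, f in enumerate(features):
        weights[i + 1] += lr * error * f

# Illustrative training loop on a synthetic value function
# (two stand-in features instead of the slides' six board features).
random.seed(0)
weights = [0.0, 0.0, 0.0]
for _ in range(5000):
    f = [random.random(), random.random()]
    v_train = 1.0 + 2.0 * f[0] - 3.0 * f[1]  # pretend training signal
    lms_update(weights, f, v_train)
```

In the checkers setting the training value for a board is typically derived from the learner's own estimate of the successor position, so the system can improve by playing against itself.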
Some issues that influence choices
Typical learning tasks
Concept learning: supervised. Given examples labelled + or -, learn a concept C : X → {true, false} that separates them.
Concept learning: unsupervised. Examples come unlabelled; the learner itself groups them into concepts C1, C2, C3.
Function learning. Examples are labelled with numbers (0.6, 0.9, 1.4, 2.1, 2.7); learn a function f : X → ℝ that predicts such values.
Clustering
Finding descriptive patterns
Representation of data
Brief overview of approaches
Overview of the course
2  Version Spaces
Basic principles
An example
Hypotheses can be partially ordered by generality: here h2 is more specific than h1, and h3 is incomparable with h1.
Version space boundaries
Example, continued: S = {h1}, G = {h2, h3}; h1 is the most specific hypothesis, h2 and h3 are both most general hypotheses.
Computing the version space
Candidate Elimination Algorithm: demonstration with rectangles
Candidate elimination trace (hypotheses are rectangles <x-range, y-range> on a 6×6 grid):
Initialization: S = {<  ,  >} (the most specific hypothesis, covering nothing), G = {<1-6, 1-6>} (the most general)
(3,2) : +  →  generalize S minimally to cover it: S = {<3-3, 2-2>}, G = {<1-6, 1-6>}
(5,4) : -  →  specialize G minimally to exclude it: S = {<3-3, 2-2>}, G = {<1-4, 1-6>, <1-6, 1-3>}
(2,4) : -  →  specialize G again: S = {<3-3, 2-2>}, G = {<3-4, 1-6>, <1-6, 1-3>}
(5,3) : +  →  generalize S: S = {<3-5, 2-3>}; remove from G any hypothesis not covering the example: G = {<1-6, 1-3>}
The current version space contains all rectangles that cover S and are covered by G, e.g. h = <2-5, 2-3>. S = {<3-5, 2-3>}, G = {<1-6, 1-3>}
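The rectangle trace above can be reproduced with a small program. This is a sketch under my own representation assumptions: hypotheses are (xlo, xhi, ylo, yhi) tuples on an integer grid, and all function names are mine.

```python
def covers(h, p):
    """Does rectangle h = (xlo, xhi, ylo, yhi) cover point p = (x, y)?"""
    return h is not None and h[0] <= p[0] <= h[1] and h[2] <= p[1] <= h[3]

def more_general(g, s):
    """Is rectangle g at least as general as (i.e. does it contain) s?"""
    return g[0] <= s[0] and g[1] >= s[1] and g[2] <= s[2] and g[3] >= s[3]

def generalize(h, p):
    """Minimal generalization of the S-hypothesis so that it covers p."""
    if h is None:                      # <  ,  >: covers nothing yet
        return (p[0], p[0], p[1], p[1])
    return (min(h[0], p[0]), max(h[1], p[0]),
            min(h[2], p[1]), max(h[3], p[1]))

def specializations(g, p):
    """Minimal specializations of g that exclude p (integer coordinates)."""
    out = [(g[0], p[0] - 1, g[2], g[3]), (p[0] + 1, g[1], g[2], g[3]),
           (g[0], g[1], g[2], p[1] - 1), (g[0], g[1], p[1] + 1, g[3])]
    return [h for h in out if h[0] <= h[1] and h[2] <= h[3]]

def candidate_elimination(examples, most_general):
    S, G = None, [most_general]
    for p, positive in examples:
        if positive:
            S = S if covers(S, p) else generalize(S, p)
            G = [g for g in G if covers(g, p)]   # drop g's missing a positive
        else:
            newG = []
            for g in G:
                if covers(g, p):                 # g must be specialized
                    newG += [h for h in specializations(g, p)
                             if S is None or more_general(h, S)]
                else:
                    newG.append(g)
            # keep only the maximally general hypotheses
            G = [g for g in newG
                 if not any(h != g and more_general(h, g) for h in newG)]
    return S, G
```

Running it on the four examples of the trace reproduces S = {<3-5, 2-3>} and G = {<1-6, 1-3>}.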
Difficulties with version space approaches
Inductive bias
Equivalence between inductive and deductive systems: an inductive system takes training examples and a new instance and produces a result by an inductive leap; a deductive system takes the training examples, the new instance and the inductive bias, and produces the same result by proof.
Definition of inductive bias
Effect of inductive bias
Inductive bias of version spaces
Unbiased version spaces
To remember
3  Induction of decision trees
What are decision trees?
Example decision tree 1:
Outlook = Sunny → test Humidity: High → No, Normal → Yes
Outlook = Overcast → Yes
Outlook = Rainy → test Wind: Strong → No, Weak → Yes
Example decision tree 2 (leaves show class counts and relative frequencies), with tests Fetal_Presentation (values 1, 2, 3), Previous_Csection (0, 1) and Primiparous, and leaf distributions [3+, 29-] (.11+ .89-), [8+, 22-] (.27+ .73-), [55+, 35-] (.61+ .39-).
Representation power
Representing boolean formulae, e.g. A ∨ B as a tree: test A; if true then true, if false test B (true → true, false → false).
Classification, Regression and Clustering trees
Example decision tree 3 (from a study of river water quality)
Clustering tree (abundance(...) uses "standardized" values: how many standard deviations above the mean):
abundance(Tubifex sp., 5)?
- one branch is a leaf: T = 0.357111, pH = -0.496808, cond = 1.23151, O2 = -1.09279, O2sat = -1.04837, CO2 = 0.893152, hard = 0.988909, NO2 = 0.54731, NO3 = 0.426773, NH4 = 1.11263, PO4 = 0.875459, Cl = 0.86275, SiO2 = 0.997237, KMnO4 = 1.29711, K2Cr2O7 = 0.97025, BOD = 0.67012
- the other branch tests abundance(Sphaerotilus natans, 5)?, one of whose leaves is: T = 0.0129737, pH = -0.536434, cond = 0.914569, O2 = -0.810187, O2sat = -0.848571, CO2 = 0.443103, hard = 0.806137, NO2 = 0.4151, NO3 = -0.0847706, NH4 = 0.536927, PO4 = 0.442398, Cl = 0.668979, SiO2 = 0.291415, KMnO4 = 1.08462, K2Cr2O7 = 0.850733, BOD = 0.651707
Top-Down Induction of Decision Trees
Finding the best test (for classification trees)
Entropy: for a set S with a proportion p+ of positive and p- of negative examples, Entropy(S) = -p+ log2 p+ - p- log2 p-
Information gain of a test A: Gain(S, A) = Entropy(S) - Σv (|Sv|/|S|) Entropy(Sv), where Sv contains the examples of S for which A has value v
Example: S = [9+, 5-], Entropy(S) = 0.940
Humidity: High → [3+, 4-], E = 0.985; Normal → [6+, 1-], E = 0.592
Gain(S, Humidity) = 0.940 - (7/14)·0.985 - (7/14)·0.592 = 0.151
Wind: Weak → [6+, 2-], E = 0.811; Strong → [3+, 3-], E = 1.0
Gain(S, Wind) = 0.940 - (8/14)·0.811 - (6/14)·1.0 = 0.048
Outlook is chosen as the first test: [9+, 5-] splits into Sunny [2+, 3-] (split further), Overcast [4+, 0-] (leaf: Yes), Rainy [3+, 2-] (split further).
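The gain computations above can be reproduced in a few lines. A minimal sketch; representing samples as (positive, negative) count pairs is my own choice:

```python
from math import log2

def entropy(pos, neg):
    """Entropy of a sample with pos positive and neg negative examples."""
    total = pos + neg
    return -sum(c / total * log2(c / total) for c in (pos, neg) if c)

def gain(parent, subsets):
    """Information gain; parent and each subset are (pos, neg) count pairs."""
    n = sum(p + q for p, q in subsets)
    remainder = sum((p + q) / n * entropy(p, q) for p, q in subsets)
    return entropy(*parent) - remainder

g_humidity = gain((9, 5), [(3, 4), (6, 1)])  # about 0.152 (slide: 0.151)
g_wind = gain((9, 5), [(6, 2), (3, 3)])      # about 0.048
```

Exact arithmetic gives 0.1518 for Humidity; the slide's 0.151 comes from rounding the intermediate entropies before combining them.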
Hypothesis space search in TDIDT
Inductive bias in TDIDT
Occam's Razor
Avoiding Overfitting
Overfitting: example. Figure: a decision boundary bent around a few stray training examples creates an area with probably wrong predictions.
Overfitting: effect on predictive accuracy. Figure: as tree size grows, accuracy on training data keeps rising while accuracy on unseen data levels off and then drops; overfitting starts about where the curves diverge.
How to avoid overfitting when building classification trees?
Stopping criteria
Post-pruning trees
Figure: accuracy on training data vs. accuracy on unseen data plotted against tree size, showing the effect of pruning.
Comparison
Turning trees into rules
Rules from trees: example. For the Outlook/Humidity/Wind tree:
if Outlook = Sunny and Humidity = High then No
if Outlook = Sunny and Humidity = Normal then Yes
…
Pruning rules
Pruning rules: example. The tree for A ∨ B (test A: true → true; false → test B: true → true, false → false) yields:
if A = true then true
if A = false and B = true then true
if A = false and B = false then false
The tree represents A ∨ B; the rules represent A ∨ (¬A ∧ B), which is equivalent to A ∨ B.
Alternative heuristics for choosing tests
Properties of good heuristics: consider a node with [80-, 20+] split by test A1 into [40-, 0+] and [40-, 20+], or by test A2 into [40-, 10+] and [40-, 10+]. How would accuracy and information gain rate these splits?
Heuristics compared: good heuristics are strictly concave.
Why concave functions? Assume a node with size n, entropy E and proportion of positives p is split into two nodes with (n1, E1, p1) and (n2, E2, p2). We have p = (n1/n)p1 + (n2/n)p2, and the new average entropy E' = (n1/n)E1 + (n2/n)E2 is therefore found by linear interpolation between (p1, E1) and (p2, E2) at p. Gain = the difference in height between (p, E) and (p, E'), which is positive for every split with p1 ≠ p2 exactly when the curve is strictly concave.
Handling missing values
Generic TDIDT algorithm:

function TDIDT(E: set of examples) returns tree;
  T' := grow_tree(E);
  T := prune(T');
  return T;

function grow_tree(E: set of examples) returns tree;
  T := generate_tests(E);
  t := best_test(T, E);
  P := partition induced on E by t;
  if stop_criterion(E, P)
  then return leaf(info(E))
  else
    for all Ej in P: tj := grow_tree(Ej);
    return node(t, {(j, tj)});
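The pluggable functions above (generate_tests, best_test, stop_criterion, info) can be instantiated for classification with information gain. A minimal Python sketch under my own representation assumptions: examples are (attribute-dict, label) pairs, tests are equality tests on discrete attributes.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Entropy of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def best_test(examples, attrs):
    """Pick the attribute whose partition has maximal information gain."""
    labels = [y for _, y in examples]
    def gain(a):
        parts = {}
        for x, y in examples:
            parts.setdefault(x[a], []).append(y)
        rem = sum(len(p) / len(examples) * entropy(p) for p in parts.values())
        return entropy(labels) - rem
    return max(attrs, key=gain)

def grow_tree(examples, attrs):
    labels = [y for _, y in examples]
    # stop criterion: pure node, or no tests left -> leaf with majority class
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]
    t = best_test(examples, attrs)
    branches = {}
    for v in {x[t] for x, _ in examples}:
        subset = [(x, y) for x, y in examples if x[t] == v]
        branches[v] = grow_tree(subset, [a for a in attrs if a != t])
    return (t, branches)

def predict(tree, x):
    """Follow tests down to a leaf (unseen attribute values raise KeyError)."""
    while isinstance(tree, tuple):
        test, branches = tree
        tree = branches[x[test]]
    return tree
```

There is no pruning step here; a real instantiation would post-prune the grown tree as discussed above.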
For classification...
For regression... For example, a node whose examples have target values {1, 3, 4, 7, 8, 12}: test A1 splits them into {1, 4, 12} and {3, 7, 8}, test A2 into {1, 3, 7} and {4, 8, 12}; prefer the test whose subsets are more homogeneous (lower variance).
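The two candidate splits can be compared with variance reduction, the standard regression-tree criterion; the helper below is my own sketch.

```python
from statistics import pvariance

def variance_reduction(parent, subsets):
    """Reduction in (population) variance of the target values after a split."""
    n = len(parent)
    before = pvariance(parent)
    after = sum(len(s) / n * pvariance(s) for s in subsets)
    return before - after

vals = [1, 3, 4, 7, 8, 12]
a1 = variance_reduction(vals, [[1, 4, 12], [3, 7, 8]])
a2 = variance_reduction(vals, [[1, 3, 7], [4, 8, 12]])
# a2 > a1: test A2 produces the more homogeneous subsets
```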
CART
n-dimensional target spaces
Clustering tree: the same river-water-quality clustering tree shown earlier (example decision tree 3).
To Remember
4  Neural networks
Artificial neural networks
Perceptrons: inputs x1 … x5 arrive with weights w1 … w5; the unit computes X = Σ wi xi and applies a threshold function: Y = -1 if X < t, Y = 1 otherwise.
2-input perceptron: the decision boundary is a straight line separating the +1 region from the -1 region.
n-input perceptrons
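The threshold unit above can be trained with the classic perceptron rule. A minimal sketch; the training loop and the AND example are illustrative assumptions, not from the slides.

```python
def predict(w, t, x):
    """Threshold unit: Y = -1 if the weighted sum is below threshold t, else 1."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= t else -1

def train(data, epochs=20, lr=0.1):
    """Perceptron rule: on each mistake, nudge the weights and the threshold."""
    w, t = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            out = predict(w, t, x)
            if out != y:
                for i in range(len(w)):
                    w[i] += lr * (y - out) * x[i]
                t -= lr * (y - out)  # raising t acts like lowering a bias
    return w, t

# AND is linearly separable, so the perceptron learns it exactly.
and_data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w, t = train(and_data)
```

XOR, by contrast, is not linearly separable and no weight setting classifies all four cases correctly, which is what motivates the multi-layer networks on the next slide.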
Multi-layer networks. Figure: inputs X and Y feed two hidden-layer neurons (neuron 1, neuron 2), whose outputs feed an output-layer neuron; the connections carry weights of +1 and -1.
For instance: 1 → 101, 2 → 100, 3 → 011, 4 → 111, 5 → 000, 6 → 010, 7 → 110, 8 → 001 (eight distinct values encoded in three bits).
Training neural networks
Properties of neural networks
To remember
 
module 6 (1).ppt
module 6 (1).pptmodule 6 (1).ppt
module 6 (1).ppt
 
AML_030607.ppt
AML_030607.pptAML_030607.ppt
AML_030607.ppt
 
Machine learning para tertulianos, by javier ramirez at teowaki
Machine learning para tertulianos, by javier ramirez at teowakiMachine learning para tertulianos, by javier ramirez at teowaki
Machine learning para tertulianos, by javier ramirez at teowaki
 
Machine_Learning.pptx
Machine_Learning.pptxMachine_Learning.pptx
Machine_Learning.pptx
 
Machine learning-in-details-with-out-python-code
Machine learning-in-details-with-out-python-codeMachine learning-in-details-with-out-python-code
Machine learning-in-details-with-out-python-code
 
Learning
LearningLearning
Learning
 
Lec1 intoduction.pptx
Lec1 intoduction.pptxLec1 intoduction.pptx
Lec1 intoduction.pptx
 
Statistical foundations of ml
Statistical foundations of mlStatistical foundations of ml
Statistical foundations of ml
 

More from butest

EL MODELO DE NEGOCIO DE YOUTUBE
EL MODELO DE NEGOCIO DE YOUTUBEEL MODELO DE NEGOCIO DE YOUTUBE
EL MODELO DE NEGOCIO DE YOUTUBEbutest
 
1. MPEG I.B.P frame之不同
1. MPEG I.B.P frame之不同1. MPEG I.B.P frame之不同
1. MPEG I.B.P frame之不同butest
 
LESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALLESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALbutest
 
Timeline: The Life of Michael Jackson
Timeline: The Life of Michael JacksonTimeline: The Life of Michael Jackson
Timeline: The Life of Michael Jacksonbutest
 
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...butest
 
LESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALLESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALbutest
 
Com 380, Summer II
Com 380, Summer IICom 380, Summer II
Com 380, Summer IIbutest
 
The MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
The MYnstrel Free Press Volume 2: Economic Struggles, Meet JazzThe MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
The MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazzbutest
 
MICHAEL JACKSON.doc
MICHAEL JACKSON.docMICHAEL JACKSON.doc
MICHAEL JACKSON.docbutest
 
Social Networks: Twitter Facebook SL - Slide 1
Social Networks: Twitter Facebook SL - Slide 1Social Networks: Twitter Facebook SL - Slide 1
Social Networks: Twitter Facebook SL - Slide 1butest
 
Facebook
Facebook Facebook
Facebook butest
 
Executive Summary Hare Chevrolet is a General Motors dealership ...
Executive Summary Hare Chevrolet is a General Motors dealership ...Executive Summary Hare Chevrolet is a General Motors dealership ...
Executive Summary Hare Chevrolet is a General Motors dealership ...butest
 
Welcome to the Dougherty County Public Library's Facebook and ...
Welcome to the Dougherty County Public Library's Facebook and ...Welcome to the Dougherty County Public Library's Facebook and ...
Welcome to the Dougherty County Public Library's Facebook and ...butest
 
NEWS ANNOUNCEMENT
NEWS ANNOUNCEMENTNEWS ANNOUNCEMENT
NEWS ANNOUNCEMENTbutest
 
C-2100 Ultra Zoom.doc
C-2100 Ultra Zoom.docC-2100 Ultra Zoom.doc
C-2100 Ultra Zoom.docbutest
 
MAC Printing on ITS Printers.doc.doc
MAC Printing on ITS Printers.doc.docMAC Printing on ITS Printers.doc.doc
MAC Printing on ITS Printers.doc.docbutest
 
Mac OS X Guide.doc
Mac OS X Guide.docMac OS X Guide.doc
Mac OS X Guide.docbutest
 
WEB DESIGN!
WEB DESIGN!WEB DESIGN!
WEB DESIGN!butest
 

More from butest (20)

EL MODELO DE NEGOCIO DE YOUTUBE
EL MODELO DE NEGOCIO DE YOUTUBEEL MODELO DE NEGOCIO DE YOUTUBE
EL MODELO DE NEGOCIO DE YOUTUBE
 
1. MPEG I.B.P frame之不同
1. MPEG I.B.P frame之不同1. MPEG I.B.P frame之不同
1. MPEG I.B.P frame之不同
 
LESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALLESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIAL
 
Timeline: The Life of Michael Jackson
Timeline: The Life of Michael JacksonTimeline: The Life of Michael Jackson
Timeline: The Life of Michael Jackson
 
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
 
LESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALLESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIAL
 
Com 380, Summer II
Com 380, Summer IICom 380, Summer II
Com 380, Summer II
 
PPT
PPTPPT
PPT
 
The MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
The MYnstrel Free Press Volume 2: Economic Struggles, Meet JazzThe MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
The MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
 
MICHAEL JACKSON.doc
MICHAEL JACKSON.docMICHAEL JACKSON.doc
MICHAEL JACKSON.doc
 
Social Networks: Twitter Facebook SL - Slide 1
Social Networks: Twitter Facebook SL - Slide 1Social Networks: Twitter Facebook SL - Slide 1
Social Networks: Twitter Facebook SL - Slide 1
 
Facebook
Facebook Facebook
Facebook
 
Executive Summary Hare Chevrolet is a General Motors dealership ...
Executive Summary Hare Chevrolet is a General Motors dealership ...Executive Summary Hare Chevrolet is a General Motors dealership ...
Executive Summary Hare Chevrolet is a General Motors dealership ...
 
Welcome to the Dougherty County Public Library's Facebook and ...
Welcome to the Dougherty County Public Library's Facebook and ...Welcome to the Dougherty County Public Library's Facebook and ...
Welcome to the Dougherty County Public Library's Facebook and ...
 
NEWS ANNOUNCEMENT
NEWS ANNOUNCEMENTNEWS ANNOUNCEMENT
NEWS ANNOUNCEMENT
 
C-2100 Ultra Zoom.doc
C-2100 Ultra Zoom.docC-2100 Ultra Zoom.doc
C-2100 Ultra Zoom.doc
 
MAC Printing on ITS Printers.doc.doc
MAC Printing on ITS Printers.doc.docMAC Printing on ITS Printers.doc.doc
MAC Printing on ITS Printers.doc.doc
 
Mac OS X Guide.doc
Mac OS X Guide.docMac OS X Guide.doc
Mac OS X Guide.doc
 
hier
hierhier
hier
 
WEB DESIGN!
WEB DESIGN!WEB DESIGN!
WEB DESIGN!
 

Machine Learning and Inductive Inference

  • 1. Machine Learning and Inductive Inference Hendrik Blockeel 2001-2002
  • 11. Example: given molecules that are active against some disease, find out what they have in common; this common property is probably the reason for their activity.
  • 21. Overview of design choices: first determine the type of training experience (games against self, games against an expert, a table of good moves, …); then determine the type of target function (Board → value, Board → Move, …); then determine its representation (a linear function of 6 features, …); then determine the learning algorithm (gradient descent, …). Once all choices are made, the design is ready.
  • 52. The current version space contains all rectangles covering S and covered by G, e.g. h = <2-5, 2-3>, with S = {<3-5, 2-3>} and G = {<1-6, 1-3>}.
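The boundary-set test on this slide can be sketched in code. This is a minimal illustration, not the course's implementation: hypotheses are assumed to be axis-parallel rectangles written as pairs of integer intervals, and the version space is assumed convex so that membership reduces to lying between the S and G boundaries.

```python
# Hypotheses are axis-parallel rectangles over two attributes, written as
# ((lo1, hi1), (lo2, hi2)) -- e.g. <2-5, 2-3> becomes ((2, 5), (2, 3)).

def covers(h, g):
    """True if hypothesis h is at least as general as g (h's rectangle
    contains g's rectangle on every attribute)."""
    return all(hl <= gl and gh <= hh for (hl, hh), (gl, gh) in zip(h, g))

def in_version_space(h, S, G):
    """h is in the version space iff it covers some element of the specific
    boundary S and is covered by some element of the general boundary G."""
    return any(covers(h, s) for s in S) and any(covers(g, h) for g in G)

S = [((3, 5), (2, 3))]   # specific boundary from the slide
G = [((1, 6), (1, 3))]   # general boundary from the slide

print(in_version_space(((2, 5), (2, 3)), S, G))  # True: between S and G
print(in_version_space(((1, 7), (1, 3)), S, G))  # False: more general than G
```

The same two checks are exactly what the candidate-elimination algorithm maintains incrementally as examples arrive.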
  • 56. Equivalence between inductive and deductive systems: an inductive system maps training examples and a new instance to a result by an inductive leap; a deductive system maps the training examples, the new instance and an explicitly stated inductive bias to the same result by proof.
  • 71. Clustering tree (figure): internal nodes test abundance(Tubifex sp., 5) and abundance(Sphaerotilus natans, 5); each leaf predicts "standardized" values (how many standard deviations above the mean) for the physico-chemical attributes T, pH, cond, O2, O2sat, CO2, hard, NO2, NO3, NH4, PO4, Cl, SiO2, KMnO4, K2Cr2O7 and BOD.
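The "standardized" values in the leaves are ordinary z-scores. A minimal sketch of that transformation (the pH measurements below are made up for illustration, not taken from the slides):

```python
from statistics import mean, stdev

def standardize(values):
    """Express each measurement as the number of standard deviations
    above the mean, as in the clustering-tree leaves."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

ph = [6.8, 7.1, 7.4, 7.0, 8.2]     # hypothetical river-water pH samples
z = standardize(ph)
print([round(v, 2) for v in z])    # the 8.2 outlier gets the largest z-score
```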
  • 83. Overfitting: example (figure): a cloud of positive and negative training examples in which a boundary fitted too tightly to individual examples carves out an area with probably wrong predictions.
  • 88. Effect of pruning (figure): plotting accuracy against the size of the tree, accuracy on the training data keeps rising as the tree grows, while accuracy on unseen data peaks and then declines.
  • 91. Rules from trees: example. The tree tests Outlook (Sunny → test Humidity; Overcast → Yes; Rainy → test Wind), with Humidity = High → No, Humidity = Normal → Yes, Wind = Strong → No and Wind = Weak → Yes. Each root-to-leaf path becomes one rule: if Outlook = Sunny and Humidity = High then No; if Outlook = Sunny and Humidity = Normal then Yes; …
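The path-to-rule conversion is mechanical and can be sketched in a few lines. This is an illustrative sketch, not the course's code: the tree is assumed to be a nested dict mapping each test attribute to its branches.

```python
# The tree from the slide as nested dicts: internal node = {attribute: branches},
# leaf = class label.
tree = {"Outlook": {
    "Sunny":    {"Humidity": {"High": "No", "Normal": "Yes"}},
    "Overcast": "Yes",
    "Rainy":    {"Wind": {"Strong": "No", "Weak": "Yes"}},
}}

def tree_to_rules(node, conditions=()):
    """One rule per root-to-leaf path: the path's tests form the condition,
    the leaf's label forms the conclusion."""
    if not isinstance(node, dict):                       # leaf reached
        conds = " and ".join(f"{a} = {v}" for a, v in conditions) or "true"
        return [f"if {conds} then {node}"]
    (attribute, branches), = node.items()                # internal test node
    rules = []
    for value, child in branches.items():
        rules.extend(tree_to_rules(child, conditions + ((attribute, value),)))
    return rules

for rule in tree_to_rules(tree):
    print(rule)
```

Running this prints the five rules, starting with the two Sunny rules quoted on the slide.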
  • 93. Pruning rules: example. The tree tests A (true → true; false → test B, with B = true → true and B = false → false); it represents A ∨ B. The rules are: if A = true then true; if A = false and B = true then true; if A = false and B = false then false. They represent A ∨ (¬A ∧ B), which is equivalent to A ∨ B.
  • 97. Heuristics compared (figure): good heuristics are strictly concave.
  • 98. Why concave functions? Assume a node with size n, entropy E and proportion of positives p is split into two nodes with (n1, E1, p1) and (n2, E2, p2). We have p = (n1/n)·p1 + (n2/n)·p2, and the new average entropy E' = (n1/n)·E1 + (n2/n)·E2 is therefore found by linear interpolation between (p1, E1) and (p2, E2) at p. The gain is the difference in height between (p, E) and (p, E'): strict concavity guarantees it is positive for any non-trivial split.
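The interpolation argument can be checked numerically. A small sketch with a made-up split (the numbers are hypothetical, not from the slides):

```python
from math import log2

def entropy(p):
    """Binary entropy of a class proportion p; strictly concave in p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# A node of n = 100 examples is split into children of 60 and 40 examples
# with positive proportions p1 and p2.
n, n1, n2 = 100, 60, 40
p1, p2 = 10 / 60, 30 / 40
p = (n1 / n) * p1 + (n2 / n) * p2                     # parent proportion
e_interp = (n1 / n) * entropy(p1) + (n2 / n) * entropy(p2)  # E' by interpolation
gain = entropy(p) - e_interp
print(round(gain, 4))   # positive: the chord of a concave curve lies below it
```

Because entropy is strictly concave, the interpolated point (p, E') lies strictly below (p, E) whenever p1 ≠ p2, so the gain of any informative split is strictly positive.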
  • 100. Generic TDIDT algorithm:
        function TDIDT(E: set of examples) returns tree;
          T' := grow_tree(E);
          T := prune(T');
          return T;
        function grow_tree(E: set of examples) returns tree;
          T := generate_tests(E);
          t := best_test(T, E);
          P := partition induced on E by t;
          if stop_criterion(E, P)
          then return leaf(info(E))
          else for all Ej in P: tj := grow_tree(Ej);
               return node(t, {(j, tj)});
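The schema above can be instantiated concretely. The sketch below is one possible, deliberately minimal instantiation and not the course's code: categorical attributes only, entropy-based test selection, majority-class leaves, stopping when the node is pure or no attributes remain, and no pruning step.

```python
from collections import Counter
from math import log2

def entropy(examples):
    n = len(examples)
    counts = Counter(label for _, label in examples).values()
    return -sum(c / n * log2(c / n) for c in counts)

def partition(examples, attribute):
    """The partition induced on the examples by testing one attribute."""
    parts = {}
    for features, label in examples:
        parts.setdefault(features[attribute], []).append((features, label))
    return parts

def best_test(attributes, examples):
    """Pick the attribute whose split has minimal average entropy."""
    def avg_entropy(a):
        parts = partition(examples, a).values()
        return sum(len(p) / len(examples) * entropy(p) for p in parts)
    return min(attributes, key=avg_entropy)

def info(examples):
    return Counter(label for _, label in examples).most_common(1)[0][0]

def grow_tree(examples, attributes):
    pure = len({label for _, label in examples}) == 1
    if pure or not attributes:                       # stop_criterion
        return info(examples)                        # leaf: majority class
    t = best_test(attributes, examples)
    rest = [a for a in attributes if a != t]
    return (t, {value: grow_tree(part, rest)
                for value, part in partition(examples, t).items()})

# Tiny hypothetical dataset: ({attribute: value}, class)
data = [({"Outlook": "Sunny",    "Wind": "Weak"},   "No"),
        ({"Outlook": "Sunny",    "Wind": "Strong"}, "No"),
        ({"Outlook": "Overcast", "Wind": "Weak"},   "Yes"),
        ({"Outlook": "Rainy",    "Wind": "Weak"},   "Yes"),
        ({"Outlook": "Rainy",    "Wind": "Strong"}, "No")]
print(grow_tree(data, ["Outlook", "Wind"]))
```

On this data the split on Outlook has the lower average entropy, so it becomes the root, with a further Wind test only under the Rainy branch.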
  • 105. Clustering tree (figure, same tree as on slide 71): internal nodes test abundance(Tubifex sp., 5) and abundance(Sphaerotilus natans, 5); each leaf predicts standardized values (standard deviations above the mean) for the physico-chemical attributes T, pH, cond, O2, O2sat, CO2, hard, NO2, NO3, NH4, PO4, Cl, SiO2, KMnO4, K2Cr2O7 and BOD.