1. Machine Learning for Language Technology 2015
http://stp.lingfil.uu.se/~santinim/ml/2015/ml4lt_2015.htm
What is Machine Learning?
Marina Santini
santinim@stp.lingfil.uu.se
Department of Linguistics and Philology
Uppsala University, Uppsala, Sweden
Autumn 2015
2. Acknowledgements
• Thanks to Hal Daumé III, Andrew Y. Ng, Arthur
Samuel, Tom Mitchell, Ethem Alpaydin,
Michael Jordan, Michael Collins, Joakim Nivre,
Pedro Domingos, Wikipedia, and the web.
Lecture 1: What is Machine Learning? 2
3. What is Machine Learning?
http://www.umiacs.umd.edu/~hal/ciml/
• Machine learning is the study of computer
systems that learn from data and experience.
• It is applied in an incredibly wide variety of
application areas, from medicine to
advertising, from military to pedestrian.
• Any area in which you need to make sense of
data is a potential customer of machine
learning.
[Hal Daumé III]
4. Example: Rule-based systems
• Parts-of-speech tagger (Brill's tagger)
• tag1 --> tag2 IF Condition
The Condition tests the preceding and/or following word tokens, or their tags
(the notation for such rules differs between implementations). For example,
in Brill's notation:
• IN NN WDPREVTAG DT while
This rule changes the tag of a word from IN (preposition) to NN (common noun) if
the preceding word's tag is DT (determiner) and the word itself is "while". It
covers cases like "all the while" or "in a while", where "while" should be
tagged as a noun rather than its more common use as a preposition (many
rules are more general).
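The rule above can be sketched in code as a simple transformation over (word, tag) pairs. This is an illustrative re-implementation of that single rule, not Brill's actual tagger:

```python
def apply_brill_rule(tagged):
    """Apply the rule 'IN NN WDPREVTAG DT while':
    retag 'while' from IN to NN when the previous word's tag is DT."""
    out = list(tagged)
    for i in range(1, len(out)):
        word, tag = out[i]
        prev_tag = out[i - 1][1]
        if word == "while" and tag == "IN" and prev_tag == "DT":
            out[i] = (word, "NN")
    return out

# "in a while": the initial (most common) tag for "while" is IN
sentence = [("in", "IN"), ("a", "DT"), ("while", "IN")]
print(apply_brill_rule(sentence))  # → [('in', 'IN'), ('a', 'DT'), ('while', 'NN')]
```

Note that the rule only fires in context: a sentence-initial "while" with no preceding determiner keeps its IN tag.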
5. Example: Machine Learning-Based Systems
(From: The Apache OpenNLP library, a machine learning based toolkit for the
processing of natural language text: https://opennlp.apache.org/ )
• We do not have rules, but a training corpus/dataset: the POS tagger can be
trained on annotated training material, i.e., a collection of tokenized
sentences where each token has an assigned part-of-speech tag. The
training material may look like this:
About_IN
10_CD
Euro_NNP
,_,
I_PRP
reckon_VBP
That_DT
sounds_VBZ
good_JJ
• With this annotated material we train a (mathematical/statistical) model.
• Then we evaluate the results: how well does this model perform? The accuracy can
be measured on a test dataset or via cross-validation.
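A minimal sketch of what "training on annotated material" can mean: parse the token_TAG lines and record, for each word, its most frequent tag (a common baseline; a real tagger such as OpenNLP's fits a statistical model instead). The variable names and the tiny corpus here are illustrative:

```python
from collections import Counter, defaultdict

# The slide's training material, in token_TAG format.
training_material = """About_IN 10_CD Euro_NNP ,_, I_PRP reckon_VBP
That_DT sounds_VBZ good_JJ"""

# Count how often each word occurs with each tag.
counts = defaultdict(Counter)
for token in training_material.split():
    word, tag = token.rsplit("_", 1)
    counts[word][tag] += 1

# The "model": each word's most frequent tag in the training data.
model = {word: tags.most_common(1)[0][0] for word, tags in counts.items()}
print(model["reckon"])  # → VBP
```

Evaluating such a model would mean comparing its predicted tags against the gold tags of a held-out test set, as the slide notes.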
6. Generally speaking: Deduction vs Induction
• Deductive reasoning works from the more general to the more
specific. Sometimes this is informally called a "top-down" approach.
We might begin by thinking up a theory about our topic of
interest. We then narrow that down into more specific hypotheses
that we can test. This ultimately leads us to test the
hypotheses with specific data: a confirmation (or not) of our
original theories.
• Inductive reasoning works the other way, moving from specific
observations to broader generalizations and theories. Informally,
we sometimes call this a "bottom-up" approach. In inductive
reasoning, we begin with specific observations (the data) and
measures, begin to detect patterns and regularities, formulate
some tentative hypotheses that we can explore, and finally end up
developing a general model.
7. Machine Learning is based on ...
• Induction
• Generalization from data
8. In summary (by Hal Daumé III)
https://piazza.com/umd/fall2015/cmsc422/home
• Machine learning is all about finding patterns in data.
The whole idea is to replace the "human writing code"
with a "human supplying data" and then let the system
figure out what it is that the person wants to do by
looking at the examples.
• The most central concept in machine learning is
generalization: how to generalize beyond the examples
that have been provided at "training time" to new
examples that you see at "test time".
• A very large fraction of what we'll talk about has to do
with figuring out what generalization means.
9. Why learning?
• We do not know the exact method: speech
recognition, spam filters, robotics, etc.
• The exact method is too expensive: statistical
physics, etc.
• The task evolves over time...
• There is no need to use machine learning for
computing a payroll: for this task we just need
an algorithm!
10. Why is ML so fashionable?
• Broad applicability:
– Finance, robotics, medicine, NLP, IT, MT, etc.
• Close connection between theory and practice
• Open field, lots of room for new work.
11. Interdisciplinary Field
Machine learning is:
• a subfield of computer science that evolved from the study
of pattern recognition and computational learning
theory in artificial intelligence.
• Machine learning explores the study and construction of
algorithms that can LEARN from and make predictions on data.
• Such algorithms operate by building a model from example inputs
in order to make data-driven predictions or decisions, rather than
following strictly static program instructions.
[Wikipedia]
12. Machine Learning & Statistics
• Machine learning and statistics are closely
related fields.
• According to Michael Jordan, the ideas of
machine learning, from methodological
principles to theoretical tools, have had a long
pre-history in statistics. He also suggested the
term data science as a placeholder for the
overall field.
13. Machine Learning and Data Mining
• Machine learning concerns the study, design
and development of models and algorithms that
give computers the capability to learn from data.
• Data mining can be defined as the process of
extracting knowledge and/or previously unknown,
interesting patterns from apparently unstructured
data. Machine learning algorithms are used during
this process.
14. Many Types of Learning
• Supervised learning
– Supervised classification is the only focus of this
course
• Unsupervised learning
• Semi- and weakly supervised learning
• Reinforcement learning
• etc.
15. In other words
• Supervised learning: learning with a teacher
(i.e., with class labels)
• Unsupervised learning: learning without a
teacher (i.e., without class labels)
16. Learning problems
• Regression
• Binary classification
• Multiclass classification
• Ranking
• etc.
17. Informal and Formal Definitions
• There are plenty of definitions...
• Informal: the field of study that gives computers the
ability to learn without being explicitly programmed
(Arthur Samuel, 1959).
• Formal: a computer program is said to learn from
experience E with respect to some task T and some
performance measure P, if its performance on T, as
measured by P, improves with experience E (Tom
Mitchell, 1998).
18. Example: Spam Filter (by Andrew Y. Ng)
Which of the following is the task T, which the experience E, and which the performance measure P?
1. Classifying emails as spam or not spam.
2. Watching you label emails as spam or not spam.
3. The number (or fraction) of emails correctly classified as spam/not spam.
4. None of the above: this is not a machine learning problem.
19. Spam Filter (by Andrew Y. Ng)
• Answer: classifying emails (1) is the task T, watching you
label emails (2) is the experience E, and the fraction of emails
correctly classified (3) is the performance measure P.
20. Classification: Questions
• How would you write a program to
distinguish a picture of me from a picture of
someone else?
• How would you write a program to determine
whether a sentence is grammatical or not?
• How would you write a program to distinguish
cancerous cells from normal cells?
21. Classification: Answers
• How would you write a program to distinguish a picture of
me from a picture of someone else?
– Provide example pictures of me and pictures of other people
and let a classifier learn to distinguish the two.
• How would you write a program to determine whether a
sentence is grammatical or not?
– Provide examples of grammatical and ungrammatical
sentences and let a classifier learn to distinguish the two.
• How would you write a program to distinguish cancerous
cells from normal cells?
– Provide examples of cancerous and normal cells and let
a classifier learn to distinguish the two.
22. Elements of Machine Learning
• Generalization: how well a model performs on new data.
• Data:
– Training data: specific examples to learn from.
– Test data: new specific examples to assess performance.
• Models (theoretical assumptions):
– decision trees, naive Bayes, perceptron, etc.
• Algorithms:
– Learning algorithms that infer the model parameters from the data.
– Inference algorithms that infer predictions from a model.
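These elements can be sketched in code. The "model" below is deliberately trivial (always predict the majority class of the training data) so that the roles of training data, test data, learning, inference and evaluation stay visible; the data values are illustrative, not the real iris dataset:

```python
from collections import Counter

# Labeled examples: (features, label) pairs.
train = [((5.1, 3.5), "setosa"), ((7.0, 3.2), "versicolor"),
         ((4.9, 3.0), "setosa"), ((6.4, 3.2), "versicolor"),
         ((5.0, 3.6), "setosa")]
test = [((5.2, 3.7), "setosa"), ((6.9, 3.1), "versicolor")]

# Learning algorithm: infer the model's single parameter (the majority class).
majority = Counter(label for _, label in train).most_common(1)[0][0]

# Inference algorithm: every prediction is the majority class.
predictions = [majority for _ in test]

# Evaluation: accuracy on the held-out test set.
accuracy = sum(p == y for p, (_, y) in zip(predictions, test)) / len(test)
print(majority, accuracy)  # → setosa 0.5
```

A real learner would use the features when predicting; the point here is only the separation of data, model, learning and evaluation.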
23. Generalization
• Predicting the future based on the past.
• Our system needs to generalize beyond the
training data to future data that it has not
seen yet.
24. Generalization: Overfitting & Underfitting
• Overfitting occurs when the model fits the training data too well
and does not generalize, so it performs badly on the test
data. Overfitting is often the result of an excessively complicated
model.
• Underfitting occurs when the model does not fit the data well
enough. Underfitting is often the result of an excessively simple
model.
• Both overfitting and underfitting lead to poor predictions on new
data sets.
• A learning model that overfits or underfits does not generalize well.
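One way to see overfitting concretely: a 1-nearest-neighbour classifier simply memorizes the training set, so its training accuracy is perfect even when a label is wrong, while a held-out test set exposes the poor generalization. The data below is a synthetic, purely illustrative example:

```python
def nn_predict(train, x):
    # 1-NN: return the label of the closest training point.
    return min(train, key=lambda p: abs(p[0] - x))[1]

# True rule: class "A" below 5, "B" above; (4, "B") is a noisy label.
train = [(1, "A"), (2, "A"), (3, "A"), (4, "B"), (6, "B"), (7, "B")]
test = [(3.2, "A"), (4.4, "A"), (6.4, "B")]

# Memorization gives perfect training accuracy...
train_acc = sum(nn_predict(train, x) == y for x, y in train) / len(train)
# ...but the memorized noise hurts performance on unseen data.
test_acc = sum(nn_predict(train, x) == y for x, y in test) / len(test)
print(train_acc, test_acc)  # training accuracy 1.0, test accuracy below 1.0
```

A simpler model (e.g. a single threshold at 5) would misclassify the noisy training point but generalize better, which is exactly the overfitting/underfitting trade-off the slide describes.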
25. Example: Letter vs non-letter classification
[Figure: example images of letters and non-letters, shown as a training set and a test set]
26. Data: the iris dataset
Three components:
1. Class label (aka "label", denoted y)
2. Features (aka "attributes")
3. Feature values (aka "attribute values", denoted x)
– Features can be binary, nominal or continuous
• A labeled dataset is a collection of (x, y) pairs
27. Task
• Predict the class for this "test" example:
Sepal length | Sepal width | Petal length | Petal width | Type
5.2          | 3.7         | 1.7          | 0.3         | ???
• This requires us to generalize from the training data.
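With a handful of labeled rows in the style of the iris data (the values below are illustrative, not taken from the official dataset), even a one-nearest-neighbour classifier can make this prediction:

```python
import math

# (sepal length, sepal width, petal length, petal width) -> type
train = [((5.1, 3.5, 1.4, 0.2), "setosa"),
         ((7.0, 3.2, 4.7, 1.4), "versicolor"),
         ((6.3, 3.3, 6.0, 2.5), "virginica")]

def predict(x):
    # Nearest neighbour by Euclidean distance in feature space.
    return min(train, key=lambda p: math.dist(p[0], x))[1]

print(predict((5.2, 3.7, 1.7, 0.3)))  # → setosa
```

The short petal measurements put the test example closest to the setosa row; generalization here is the assumption that nearby points share a class.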
28. Noise
• Unexplained or random variation in the data
• Anomalies
29. Models
Models are theoretical assumptions:
– decision trees, naive Bayes, perceptron, etc.
Given:
– Domain X: descriptions
– Domain Y: predictions
– H: hypothesis space; the set of all possible hypotheses
– h: target hypothesis
– Idea: to extrapolate observed y's over all of X.
– Hope: to predict well on future y's given x's.
– Requirement: there must be regularities to be found.
30. Inductive bias
• In ML, the inductive bias is the set of
theoretical assumptions that must be added
to the observed data so that the algorithm's
outputs follow from them as logical deductions.
31. Examples of inductive biases
• Decision trees (ID3): shorter trees are preferred
over larger trees. Trees that place high
information gain attributes close to the root are
preferred over those that do not.
• Naive Bayes classifier: features are assumed to be
conditionally independent given the class.
• Logistic regression: there exists a simple
boundary that splits one class from the other, and
the further you get from that boundary, the more
confident you can be.
• Perceptron: the data must be linearly separable.
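The perceptron's bias is visible in its update rule: it can only ever represent a linear boundary, and on linearly separable data it is guaranteed to find one. A minimal sketch on a separable toy problem (logical AND), with illustrative variable names:

```python
# Linearly separable data: label +1 iff both inputs are 1 (logical AND).
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]

w, b = [0.0, 0.0], 0.0
for _ in range(100):                      # epochs
    errors = 0
    for (x1, x2), y in data:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
        if pred != y:                     # mistake-driven update
            w[0] += y * x1
            w[1] += y * x2
            b += y
            errors += 1
    if errors == 0:                       # converged: all points classified
        break

acc = sum((1 if w[0] * x1 + w[1] * x2 + b > 0 else -1) == y
          for (x1, x2), y in data) / len(data)
print(w, b, acc)  # accuracy 1.0 on this separable data
```

On non-separable data (e.g. XOR) the same loop never converges, which is the inductive bias showing up as a hard limitation.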
32. Inductive Bias: definition
• "The inductive bias of a learning algorithm is
the set of assumptions that the learner uses
to predict outputs given inputs that it has not
encountered."
– Tom Mitchell, 1980
33. Learning & Inference Algorithms
• Traditionally, the goal of learning has been to find
a model for which prediction (i.e., inference)
accuracy is as high as possible.
• More recently: find models for which prediction
(i.e., inference) is as efficient as possible.
• In practical terms: there is recent interest in more
unconventional approaches to learning that
combine generalization accuracy with other
desiderata such as faster inference.