5. IT’S NOT ABOUT DEEP LEARNING ONLY
[Diagram: the landscape of artificial intelligence achievements. AI encompasses machine learning (ML), deep learning (DL), reinforcement learning (RL), NLP, expert systems and grammars.]
6. ANATOMY OF A BOT
INPUT: “I haven’t received my shoes I ordered last week”
TRANSFORMATION INTO STRUCTURED DATA (natural language understanding): parse the user’s input to extract the meaning behind it, ideally observing context.
→ Intent: Delivery problem | Product: Shoes | Date: Last_week
FETCHING RESULTS FROM BACKEND SERVICES (dialog system + business logic): provide information, insights and functionality from the input.
→ Action: Indicate delivery status | Delivery date: Tomorrow | Reason for delay: Shortage
FORMULATE ANSWER (natural language generation): organize the information in a coherent, readable format.
OUTPUT: “We are sorry. There was a shortage on these shoes. You will receive them tomorrow”
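The pipeline above can be sketched as three chained steps. This is a minimal illustration, not a real bot framework: the function names and the keyword rules standing in for NLU are invented for the example.

```python
# Minimal sketch of the bot pipeline: NLU -> business logic -> NLG.
# All names and rules are illustrative, not a real framework API.

def understand(utterance):
    """NLU step: map raw text to structured data (here, naive keyword rules)."""
    structured = {}
    text = utterance.lower()
    if "haven't received" in text or "not received" in text:
        structured["intent"] = "delivery_problem"
    if "shoes" in text:
        structured["product"] = "shoes"
    if "last week" in text:
        structured["date"] = "last_week"
    return structured

def fetch_results(structured):
    """Business-logic step: query backend services for the detected intent."""
    if structured.get("intent") == "delivery_problem":
        return {"action": "indicate_delivery_status",
                "delivery_date": "tomorrow",
                "reason_for_delay": "shortage"}
    return {"action": "fallback"}

def generate(results, structured):
    """NLG step: organize the results into a coherent, readable answer."""
    if results["action"] == "indicate_delivery_status":
        return ("We are sorry. There was a {reason} on these {product}. "
                "You will receive them {date}.").format(
                    reason=results["reason_for_delay"],
                    product=structured.get("product", "items"),
                    date=results["delivery_date"])
    return "Sorry, I did not understand."

utterance = "I haven't received my shoes I ordered last week"
structured = understand(utterance)
reply = generate(fetch_results(structured), structured)
print(reply)
```

A real system would replace the keyword rules with a trained NLU model; the surrounding plumbing stays the same.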
HOW CAN AI HELP WITH CHATBOTS, THOUGH?
7. MACHINE LEARNING FOR NLP
Already many achievements:
● Voice recognition and text-to-speech
● POS tagging
● Sentiment analysis
● Natural language generation
● Machine translation
● Dependency parsing
10. BACKGROUND
A recent but successful technique: word embeddings learned in an unsupervised fashion have been exceptionally successful in many NLP tasks, perhaps the primary reason for NLP’s breakout. They encode general semantic relationships and are beneficial to many downstream tasks.
Distributional hypothesis: words that are used and occur in the same contexts tend to purport similar meanings.
12. VECTOR SPACE MODELS
cat → (0 1 0 0 1 0 0 0 0 0 2 0 1 0)
The non-zero components fall under Doc2, Doc5, Doc11 and Doc13; the values could be word counts in documents.
13. VECTOR SPACE MODELS
“my cat is so cute when he drinks milk”
→ (0 1 0 0 1 0 0 0 0 0 1 0 1 0), with ones in the positions for cat, cute, drinks and milk
Bag of words with frequency counts; extensible to bi-grams … n-grams.
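The bag-of-words construction can be sketched in a few lines. The vocabulary order below is an assumption made for the example:

```python
from collections import Counter

# Bag-of-words sketch: count each vocabulary word in the sentence.
# The vocabulary and its ordering are illustrative.
vocab = ["my", "cat", "is", "so", "cute", "when", "he", "drinks", "milk"]
sentence = "my cat is so cute when he drinks milk"

counts = Counter(sentence.split())
vector = [counts[w] for w in vocab]  # frequency count per vocabulary word
print(vector)

# Extension to bi-grams: consecutive token pairs instead of single words.
tokens = sentence.split()
bigrams = list(zip(tokens, tokens[1:]))
print(bigrams[:2])
```

The same counting scheme applies to n-grams of any order; only the unit being counted changes.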
14. EMBEDDINGS
The vectors discussed so far are very high-dimensional. Techniques used to learn lower-dimensional vectors are called embeddings.
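The contrast can be made concrete with a toy example. All the numbers below are made up for illustration; trained embeddings would come from a tool such as word2vec or gensim:

```python
import math

# Toy contrast between sparse one-hot vectors and dense embeddings.
# All values are invented for illustration, not trained.
vocab = ["cat", "dog", "milk"]
one_hot = {w: [1.0 if i == j else 0.0 for j in range(len(vocab))]
           for i, w in enumerate(vocab)}  # dimension = vocabulary size

embeddings = {  # hypothetical 2-dimensional embeddings
    "cat":  [0.9, 0.1],
    "dog":  [0.8, 0.2],
    "milk": [0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# One-hot vectors carry no similarity information...
print(cosine(one_hot["cat"], one_hot["dog"]))  # 0.0
# ...while embeddings place related words close together.
print(cosine(embeddings["cat"], embeddings["dog"]))
```

In real vocabularies the one-hot dimension is in the tens or hundreds of thousands, while typical embedding dimensions are a few hundred.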
16. MAIN ADVANTAGES
● One of the few currently successful applications of unsupervised learning.
● Can be derived from large sets of unannotated corpora.
● Pre-trained embeddings can then be used in downstream tasks that have only small amounts of labeled data.
23. INTENT CLASSIFICATION
Word embeddings are especially helpful when there is little training data:
“I will travel to New York tomorrow”
“I will leave for New York tomorrow”
“I will take a flight to New York tomorrow”
→ {intent: travel}
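One simple way to exploit embeddings for intent classification is to average the word vectors of a sentence and compare against per-intent centroids. The sketch below uses invented 2-dimensional embeddings and a made-up second intent purely to show the mechanics:

```python
import math

# Intent classification sketch: a sentence is the average of its word
# vectors, and the nearest intent centroid wins. Embedding values and
# the "cancellation" intent are toy assumptions, not trained data.
emb = {
    "travel": [1.0, 0.0], "leave": [0.9, 0.1], "flight": [0.95, 0.05],
    "cancel": [0.0, 1.0], "refund": [0.1, 0.9],
}

def sentence_vector(sentence):
    """Average the embeddings of the in-vocabulary words."""
    vectors = [emb[w] for w in sentence.split() if w in emb]
    if not vectors:
        return [0.0, 0.0]
    return [sum(c) / len(vectors) for c in zip(*vectors)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Centroids built from a handful of seed words per intent.
intents = {"travel": sentence_vector("travel leave flight"),
           "cancellation": sentence_vector("cancel refund")}

def classify(sentence):
    v = sentence_vector(sentence)
    return max(intents, key=lambda i: cosine(v, intents[i]))

print(classify("i will take a flight to new york tomorrow"))  # travel
```

Because "travel", "leave" and "flight" sit close together in embedding space, all three example utterances land near the same centroid even though each wording appears only once in training.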
24. ENTITY RECOGNITION
“Do you know a good vietnamese restaurant?”
Known values in training data: {restaurant_types: [italian, french, japanese]}
→ {restaurant_type: vietnamese}
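The point above is that embeddings let an entity recognizer generalize beyond the values seen in training: "vietnamese" never appears in the training list, but it sits close to the known restaurant types in embedding space. A sketch, with toy embedding values and a threshold chosen for the example:

```python
import math

# Entity generalization sketch: tag a word as a restaurant type when it
# is close enough in embedding space to any known type. Vectors and the
# threshold are invented for illustration.
emb = {
    "italian": [0.9, 0.1], "french": [0.85, 0.15], "japanese": [0.88, 0.12],
    "vietnamese": [0.87, 0.13], "good": [0.1, 0.9],
}
known_types = ["italian", "french", "japanese"]  # seen in training data

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def is_restaurant_type(word, threshold=0.95):
    """True when the word's embedding is near any known restaurant type."""
    if word not in emb:
        return False
    return max(cosine(emb[word], emb[t]) for t in known_types) >= threshold

print(is_restaurant_type("vietnamese"))  # True, despite absence from training
print(is_restaurant_type("good"))        # False
```

A production recognizer would combine this similarity signal with context features rather than rely on a fixed threshold.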
25. SOME REFERENCES
Mikolov, T., Chen, K., Corrado, G., & Dean, J. (2013). Efficient Estimation of Word Representations in Vector Space. Proceedings of the International Conference on Learning Representations (ICLR 2013), 1–12.
Mikolov, T., Sutskever, I., Chen, K., Corrado, G., & Dean, J. (2013). Distributed Representations of Words and Phrases and their Compositionality. NIPS, 1–9.
Bootstrapping Dialog Systems with Word Embeddings: https://www.cs.cmu.edu/~apparikh/nips2014ml-nlp/camera-ready/forgues_etal_mlnlp2014.pdf
https://www.npmjs.com/package/word2vec
https://radimrehurek.com/gensim/
26. TELL US WHICH BOT YOU NEED;
WE MOST PROBABLY KNOW HOW TO BUILD IT
Web
http://botfuel.io
E-mail
sales@botfuel.io
Fueling the next generation of bots