2. About me
• Education
• NCU (MIS), NCCU (CS)
• Work Experience
• Telecom big data Innovation
• AI projects
• Retail marketing technology
• User Group
• TW Spark User Group
• TW Hadoop User Group
• Taiwan Data Engineer Association Director
• Research
• Big Data / ML / AIoT / AI Columnist
"How can you not get romantic about baseball?"
3. Tutorial Content
What Is A Markov Chain?
What Is The Markov Property?
What Is A Transition Matrix?
Homework
Markov Chain Applications
5. What Is A Markov Chain?
• A stochastic process consisting of random variables that transition from
one state to another according to certain assumptions and definite
probabilistic rules.
6. What Is A Markov Chain?
• Zero-order Markov Chain
• A general probability model; the next state does not depend on any previous state.
• First-order Markov Chain (the most common case)
• Uses only the previous state to predict the next state.
• Second-order Markov Chain
• M-order Markov Chain
• Requires the prior M states to predict the next state.
• Hidden Markov Model
• Commonly used in time-series research and speech recognition.
• Absorbing Markov Chain
• An absorbing state is a state that, once entered, cannot be left.
7. What Is A Markov Chain?
Example sentence: one pig two pig hail pig happy pig
• Keys are the unique words in the sentence: 5 keys (one, two, hail, happy, pig)
• Tokens are the total number of words: 8 tokens
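The key/token counts can be checked with a few lines of Python (a small sketch, using the sentence from the slide):

```python
# Count tokens (all words) and keys (unique words) in the example sentence.
sentence = "one pig two pig hail pig happy pig"
tokens = sentence.split()   # every word, in order
keys = set(tokens)          # unique words only

print(len(tokens))  # 8 tokens
print(len(keys))    # 5 keys: one, two, hail, happy, pig
```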
8. What Is A Markov Chain?
Key frequencies:
  one    1
  pig    4
  two    1
  hail   1
  happy  1
Weighted distribution:
1. pig is 50% (4/8)
2. one, two, hail, happy are 12.5% each (1/8)
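The frequency table and weighted distribution above can be reproduced with `collections.Counter` (a minimal sketch, again using the slide's sentence):

```python
from collections import Counter

# Frequency and weighted distribution of each key in the sentence.
tokens = "one pig two pig hail pig happy pig".split()
freq = Counter(tokens)
dist = {word: count / len(tokens) for word, count in freq.items()}

print(freq["pig"])   # 4
print(dist["pig"])   # 0.5   -> 50%
print(dist["one"])   # 0.125 -> 12.5%
```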
9. What Is The Markov Property?
• It states that the probability of a random process transitioning to the
next state depends only on the current state and time; it is independent
of the sequence of states that preceded it.
• A process whose random variables behave in this manner is said to
satisfy the Markov Property.
10. What Is The Markov Property?
Sequence: start one pig two pig hail pig happy pig end
Bigram pairs, in order:
(start, one), (one, pig), (pig, two), (two, pig), (pig, hail),
(hail, pig), (pig, happy), (happy, pig), (pig, end), (end, none)
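The bigram pairs can be generated mechanically by zipping the token sequence with itself shifted by one (a sketch, padding with the slide's start/end/none markers):

```python
# Build the (current, next) bigram pairs from the token sequence.
tokens = ["start"] + "one pig two pig hail pig happy pig".split() + ["end"]
pairs = list(zip(tokens, tokens[1:] + ["none"]))

print(pairs[0])   # ('start', 'one')
print(pairs[-1])  # ('end', 'none')
```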
11. What Is The Markov Property?
Sequence: start one pig two pig hail pig happy pig end
Pairs: (start, one), (one, pig), (pig, two), (two, pig), (pig, hail),
(hail, pig), (pig, happy), (happy, pig), (pig, end), (end, none)
Grouping each pair by its first word gives the key-and-list mapping:
start -> [one]
one   -> [pig]
pig   -> [two, hail, happy, end]
two   -> [pig]
hail  -> [pig]
happy -> [pig]
end   -> [none]
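The key-and-list mapping is one loop over the bigram pairs (a sketch with `defaultdict`):

```python
from collections import defaultdict

# Map each word to the list of words that follow it in the sequence.
tokens = ["start"] + "one pig two pig hail pig happy pig".split() + ["end", "none"]
followers = defaultdict(list)
for current, nxt in zip(tokens, tokens[1:]):
    followers[current].append(nxt)

print(followers["pig"])    # ['two', 'hail', 'happy', 'end']
print(followers["start"])  # ['one']
```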
12. What Is The Markov Property?
start -> [one]
one   -> [pig]
pig   -> [two, hail, happy, end]
two   -> [pig]
hail  -> [pig]
happy -> [pig]
end   -> [none]
[State transition diagram: pig moves to two, hail, happy, or end with probability 1/4 each; every other state moves to its single follower with probability 1.]
Homework 1: Build the transition matrix from the state transition diagram.
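A first-order chain built this way can already generate text: starting from `start`, repeatedly pick a uniformly random follower until `end` is reached. This is only a sketch of the idea, not the accompanying notebook's code:

```python
import random

# Random walk over the toy chain; each next word is chosen uniformly
# from the follower list (1/4 each from "pig", probability 1 elsewhere).
followers = {
    "start": ["one"], "one": ["pig"],
    "pig": ["two", "hail", "happy", "end"],
    "two": ["pig"], "hail": ["pig"], "happy": ["pig"],
}

random.seed(0)  # fixed seed so the walk is reproducible
state, walk = "start", []
while state != "end":
    state = random.choice(followers[state])
    if state != "end":
        walk.append(state)

print(" ".join(walk))  # e.g. "one pig happy pig ..."
```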
13. What Is A Transition Matrix?
• In a Markov Process, we use a matrix to represent the transition
probabilities from one state to another.
• This matrix is called the Transition (or Probability) Matrix, and is usually denoted by P.
P (rows: "from" state; columns: "to" state, in order A, B, C):
    [ 0.6  0.3  0.1 ]   (from A)
P = [ 0.2  0.7  0.1 ]   (from B)
    [ 0.3  0.2  0.5 ]   (from C)
Each row sums to 1.
[State transition diagram: the same probabilities drawn as arrows between A, B, and C, e.g. A stays at A 60% of the time, moves to B 30%, and to C 10%.]
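The matrix P from the slide, with the row-sum property checked in NumPy:

```python
import numpy as np

# Transition matrix P: rows are the "from" state (A, B, C),
# columns the "to" state.
P = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.7, 0.1],
    [0.3, 0.2, 0.5],
])

print(P.sum(axis=1))  # [1. 1. 1.] -- each row sums to 1
```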
14. What Is A Transition Matrix?
• Assumption
• Store A has 40% of the customers
• Store B has 50% of the customers
• Store C has 10% of the customers
• The initial state is
• X0 = [0.4 0.5 0.1]
15. What Is A Transition Matrix?
• Powers of a Transition Matrix (computing the next state, and the one after it...)
X1 = X0 · P = [0.4 0.5 0.1] P = [0.37 0.49 0.14]
X2 = X1 · P = [0.37 0.49 0.14] P = [0.362 0.482 0.156]
[Or]
X2 = X0 · P² = [0.4 0.5 0.1] P² = [0.362 0.482 0.156]
37% of the customers will be at A after the first step.
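The two-step calculation can be verified directly with matrix-vector products (a sketch in NumPy, using the P and X0 from the slides):

```python
import numpy as np

# Propagate the initial customer distribution X0 through the chain.
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.7, 0.1],
              [0.3, 0.2, 0.5]])
X0 = np.array([0.4, 0.5, 0.1])

X1 = X0 @ P   # state after one step
X2 = X1 @ P   # state after two steps

print(X1)                                 # [0.37 0.49 0.14]
print(X2)                                 # [0.362 0.482 0.156]
print(X0 @ np.linalg.matrix_power(P, 2))  # same as X2
```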
16. What Is A Transition Matrix?
• Find the final stable (stationary) state
• X0 · P = X1
• X1 · P = X2
• …
• X̄ = [A B C] is stable when X̄ · P = X̄ and A + B + C = 1
Solving [A B C] · P = [A B C]:
A = 13/36 ≈ 0.361
B = 17/36 ≈ 0.472
C = 6/36 ≈ 0.167
• X̄ = [0.361 0.472 0.167]
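Instead of solving the linear system by hand, the stationary state can be found numerically by repeatedly applying P until the vector stops changing (power iteration; a sketch, not the notebook's code):

```python
import numpy as np

# Power iteration: the state vector converges to the stationary
# distribution X̄ satisfying X̄ · P = X̄.
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.7, 0.1],
              [0.3, 0.2, 0.5]])
x = np.array([0.4, 0.5, 0.1])
for _ in range(100):
    x = x @ P

print(np.round(x, 3))         # [0.361 0.472 0.167]
print(np.allclose(x, x @ P))  # True -- x is stationary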
17. What Is A Transition Matrix?
• Find the final stable state
• Try the code: First_order_Markov_Chain_1.ipynb
• Result: A = 36.1%, B = 47.2%, C = 16.7%
18. Markov Chain Applications
• Text generation
• Apply the Markov Property and build a Markov model from a dataset of
Donald Trump speeches.
First_order_Markov_Chain_2.ipynb
19. Markov Chain Applications
• PageRank + Markov Chains
• The PageRank model can be seen as a Markov chain that predicts the
behavior of a system traveling from one state to another, considering
only the current state.
• C has a higher PageRank than E, even though E has more inlinks.
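PageRank is the stationary distribution of a "random surfer" Markov chain over the link graph, found by the same power iteration as before. The graph below is a made-up toy example, not the figure's actual graph, and the damping factor 0.85 is the conventional choice:

```python
import numpy as np

# Sketch: PageRank as the stationary distribution of a random-surfer
# Markov chain over a toy link graph (page -> pages it links to).
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n, d = 4, 0.85  # number of pages, damping factor

# Column-stochastic transition matrix: a surfer on page src moves to
# each of its outlinks with equal probability.
M = np.zeros((n, n))
for src, dsts in links.items():
    for dst in dsts:
        M[dst, src] = 1 / len(dsts)

rank = np.full(n, 1 / n)
for _ in range(100):
    rank = (1 - d) / n + d * (M @ rank)

print(np.round(rank, 3))  # page 2, linked by every other page, ranks highest
```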