Stochastic Processes - part 4
1. Electronic noise
Three types of noise are commonly observed:
Shot noise. Electric currents are not continuous but are
ultimately made up of large numbers of moving
charge carriers, typically electrons. Shot noise arises
from statistical fluctuations in the flow of charge
carriers: if a single bit of data is represented by 10,000
electrons, the magnitude of the fluctuations will
typically be about 1%. When looked at as a waveform
over time, shot noise has a flat frequency spectrum.
AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.1/123
2. Thermal (Johnson) noise. Even though an electric
current may have a definite overall direction, the
individual charge carriers within it will exhibit random
motions. In a material at nonzero temperature, the
energy of these motions and thus the intensity of the
thermal noise they produce is essentially proportional
to temperature. (At very low temperatures, quantum
mechanical fluctuations still yield random motion in
most materials.) Like shot noise, thermal noise has a
flat frequency spectrum.
3. Flicker (1/f) noise. Almost all electronic devices also
exhibit a third kind of noise, whose main characteristic
is that its spectrum is not flat, but instead goes roughly
like 1/f over a wide range of frequencies. Such a
spectrum implies the presence of large low-frequency
fluctuations, and indeed fluctuations are often seen
over timescales of many minutes or even hours. Unlike
the types of noise described above, this kind of noise
can be affected by details of the construction of
individual devices. Although it has been observed since the 1920s, its origins remain somewhat mysterious.
4. Shot noise in electronic devices consists of random
fluctuations of the electric current in an electrical
conductor, which are caused by the fact that the current is
carried by discrete charges (electrons).
Shot noise is to be distinguished from current fluctuations
in equilibrium, which happen without any applied voltage
and without any average current flowing. These equilibrium
current fluctuations are known as thermal noise.
6. Given a set of Poisson points t_i with average density λ and a real system h(t),

S(t) = ∑_i h(t − t_i)

is an SSS process known as shot noise.

S(t) = Z(t) ∗ h(t) = ∫_{−∞}^{∞} h(α) Z(t − α) dα
7.

E{S(t)} = ∫_{−∞}^{∞} h(α) E{Z(t − α)} dα = η_Z ∫_{−∞}^{∞} h(α) dα = η_Z H(0)

Z(t) = ∑_i δ(t − t_i) = (d/dt) ∑_i u(t − t_i) = X′(t)

where X(t) = ∑_i u(t − t_i) is a Poisson process.
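As a sanity check on the mean formula, one can simulate a shot-noise process numerically. This is a minimal sketch (all parameter values are illustrative, not from the slides), using an exponential pulse h(t) = e^{−t/τ}u(t), for which H(0) = ∫h = τ, so E{S(t)} should be close to λτ:

```python
import numpy as np

rng = np.random.default_rng(0)

lam = 50.0   # Poisson point density (points per unit time); illustrative value
tau = 0.1    # pulse time constant: h(t) = exp(-t/tau) for t >= 0
T = 40.0     # observation window

# Poisson points t_i on [0, T]
n_pts = rng.poisson(lam * T)
t_i = rng.uniform(0.0, T, n_pts)

# Sample S(t) = sum_i h(t - t_i) on a grid, skipping the start-up transient
t = np.linspace(5.0, T, 4000)
S = np.zeros_like(t)
for ti in t_i:
    m = t >= ti
    S[m] += np.exp(-(t[m] - ti) / tau)

# Theory: E{S(t)} = eta_Z * H(0) = lam * tau = 5.0 here
print(S.mean())
```

The sample mean should land close to λτ = 5, with fluctuations shrinking as the window grows.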
9. Markov process:
In probability theory, a Markov process is a stochastic
process characterized as follows: The state ck at time k is
one of a finite number in the range
{1, · · · , M}
Under the assumption that the process runs only from time
0 to time N and that the initial and final states are known,
the state sequence is then represented by a finite vector
C = (c0, · · · , cN ).
10. Let P(c_k | c_0, c_1, · · · , c_{k−1}) denote the probability (chance of occurrence) of the state c_k at time k conditioned on all states up to time k − 1. Suppose a process is such that c_k depends only on the previous state c_{k−1} and is independent of all earlier states. Such a process is known as a first-order Markov process.
11. This means that the probability of being in state ck at time
k, given all states up to time k − 1 depends only on the
previous state, i.e. ck−1 at time k − 1:
P(ck|c0, c1, . . . , ck−1) = P(ck|ck−1).
For an nth-order Markov process,
P(ck|c0, c1, . . . , ck−1) = P(ck|ck−n, . . . , ck−1).
12. The underlying process is assumed to be a Markov process with the following characteristics:
finite-state: the number of states M is finite.
discrete-time: going from one state to another takes one unit of time.
observed in memoryless noise: the sequence of observations depends probabilistically only on the most recent state transitions.
13. In probability theory, a stochastic process has the Markov
property if the conditional probability distribution of future
states of the process, given the present state, depends only
upon the current state, i.e. it is conditionally independent of
the past states (the path of the process) given the present
state. A process with the Markov property is usually called
a Markov process, and may be described as Markovian.
14. Mathematically, if X(t), t > 0, is a stochastic process, the Markov property states that

Pr[X(t + h) = y | X(s) = x(s), s ≤ t] = Pr[X(t + h) = y | X(t) = x(t)], ∀h > 0

Markov processes are typically termed (time-)homogeneous if

Pr[X(t + h) = y | X(t) = x(t)] = Pr[X(h) = y | X(0) = x(0)], ∀t, h > 0
15. and otherwise are termed (time-)inhomogeneous (or (time-)nonhomogeneous). Homogeneous Markov processes, usually being simpler than inhomogeneous ones, form the most important class of Markov processes.
16. In some cases, apparently non-Markovian processes may
still have Markovian representations, constructed by
expanding the concept of the ’current’ and ’future’ states.
Let X be a non-Markovian process. Then we define a
process Y, such that each state of Y represents a
time-interval of states of X, i.e. mathematically
Y (t) = {X(s) : s ∈ [a(t), b(t)]}.
17. If Y has the Markov property, then it is a Markovian representation of X. In this case, X is also called a second-order Markov process. Higher-order Markov processes are defined analogously.
18. An example of a non-Markovian process with a Markovian representation is a moving average time series:

X_t = ε_t + ∑_{i=1}^{q} θ_i ε_{t−i}

where θ_1, · · · , θ_q are the parameters of the model and ε_t, ε_{t−1}, · · · are, again, the error terms. A moving average model is essentially a finite impulse response filter with some additional interpretation placed on it.
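A short sketch of why the representation works: the scalar series X_t is not Markov, but the vector of the last q shocks is, since the next vector state depends only on the current one plus a fresh shock. The θ values below are illustrative, not from the text:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = np.array([0.6, 0.3])   # theta_1, theta_2 of an MA(2) model (illustrative)
q = len(theta)

# Markovian state Y_t = (eps_t, ..., eps_{t-q+1}); the next state depends
# only on Y_t and the new shock, so Y is Markov even though X is not.
state = np.zeros(q)
xs = []
for _ in range(5000):
    e = rng.normal()                           # new shock eps_t
    xs.append(e + theta @ state)               # X_t = eps_t + sum_i theta_i eps_{t-i}
    state = np.concatenate(([e], state[:-1]))  # shift the window of past shocks

# For unit-variance shocks, Var(X_t) = 1 + theta_1^2 + theta_2^2 = 1.45
print(np.var(xs))
```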
19. The most famous Markov processes are Markov chains,
but many other processes, including Brownian motion, are
Markovian.
20. Markov chain:
A collection of random variables {X_t} (where the index runs through 0, 1, . . . ) having the property that, given the present, the future is conditionally independent of the past. In other words,

P{X_t = j | X_0 = i_0, X_1 = i_1, · · · , X_{t−1} = i_{t−1}} = P{X_t = j | X_{t−1} = i_{t−1}}

If a Markov sequence of random variates X_n takes the discrete values {a_1, · · · , a_N}, then
21. P{x_n = a_{i_n} | x_{n−1} = a_{i_{n−1}}, · · · , x_1 = a_{i_1}} = P{x_n = a_{i_n} | x_{n−1} = a_{i_{n−1}}}

and the sequence X_n is called a Markov chain. A simple random walk is an example of a Markov chain.
22. Example of Markov chains
The probabilities of weather conditions, given the weather on the preceding day, can be represented by a transition matrix:

P = [0.9 0.5; 0.1 0.5]

The matrix P represents the weather model in which a sunny day is 90% likely to be followed by another sunny day, and a rainy day is 50% likely to be followed by another rainy day.
23. The columns can be labelled “sunny” and “rainy” respectively, and the rows can be labelled in the same order. P_ij is the probability that, if a given day is of type j, it will be followed by a day of type i. Notice that the columns of P sum to 1. This is because P is a stochastic matrix.
24. Predicting the weather
The weather on day 0 is known to be sunny. This is represented by a vector in which the “sunny” entry is 100%, and the “rainy” entry is 0%:

x(0) = [1; 0]

The weather on day 1 can be predicted by:
25.

x(1) = P x(0) = [0.9 0.5; 0.1 0.5] [1; 0] = [0.9; 0.1]

Thus, there is a 90% chance that day 1 will also be sunny.
26. The weather on day 2 can be predicted in the same way:

x(2) = P x(1) = P^2 x(0) = [0.9 0.5; 0.1 0.5]^2 [1; 0] = [0.86; 0.14]

General rules for day n are:

x(n) = P x(n−1)
x(n) = P^n x(0)
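The day-1 and day-2 predictions can be reproduced with a few lines of linear algebra; a minimal sketch using NumPy:

```python
import numpy as np

# Column-stochastic weather matrix: column 1 = sunny today, column 2 = rainy today
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])
x0 = np.array([1.0, 0.0])   # day 0 is known to be sunny

x1 = P @ x0                               # day 1 forecast
x2 = np.linalg.matrix_power(P, 2) @ x0    # day 2 forecast

print(x1)   # [0.9 0.1]
print(x2)   # [0.86 0.14]
```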
27. Steady state of the weather
In this example, predictions for the weather on more distant days become less and less informative and tend towards a steady state vector. This vector represents the probabilities of sunny and rainy weather on all days, and is independent of the initial weather.
The steady state vector is defined as:

q = lim_{n→∞} x(n)

but the limit only converges if P is a regular transition matrix (that is, some power P^n has all non-zero entries).
28. Since q is independent of the initial conditions, it must be unchanged when transformed by P. This makes it an eigenvector (with eigenvalue 1), and means it can be derived from P. For the weather example:

P = [0.9 0.5; 0.1 0.5]

P q = q
(I − P) q = 0  ⇒  [0.1 −0.5; −0.1 0.5] q = [0; 0]
29. q_1 + q_2 = 1, 0.1 q_1 − 0.5 q_2 = 0 ⇒ q_1 = 0.833, q_2 = 0.167
In conclusion, in the long term about 83% of days are sunny.
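The steady-state vector can be computed either from the eigenvector with eigenvalue 1 or by simply iterating the chain; both give q ≈ (0.833, 0.167). A minimal sketch:

```python
import numpy as np

P = np.array([[0.9, 0.5],
              [0.1, 0.5]])

# Eigenvector route: q solves Pq = q, normalized so its entries sum to 1
vals, vecs = np.linalg.eig(P)
q = vecs[:, np.argmin(np.abs(vals - 1.0))].real
q = q / q.sum()
print(q)   # ~ [0.833 0.167]

# Iteration route: P^n x(0) approaches q from any starting distribution
x = np.array([0.0, 1.0])   # start from a rainy day instead
for _ in range(50):
    x = P @ x
print(x)
```

Both routes agree, illustrating that the limit is independent of the initial weather.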
30. For a homogeneous Markov chain:
The statistics of any order can be determined in terms of the conditional PDF f(X_n | X_{n−1}) and f(X_n), since

f(X_n | X_{n−1}, · · · , X_0) = f(X_n | X_{n−1})

If X_n is stationary, then f(X_n) and f(X_n | X_{n−1}) are invariant to a shift of origin. A Markov process is homogeneous if the conditional PDF f(X_n | X_{n−1}) is invariant to a shift of origin.
31. Chapman-Kolmogorov Equation

f(x_n | x_s) = ∫_{−∞}^{∞} f(x_n | x_r) f(x_r | x_s) dx_r

which gives the transitional densities of a Markov sequence. Here n > r > s are any integers.
32. Continuous time-discrete state Markov chain (CTDSMC)
A CTDSMC is a Markov process X(t) consisting of a family of staircase functions (discrete states) with discontinuities at random times t_n.
35. X(t) at these random points forms a discrete-state Markov sequence called the Markov chain imbedded in the process X(t).
A discrete-state stochastic process is called semi-Markov if it is not Markov but the imbedded sequence q_n is a Markov chain.
36. P_i(t) = P{X(t) = a_i}, state probabilities
π_ij(t_1, t_2) = P{X(t_2) = a_j | X(t_1) = a_i}, transition probabilities

∑_j π_ij(t_1, t_2) = 1,  ∑_i P_i(t_1) π_ij(t_1, t_2) = P_j(t_2)
37. A discrete-time Markov chain is a Markov chain in which X_n has a countable number of states a_i. This kind of Markov process is specified by:
38. state probabilities:
P_i(n) = P{X_n = a_i}, i = 1, 2, · · ·
transition probabilities:
π_ij(n_1, n_2) = P{X_{n_2} = a_j | X_{n_1} = a_i}

∑_j π_ij(n_1, n_2) = 1,  P{X_{n_2} = a_j} = ∑_i P{X_{n_2} = a_j, X_{n_1} = a_i}
40. If X_n is homogeneous, then the transition probabilities depend only on m = n_2 − n_1 ⇒

π_ij(n_1, n_2) = π_ij(m) = P{X_{n+m} = a_j | X_n = a_i}

and the Chapman-Kolmogorov equation becomes:

π_ij(n_1, n_3) = ∑_r π_ir(n_1, n_2) π_rj(n_2, n_3) ⇒
41. Written element-wise and as matrices:

Π_ij(n + k) = ∑_r Π_ir(k) Π_rj(n)
Π(n + k) = Π(k) Π(n)

where Π(n + k) is the transition matrix over n + k steps.
42. For a homogeneous discrete Markov chain

P{X_n = a_j} = ∑_i P{X_k = a_i} π_ij(n − k) ⇒ P(n) = P(0) Π^n

Π = [ π_11 π_12 · · · π_1N
      π_21 π_22 · · · π_2N
        ·    ·  · · ·   ·
      π_N1 π_N2 · · · π_NN ],   ∑_j π_ij = 1

Π^n is the n-step state transition matrix.
54. E{X(t_1)X(t_2)} = R_XX(t_1, t_2)
By using Π′(τ) = Π(τ)Λ we have

R_XX(t_1, t_2) = A^2 [ P{X(t + τ) = A, X(t) = A} + P{X(t + τ) = −A, X(t) = −A}
                     − P{X(t + τ) = −A, X(t) = A} − P{X(t + τ) = A, X(t) = −A} ]
55. Then we conclude that the telegraph signal is a nonstationary signal. But as t → ∞, E{X(t)} is constant and E{X(t_1)X(t_2)} is only a function of the time lag.
56. Cayley-Hamilton theorem:

Δ(λ) = |A − λI| = (−λ)^n + ∑_{i=0}^{n−1} c_i λ^i = characteristic polynomial

The theorem states that A satisfies its own characteristic equation:

Δ(A) = (−1)^n A^n + ∑_{i=0}^{n−1} c_i A^i = 0

With the diagonalization A = M Λ M^{−1} ⇒ A^k = M Λ^k M^{−1} ⇒

Δ(A) = M [ (−1)^n Λ^n + ∑_{i=0}^{n−1} c_i Λ^i ] M^{−1}
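The theorem is easy to verify numerically for a small matrix. The sketch below builds the characteristic polynomial with np.poly and checks that substituting A gives (numerically) the zero matrix; the example matrix is arbitrary:

```python
import numpy as np

A = np.array([[0.9, 0.5],
              [0.1, 0.5]])
n = A.shape[0]

# Coefficients of det(lambda*I - A), highest power first: [1, c_{n-1}, ..., c_0]
coeffs = np.poly(A)

# Cayley-Hamilton: substituting A for lambda must give the zero matrix
Delta_A = sum(c * np.linalg.matrix_power(A, n - i) for i, c in enumerate(coeffs))
print(np.abs(Delta_A).max())   # ~ 0
```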
75. If for n → ∞,

P(n) = P(0) Π^n

has a limit which is independent of the initial state of the Markov chain, then the Markov chain is called ergodic and irreducible. In general, for an irreducible, ergodic Markov chain:

lim_{n→∞} π_ij(n) = P_j

These steady state probabilities P_j satisfy

∑_j P_j = 1,   P_j = ∑_i P_i π_ij
76. Birth-Death process:
A birth is referred to as an arrival, and a death as a departure from a physical system.
X(t) is the number of customers in the system at time t.
States = 0, 1, 2, · · · , j, j + 1, · · ·
S is the number of servers.
P_j(t) = P{X(t) = j}
P_j = lim_{t→∞} P_j(t)
λ_n : mean arrival rate given n customers are in the system
µ_n : mean service rate given n customers are in the system
77. A birth-death process can be used to describe how X(t) changes through time.
The basic assumption is Poisson arrivals, so that the probability of more than one birth or death at the same instant is zero.
Because of Poisson arrivals, when X(t) = j, the PDF of the time to the next birth (arrival) is exponential with parameter λ_j.
80. Specific scenarios:
λ_j = λ;  µ = mean service rate per busy server.
µ_j = sµ, j ≥ s
µ_j = jµ, j ≤ s
81. In queueing theory, the inter-arrival times (i.e. the times between customers entering the system) are often modeled as exponentially distributed variables. The length of a process that can be thought of as a sequence of several independent tasks is better modeled by a variable following the gamma distribution (which is a sum of several independent exponentially distributed variables).
82. The exponential distribution is used to model Poisson processes, which are situations in which an object initially in state A can change to state B with constant probability per unit time λ. The time at which the state actually changes is described by an exponential random variable with parameter λ. Therefore, the integral of the PDF from 0 to T is the probability that the object is in state B at time T.
83. An important property of the exponential distribution is that it is memoryless. This means that if a random variable T is exponentially distributed, its conditional probability obeys

P(T > s + t | T > t) = P(T > s), ∀ s, t ≥ 0.
84. This says that the conditional probability that we need to wait, for example, more than another 10 seconds before the first arrival, given that the first arrival has not yet happened after 30 seconds, is no different from the initial probability that we need to wait more than 10 seconds for the first arrival. This is often misunderstood by students taking courses on probability: the fact that P(T > 40 | T > 30) = P(T > 10) does not mean that the events T > 40 and T > 30 are independent.
85. To summarize: “memorylessness” of the probability distribution of the waiting time T until the first arrival means

P(T > 40 | T > 30) = P(T > 10)

It does not mean

P(T > 40 | T > 30) = P(T > 40)
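The distinction is easy to check numerically; the rate λ below is an illustrative value:

```python
import math

lam = 0.05   # arrival rate; illustrative value

def surv(t):
    """P(T > t) for an exponential random variable T with rate lam."""
    return math.exp(-lam * t)

# Memorylessness: P(T > 40 | T > 30) = P(T > 40)/P(T > 30) equals P(T > 10)
lhs = surv(40) / surv(30)
print(lhs, surv(10))    # the two values agree

# ...whereas P(T > 40) itself is much smaller than P(T > 10)
print(surv(40), surv(10))
```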
86. The arrival and service processes follow the following PDFs:

f_a(t) = λ e^{−λt},  f_s(t) = µ e^{−µt}

The inter-arrival and service times in a busy channel produce rates λ and µ.
Mean inter-arrival time = 1/λ
Mean time for a busy channel to complete service = 1/µ
87. Expected number of customers in the queueing system:

L = ∑_{j=0}^{∞} j P_j

Expected queue length:

L_q = ∑_{j=s}^{∞} (j − s) P_j

W = expected waiting time in the system. W_q = expected waiting time in the queue (excluding service time).

L = λW,  L_q = λW_q,  W = W_q + 1/µ
88. If the λ_j are not all equal, λ̄ replaces λ:

λ̄ = ∑_{j=0}^{∞} λ_j P_j

System utilization factor:

ρ = λ / (sµ)

ρ is the fraction of time that the servers are busy.
89. An important case, s = 1:
We assume an unlimited queue length with exponential inter-arrivals and
λ_0 = λ_1 = λ_2 = · · · = λ
We also assume the service times are independent with exponential distribution, and
µ_1 = µ_2 = µ_3 = · · · = µ ⇒

C_j = (λ/µ)^j = ρ^j,  j = 1, 2, · · ·
90. P_j = ρ^j P_0,  P_0 = 1 / (1 + ∑_{j=1}^{∞} ρ^j) = 1 − ρ

The steady state probabilities:

P_j = (1 − ρ) ρ^j,  j = 0, 1, 2, · · ·

P_j, the probability that there are j customers in the system, follows a geometric distribution.
91. The exponential distribution may be viewed as a
continuous counterpart of the geometric distribution, which
describes the number of Bernoulli trials necessary for a
discrete process to change state. In contrast, the
exponential distribution describes the time for a continuous
process to change state.
92. Expected number of customers in the system:

L = ∑_{j=0}^{∞} j P_j = ∑_{j=0}^{∞} j (1 − ρ) ρ^j = (1 − ρ) ρ ∑_{j=0}^{∞} (d/dρ) ρ^j = ρ / (1 − ρ)

Expected queue length:

L_q = ∑_{j=1}^{∞} (j − 1) P_j = L − (1 − P_0) = λ^2 / (µ(µ − λ))
93. Expected waiting time in the system:

W = L/λ = ρ / (λ(1 − ρ)) = 1 / (µ − λ)

Expected waiting time in the queue:

W_q = L_q/λ = λ / (µ(µ − λ))
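For concreteness, the four M/M/1 measures can be evaluated together and cross-checked against L = λW, L_q = λW_q and W = W_q + 1/µ; the rates below are example values, not from the slides:

```python
# Plugging numbers into the M/M/1 formulas; lam and mu are example rates
lam, mu = 3.0, 5.0      # requires lam < mu for a stable queue
rho = lam / mu

L  = rho / (1 - rho)                # expected number in the system
Lq = lam**2 / (mu * (mu - lam))     # expected queue length
W  = 1 / (mu - lam)                 # expected time in the system
Wq = lam / (mu * (mu - lam))        # expected time in the queue

# Consistency with L = lam*W, Lq = lam*Wq and W = Wq + 1/mu
print(L, lam * W)       # equal
print(Lq, lam * Wq)     # equal
print(W, Wq + 1 / mu)   # equal
```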
94. Queueing Theory Basics
A good understanding of the relationship between congestion and delay is essential for designing effective congestion control algorithms. Queueing theory provides all the tools needed for this analysis.
95. Communication Delays
Let's examine the different components of delay in a messaging system. The total delay experienced by messages can be classified into the following categories:
Processing Delay
Queuing Delay
Transmission Delay
Propagation Delay
Retransmission Delay
96. Processing Delay
This is the delay between the time of receipt of a
packet for transmission to the point of putting it into the
transmission queue.
On the receive end, it is the delay between the time of
reception of a packet in the receive queue to the point
of actual processing of the message.
This delay depends on the CPU speed and CPU load
in the system.
97. Queuing Delay
This is the delay between the point of entry of a packet
in the transmit queue to the actual point of
transmission of the message.
This delay depends on the load on the communication
link.
98. Transmission Delay
This is the delay between the transmission of the first bit of the packet and the transmission of the last bit.
This delay depends on the speed of the communication link.
99. Propagation Delay
This is the delay between the transmission of the last bit of the packet and the reception of the last bit at the other end.
This delay depends on the physical characteristics of the communication link.
100. Retransmission Delay
This is the delay that results when a packet is lost and
has to be retransmitted.
This delay depends on the error rate on the link and
the protocol used for retransmissions.
101. We will be dealing primarily with queueing delay.
Little's Theorem
Little's theorem states that the average number of customers (N) can be determined from the following equation:

N = λT

λ is the average customer arrival rate.
T is the average time a customer spends in the system.
102. We will focus on an intuitive understanding of the result. Consider the example of a restaurant where the customer arrival rate (λ) doubles but the customers still spend the same amount of time in the restaurant (T). This will double the number of customers in the restaurant (N). By the same logic, if the customer arrival rate remains the same but the customers' time in the restaurant doubles, this will also double the total number of customers in the restaurant.
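A small discrete-event sketch makes the theorem concrete: simulate a single-server FIFO queue with exponential inter-arrival and service times (the rates are chosen arbitrarily) and compare the time-averaged number of customers in the system with λT:

```python
import numpy as np

rng = np.random.default_rng(7)
lam, mu = 3.0, 5.0       # example arrival and service rates
n = 100_000              # number of simulated customers

arrivals = np.cumsum(rng.exponential(1 / lam, n))
service = rng.exponential(1 / mu, n)

# FIFO single server: departure_k = max(arrival_k, departure_{k-1}) + service_k
dep = np.empty(n)
last = 0.0
for k in range(n):
    last = max(arrivals[k], last) + service[k]
    dep[k] = last

sojourn = dep - arrivals
T = sojourn.mean()            # average time a customer spends in the system
N = sojourn.sum() / dep[-1]   # time-averaged number of customers in the system

print(N, lam * T)             # close, as Little's theorem predicts
```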
103. Queueing System Classification
With Little's Theorem, we have developed some basic understanding of a queueing system. To further our understanding we will have to dig deeper into the characteristics of a queueing system that impact its performance.
104. For example, queueing requirements of a restaurant will
depend upon factors like:
How do customers arrive in the restaurant? Are
customer arrivals more during lunch and dinner time (a
regular restaurant)? Or is the customer traffic more
uniformly distributed (a cafe)?
How much time do customers spend in the restaurant?
Do customers typically leave the restaurant in a fixed
amount of time? Does the customer service time vary
with the type of customer?
How many tables does the restaurant have for
servicing customers?
105. The above three points correspond to the most important characteristics of a queueing system:
Arrival Process
Service Process
Number of Servers
106. Arrival Process
The probability density distribution that determines the
customer arrivals in the system.
In a messaging system, this refers to the message
arrival probability distribution.
107. Service Process
The probability density distribution that determines the customer service times in the system.
In a messaging system, this refers to the message transmission time distribution. Since message transmission time is directly proportional to the length of the message, this parameter indirectly refers to the message length distribution.
108. Number of Servers
Number of servers available to service the customers.
In a messaging system, this refers to the number of links between the source and destination nodes.
109. Based on the above characteristics, queueing systems can
be classified by the following convention:
A/S/n
Where A is the arrival process, S is the service process
and n is the number of servers. A and S can be any of the
following:
M (Markov) Exponential probability density
D (Deterministic) All customers have the same value
G (General) Any arbitrary probability distribution
110. Examples of queueing systems that can be defined with
this convention are:
1. M/M/1
2. M/D/n
3. G/G/n
111. M/M/1:
This is the simplest queueing system to analyze. Here the arrivals form a Poisson process and the service times are exponentially distributed. The system consists of only one server. This queueing system can be applied to a wide variety of problems, as any system with a very large number of independent customers can be approximated as a Poisson process. The exponential assumption for service times, however, is not applicable in many applications and is only a crude approximation.
112. M/D/n:
Here the arrival process is Poisson and the service time distribution is deterministic. The system has n servers (e.g. a ticket booking counter with n cashiers, where the service time can be assumed to be the same for all customers).
113. G/G/n:
This is the most general queueing system, where the arrival and service time processes are both arbitrary. The system has n servers. No analytical solution is known for this queueing system.
114. M/M/1 Queueing System
M/M/1 refers to Poisson arrivals and exponentially distributed service times with a single server. This is the most widely used queueing system in analysis, as pretty much everything is known about it. M/M/1 is a good approximation for a large number of queueing systems.
115. Poisson Arrivals
M/M/1 queueing systems assume a Poisson arrival process. This assumption is a very good approximation for the arrival process in real systems that meet the following rules:
1. The number of customers in the system is very large.
2. The impact of a single customer on the performance of the system is very small, i.e. a single customer consumes a very small percentage of the system resources.
3. All customers are independent, i.e. their decision to use the system is independent of other users.
116. Example: Cars on a Highway
As you can see, these assumptions are fairly general, so they apply to a large variety of systems. Let's consider the example of cars entering a highway and see if the above rules are met.
1. The total number of cars driving on the highway is very large.
2. A single car uses a very small percentage of the highway resources.
3. The decision to enter the highway is independently made by each car driver.
117. The above observations mean that assuming a Poisson arrival process will be a good approximation of the car arrivals on the highway. If any one of the three conditions is not met, we cannot assume Poisson arrivals. For example, if a car rally is being conducted on a highway, we cannot assume that each car driver is independent of the others. In this case all cars have a common reason to enter the highway (the start of the race).
118. Another Example: Telephony Arrivals
Consider the arrival of telephone calls at a telephone exchange. Putting our rules to the test, we find:
1. The total number of customers served by a telephone exchange is very large.
2. A single telephone call takes a very small fraction of the system's resources.
3. The decision to make a telephone call is independently made by each customer.
119. Again, if all the rules are not met, we cannot assume telephone arrivals are Poisson. If the telephone exchange is a Private Branch eXchange (PBX) catering to a few subscribers, the total number of customers is small, so we cannot assume that rules 1 and 2 apply. If rules 1 and 2 do apply but telephone calls are being initiated due to some disaster, calls cannot be considered independent of each other. This violates rule 3.
120. Poisson Arrival Process

P_n(t) = ((λt)^n / n!) e^{−λt}

This equation describes the probability of seeing n arrivals in a period from 0 to t, where:
t is used to define the interval 0 to t
n is the total number of arrivals in the interval 0 to t
λ is the total average arrival rate in arrivals/sec
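The formula can be evaluated directly; the rate and interval below are illustrative values:

```python
import math

lam = 2.0    # average arrival rate (arrivals/sec); illustrative
t = 3.25     # interval length in seconds, so lam*t = 6.5 expected arrivals

def p_arrivals(n, lam, t):
    """Probability of exactly n arrivals in [0, t] for a Poisson process."""
    return (lam * t) ** n / math.factorial(n) * math.exp(-lam * t)

probs = [p_arrivals(n, lam, t) for n in range(40)]
print(sum(probs))                                 # ~1.0: a valid distribution
print(max(range(40), key=lambda n: probs[n]))     # most likely count: 6 (near lam*t)
```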
121. Poisson Service Times
In an M/M/1 queueing system we assume that service times for customers are also exponentially distributed (i.e. generated by a Poisson process). Unfortunately, this assumption is not as general as the arrival time distribution, but it can still be a reasonable assumption when no other data is available about service times. Let's see a few examples:
122. Telephone Call Durations
Telephone call durations define the service time for utilization of various resources in a telephone exchange. Let's see if telephone call durations can be assumed to be exponentially distributed.
1. The total number of customers served by a telephone exchange is very large.
2. A single telephone call takes a very small fraction of the system's resources.
3. The decision on how long to talk is independently made by each customer.
123. From these rules it appears that exponential call hold times are a good fit. Intuitively, the probability of a customer making a very long call is very small, and there is a high probability that a telephone call will be short. This matches the observation that most telephony traffic consists of short-duration calls. (The only problem with using the exponential distribution is that it predicts a high probability of extremely short calls.)