Probability distribution of active calls at a telephone exchange

#1
Could someone please help me with this exercise from my book:

Suppose that calls arrive at a telephone exchange according to a Poisson process with parameter lambda. If the lengths of the various calls are identical and independent random variables, each with cumulative distribution function F(x), find the probability distribution of the number of calls in progress at any time.

I really don't know how to relate the continuous distribution to a discrete one. I tried to work it out the same way you obtain P(n) in an M/M/1 queue, and came up with something like P(n) = (1-rho)*rho^n, where rho = lambda * E(f). Then I wrote a Java program to simulate the situation and compare the simulation frequencies with this result, and they're rather different. Moreover, the book proposes the exercise after a chapter on single-variable probability distributions, with no mention of queueing theory or Markov processes (I know about those because I'm brushing up on my statistics).

I'm pretty sure I'm missing something simple. Any ideas, please? Thanks in advance.
 

BGM

TS Contributor
#2
Just Google some material related to this, for example:

http://www.maths.qmul.ac.uk/~ig/MAS338/PP and uniform d-n.pdf

If we let \( X(t) \) be the Poisson process with rate \( \lambda \),
and \( W_k \) be the \( k \)th arrival time and \( Y_k \) be the corresponding length of call, \( k = 1, 2, 3, \ldots \),
then the number of calls in progress at a certain time \( t \), \( Z(t) \) can be expressed as

\( Z(t) = \sum_{k=1}^{X(t)} \mathbf{1}\{W_k + Y_k > t\} \)

where as usual we take \( Z(t) = 0 \) when \( X(t) = 0 \)

To find the distribution, we need to compute the pmf of \( Z(t) \):

\( \Pr\{Z(t) = z\} \)

\( = \Pr\left\{\sum_{k=1}^{X(t)} \mathbf{1}\{W_k + Y_k > t\} = z\right\} \)

\( = \sum_{n=z}^{+\infty} \Pr\left\{\sum_{k=1}^{X(t)} \mathbf{1}\{W_k + Y_k > t\} = z\Bigg|X(t) = n\right\}\Pr\{X(t) = n\} \)

\( = \sum_{n=z}^{+\infty} \Pr\left\{\sum_{k=1}^{n} \mathbf{1}\{U_k + Y_k > t\} = z\right\}\Pr\{X(t) = n\} \)

(here we use the fact that, conditional on \( X(t) = n \), the arrival times \( W_1, \ldots, W_n \) are distributed as the order statistics of \( n \) i.i.d. \( \text{Uniform}(0, t) \) random variables; since the sum of indicators is symmetric in its terms, the ordered \( W_k \) can be replaced by unordered i.i.d. \( U_k \sim \text{Uniform}(0, t) \) without changing the distribution of the sum)

\( = \sum_{n=z}^{+\infty} \frac {n!} {z!(n - z)!} p(t)^z(1 - p(t))^{n-z}
e^{-\lambda t}\frac {(\lambda t)^n} {n!} \)

\( = e^{-\lambda t} \frac {(\lambda tp(t))^z} {z!}
\sum_{n=z}^{+\infty} \frac {(\lambda t(1 - p(t)))^{n-z}} {(n - z)!}
\)

\( = e^{-\lambda t} \frac {(\lambda tp(t))^z} {z!} e^{\lambda t(1 - p(t))}\)

\( = e^{-\lambda tp(t)} \frac {(\lambda tp(t))^z} {z!}, z = 0, 1, 2, \ldots \)

So you see \( Z(t) \sim \text{Poisson}(\lambda t p(t)) \),

where \( p(t) = \Pr\{U_k + Y_k > t\} = \int_0^t \Pr\{Y_k > t - u\}\frac {1} {t} du = \frac {1} {t} \int_0^t [1 - F(t - u)]du \)

as \( U_k \sim \text{Uniform}(0, t) \)
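
For a concrete illustration, suppose the call lengths are exponential with rate \( \mu \) (my assumption, not part of the exercise), so \( 1 - F(x) = e^{-\mu x} \). Then

\( p(t) = \frac{1}{t} \int_0^t e^{-\mu(t - u)}\,du = \frac{1 - e^{-\mu t}}{\mu t} \)

so \( \lambda t p(t) = \frac{\lambda}{\mu}\left(1 - e^{-\mu t}\right) \) and \( Z(t) \sim \text{Poisson}\!\left(\frac{\lambda}{\mu}(1 - e^{-\mu t})\right) \).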

This should be a very typical example (the \( M/G/\infty \) queue) in many introductory stochastic process textbooks.
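
In the spirit of the OP's Java simulation, here is a minimal, self-contained sketch checking this result. The class name, parameter values, and the exponential call-length assumption (under which \( \lambda t p(t) = \frac{\lambda}{\mu}(1 - e^{-\mu t}) \)) are all mine, for illustration only:

```java
import java.util.Random;

// Monte Carlo sketch checking Z(t) ~ Poisson(lambda * t * p(t)).
// Call lengths are exponential with rate mu purely for illustration,
// in which case lambda * t * p(t) = (lambda / mu) * (1 - exp(-mu * t)).
public class CallsInProgressSim {

    // Mean number of calls in progress at time t, averaged over many runs.
    static double simulatedMean(double lambda, double mu, double t,
                                int runs, long seed) {
        Random rng = new Random(seed);
        long total = 0;
        for (int r = 0; r < runs; r++) {
            // arrivals on (0, t] via exponential inter-arrival times
            double w = -Math.log(1 - rng.nextDouble()) / lambda;
            while (w < t) {
                double y = -Math.log(1 - rng.nextDouble()) / mu; // call length
                if (w + y > t) total++;                          // still active at t
                w += -Math.log(1 - rng.nextDouble()) / lambda;   // next arrival
            }
        }
        return (double) total / runs;
    }

    // Theoretical mean lambda * t * p(t) for exponential call lengths.
    static double theoreticalMean(double lambda, double mu, double t) {
        return lambda / mu * (1 - Math.exp(-mu * t));
    }

    public static void main(String[] args) {
        double sim = simulatedMean(2.0, 1.0, 5.0, 200_000, 42L);
        double theory = theoreticalMean(2.0, 1.0, 5.0);
        System.out.printf("simulated mean = %.3f, theoretical = %.3f%n", sim, theory);
    }
}
```

With 200,000 runs the Monte Carlo error on the mean is tiny, so the two printed values should agree to roughly two decimal places.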
 

Dason

Ambassador to the humans
#3
If we let \( X(t) \) be the Poisson process with rate \( \lambda \),
and \( W_k \) be the \( k \)th arrival time and \( Y_k \) be the corresponding length of call, \( k = 1, 2, 3, \ldots \),
then the number of calls in progress at a certain time \( t \), \( Z(t) \) can be expressed as

\( Z(t) = \sum_{k=1}^{X(t)} \mathbf{1}\{W_k + Y_k > t\} \)
I'm not sure I follow that. Wouldn't

\(Z(t) = \sum_{k=1}^{X(t)} \mathbf{1}\{W_k \leq t < W_k + Y_k\}\)

since a call won't be in progress if the time is greater than the arrival time plus the length of the call. We would need the call to have started (\( W_k \leq t \)) but not have ended yet, right?
 
#4
Thanks a lot BGM, that's great. What I'm not clear on is whether having p(t) in the result implies that the number of active calls depends on the instant you observe. Intuitively, I would expect their distribution to be the same at all times, at least after an initial bootstrap. I can't find a way to see that t*p(t) becomes a constant as t -> oo. Sorry for being naive...
 

BGM

TS Contributor
#5
To Dason: Yes but the condition \( W_k \leq t \) is implicitly assured by the upper bound of summation \( X(t) \), i.e.

\( 0 < W_1 < W_2 < \ldots < W_{X(t)} < t \)

To zakmck: I have not quite understood what you mean yet. And I am not sure why it is important that

\( \lim_{t\to +\infty} tp(t) \) exists.

Anyway we may express \( p(t) \) in another form:

\( p(t) = \Pr\{U_k + Y_k > t\} \)

\( = \int_0^{+\infty} \Pr\{U_k > t - y\}dF(y) \)

\( = \int_0^t \Pr\{U_k > t - y\}dF(y) + \int_t^{+\infty}dF(y) \)

\( = \frac {1} {t} \int_0^t ydF(y) + 1 - F(t) \)

Then \( tp(t) = \int_0^t ydF(y) + t\Pr\{Y_k > t\} \)

the limit \( \lim_{t\to +\infty} tp(t) \) exists whenever the first moment \( E[Y_k] \) is finite, since then \( t\Pr\{Y_k > t\} \leq \int_t^{+\infty} y\,dF(y) \to 0 \), and in that case

\( \lim_{t\to +\infty} tp(t) = \int_0^{+\infty} ydF(y) = E[Y_k] \)
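
To see this concretely, take the call lengths exponential with rate \( \mu \) (an illustrative assumption of mine), so \( dF(y) = \mu e^{-\mu y}\,dy \):

\( tp(t) = \int_0^t y\,\mu e^{-\mu y}\,dy + t e^{-\mu t} = \frac{1}{\mu} - \left(t + \frac{1}{\mu}\right)e^{-\mu t} + t e^{-\mu t} = \frac{1 - e^{-\mu t}}{\mu} \)

which increases to \( 1/\mu = E[Y_k] \) as \( t \to +\infty \). So after a transient on the order of a few mean call lengths, \( Z(t) \) is effectively \( \text{Poisson}(\lambda E[Y_k]) \), independent of the instant you observe.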
 
#6
Thanks a lot BGM! I was looking for exactly that. I had arrived at the same result by treating the phone exchange like a queue and using E(Yk) in place of 1/mu. The only problem was that a bug in my simulation program misled me by showing different simulated and theoretical distributions.

Now it works and I'm attaching it here, in case anyone is further interested. It uses JFreeChart and Apache Commons Math, both excellent libraries.
 

Dason

Ambassador to the humans
#7
To Dason: Yes but the condition \( W_k \leq t \) is implicitly assured by the upper bound of summation \( X(t) \), i.e.

\( 0 < W_1 < W_2 < \ldots < W_{X(t)} < t \)
Ah - subtle. Thanks for clarifying - that makes a lot of sense.