MLE of a weirdly split lambda in a Poisson distribution

#1
Hi everyone! :) I'm new here.

I have been looking everywhere for the solution to this problem so here's to hoping some of you can help me. :)

The number of cases X of a rare disease is assumed to follow a Poisson distribution, X ~ Poi(lambda).
We want to analyse the prevalence of the disease more closely. To estimate the probability p of having a child with the disease, we observe the number of children born n_i and the number of cases of the disease X_i for every year i, where i = 1, 2, ..., m. We assume that the probability is constant over these m years. Also, n_i is large and p is small, so we can assume that X_i ~ Poi(n_i * p), i = 1, 2, ..., m.
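For anyone who wants to see why the Poisson approximation is reasonable, here is a quick numerical check (the numbers below are made up for illustration, not from the exercise) comparing Binomial(n, p) with Poisson(np):

```python
from math import comb, exp, factorial

# Illustrative values (assumptions, not from the problem): many births, rare disease.
n, p = 50_000, 0.0001
lam = n * p  # Poisson parameter lambda = n * p = 5.0

for x in range(4):
    binom = comb(n, x) * p**x * (1 - p)**(n - x)      # exact Binomial(n, p) pmf
    poisson = exp(-lam) * lam**x / factorial(x)        # Poisson(np) approximation
    print(f"x={x}: binomial={binom:.6f}, poisson={poisson:.6f}")
```

The two columns agree to several decimal places, which is why the X_i ~ Poi(n_i * p) assumption is harmless here.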

So, how do we find the MLE of p? What is it, and what are its expectation and variance?

Thanks in advance for any help! :)
 
#2
This is a standard exercise. Recall that for the Poisson distribution the variance equals the expectation, and both are given by the parameter of the distribution, lambda (λ). You are going to have to show that.

Recall the pmf is given by: \( p(x)=\frac{e^{-\lambda}\lambda^{x}}{x!} \)

Assuming \(n\) independent observations with a common \(\lambda\), the likelihood becomes \(L(\lambda)= \frac{e^{-n \lambda} \lambda ^{\sum_{i=1}^n x_i}}{ \prod_{i=1}^n x_i ! } \)

Take the log, differentiate with respect to \( \lambda \), set the derivative to zero, and solve.
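If you want to sanity-check that recipe numerically (everything below is illustrative: simulated data with a made-up true λ), maximising the log-likelihood on a grid should land exactly on the sample mean:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data (illustrative values, not from the thread).
true_lam, n = 3.0, 200
x = rng.poisson(true_lam, size=n)

def log_lik(lam):
    # log L(lambda) = -n*lambda + log(lambda) * sum(x_i) - sum(log(x_i!))
    # (the last term is constant in lambda, so we drop it)
    return -n * lam + np.log(lam) * x.sum()

# Evaluate on a fine grid and pick the maximiser.
grid = np.linspace(0.5, 6.0, 10_000)
lam_hat_numeric = grid[np.argmax(log_lik(grid))]

print(lam_hat_numeric, x.mean())  # the two should agree closely
```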
 
#3
Right so in my case that would be:

\( L(\lambda) = \frac{e^{-m\lambda}\lambda^{\sum_{i=1}^{m}X_i}}{\prod_{i=1}^{m}X_i!} \)

Taking the log:
\( \ln L(\lambda) = -m\lambda + \ln\lambda \sum_{i=1}^{m}X_i - \ln\left(\prod_{i=1}^{m}X_i!\right) \)

Differentiating:

\(\frac{d\ln L(\lambda)}{d\lambda} = -m+\frac{\sum_{i=1}^mX_i}{\lambda} = 0\)

From which we get the estimator of lambda:

\(\hat{\lambda}=\frac{\sum_{i=1}^mX_i}{m}\)

From the text we have that \(\lambda = n_ip\)

Does this mean that I can say:

\(\hat{\lambda} = n_i\hat{p} = \frac{\sum_{i=1}^mX_i}{m}\)
\(\hat{p} = \frac{\sum_{i=1}^mX_i}{n_im}\)?

Still no clue what the estimator's expectation and variance are, though.
 
#4
Why did you use m instead of n above? For intuition, \( \lambda \) is the average number of events in an interval of given length, and it is approximated by \( np \). Whether you can say that is up to your textbook or your professor, but remember in any case that what you have is an estimator, not the true value. In a more advanced course you will study the Poisson postulates, on which several distributions depend, but for now focus on this case.

Your answer is correct: the MLE of \(\lambda\) is the sample mean. Now, to find the mean and the variance, you basically have to recall that the Maclaurin expansion of \(e^x\) (equivalently, its Taylor expansion at 0) is \( e^x =\sum_{i=0}^{\infty} \frac{x^{i}}{i!} \).
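As a numerical preview of where this is heading (λ = 2.5 is just an illustrative value): the factorial expectation works out to \( E[X(X-1)] = \lambda^2 \), and combining it with \( E[X] = \lambda \) recovers \( \mathrm{Var}(X) = \lambda \):

```python
from math import exp, factorial

lam = 2.5  # illustrative lambda (assumption, not from the thread)

# E[X(X-1)] = sum_{x>=0} x(x-1) e^{-lam} lam^x / x!
# Truncate the series at x = 60; the terms decay factorially fast.
fact_exp = sum(x * (x - 1) * exp(-lam) * lam**x / factorial(x) for x in range(60))
mean = sum(x * exp(-lam) * lam**x / factorial(x) for x in range(60))

# Var[X] = E[X(X-1)] + E[X] - (E[X])^2
var = fact_exp + mean - mean**2

print(fact_exp, lam**2)  # factorial expectation equals lambda^2
print(var, lam)          # variance equals lambda, as post #2 said
```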

Follow this video closely and take notes: http://www.youtube.com/watch?v=65n_v92JZeE. The expectation \( E[X(X-1)] \) is called the factorial expectation and is frequently used with discrete distributions.
 

BGM

TS Contributor
#5
The number of cases X of a rare disease is assumed to follow a Poisson distribution, X ~ Poi(lambda)
We want to analyse the prevalence of the disease closer. To estimate the probability p of having a child with the disease we observe the number of children born n_i and the number of cases of the disease X_i for every year i where i=1,2,...,m. We assume that the probability is constant in these m years. Also, n_i is large and p is small
I think the paragraph above just acts as a prologue. The key point is \( X_i \sim \text{Poisson}(n_ip), i = 1, 2, \ldots, m \), where the \( n_i \) are given constants; the \( X_i \) are independent, although not identically distributed.

Now we can find the MLE from this independent but not identically distributed random sample. The result is very similar in form to a sample mean, and you can calculate its mean and variance afterwards by applying some standard properties.
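To make "very similar to the sample mean" concrete, here is one way the algebra can go (a sketch, not necessarily how your course writes it): with \( \lambda_i = n_i p \), the log-likelihood is \( \ell(p) = -p\sum n_i + \ln p \sum X_i + \text{const} \), which is maximised at \( \hat{p} = \sum X_i / \sum n_i \), i.e. pooled cases over pooled births; then by linearity and independence \( E[\hat{p}] = p \) and \( \mathrm{Var}(\hat{p}) = p/\sum n_i \). A quick simulation (all numbers below are made up for illustration) agrees:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative setup (assumptions, not from the thread): m years, varying births n_i.
n_i = np.array([8_000, 9_500, 10_200, 11_000, 12_300])
true_p = 0.0004

def p_hat(rng):
    x = rng.poisson(n_i * true_p)   # X_i ~ Poisson(n_i * p), independent
    return x.sum() / n_i.sum()      # MLE: pooled cases over pooled births

# Check E[p_hat] ~ p and Var[p_hat] ~ p / sum(n_i) by simulation.
reps = np.array([p_hat(rng) for _ in range(20_000)])
print(reps.mean(), true_p)
print(reps.var(), true_p / n_i.sum())
```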
 
#7
Since you have a time dimension, you will need to adjust the value of \( \lambda \) for each year. This is easily done if, in the MLE, you use \( n_i \), the number of trials for each year. You then obtain \( \lambda_1 ,\lambda_2, \ldots, \lambda_m\).

Now you have to combine all these parameters. In essence, you are seeking the distribution of the sum over all your years, each year having a different parameter λ. Assuming independence, it can be shown that the sum, say Z, also has a Poisson distribution, with parameter equal to \( \sum_{i=1}^m \lambda_i \).

That is your resulting pmf will have the form: \( p(z)= \frac{\left[ \sum_{i=1}^m \lambda_i \right]^z e^{-\sum_{i=1}^m \lambda_i}}{z!}\)
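A quick simulation check of that additivity claim (the per-year \(\lambda_i\) below are made-up values): the empirical pmf of the summed counts should match Poisson(\(\sum \lambda_i\)):

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(7)

# Illustrative per-year parameters (assumptions, not from the thread).
lams = np.array([1.2, 0.8, 2.0, 1.5])
total = lams.sum()  # parameter of the sum: sum of the lambda_i

# Simulate Z = sum of independent Poisson draws, one per year.
z = rng.poisson(lams, size=(100_000, len(lams))).sum(axis=1)

# Compare the empirical pmf of Z with Poisson(total) at a few points.
for k in range(6):
    empirical = (z == k).mean()
    theoretical = exp(-total) * total**k / factorial(k)
    print(f"k={k}: empirical={empirical:.4f}, poisson={theoretical:.4f}")
```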

Does that help?

EDIT: \(\lambda_i\) is not considered known, right?
 