Combining estimators

The independent random variables X and Y each have expected value mu, but different variances, sigma^2 and r^2 respectively.

If the linear combination Z = aX + bY is to be used as an estimator of mu, explain what relationship between a and b is necessary for this estimator to be unbiased.

I have done this part: a + b = 1.

Obtain as simple an expression as possible for the variance of any unbiased estimator that is a linear combination of X and Y.

I have done this too: since X and Y are independent, the variance of Z is a^2*sigma^2 + b^2*r^2.

Use this to prove that the unbiased estimator of this form with smallest standard error has:

a = r^2/(sigma^2 + r^2)

b = sigma^2/(sigma^2 + r^2)

and variance: sigma^2*r^2/(sigma^2 + r^2)
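To sanity-check the claimed optimum numerically, here is a quick Monte Carlo sketch (the particular values mu = 5, sigma = 2, r = 1 are illustrative, and normality is assumed only for convenience of simulation):

```python
import random

def simulate_var(a, b, mu=5.0, sigma=2.0, r=1.0, n=200_000):
    # Empirical variance of Z = a*X + b*Y with X ~ N(mu, sigma^2), Y ~ N(mu, r^2)
    zs = [a * random.gauss(mu, sigma) + b * random.gauss(mu, r) for _ in range(n)]
    mean = sum(zs) / n
    return sum((z - mean) ** 2 for z in zs) / n

sigma, r = 2.0, 1.0
a_opt = r**2 / (sigma**2 + r**2)     # claimed optimal weights
b_opt = sigma**2 / (sigma**2 + r**2)

# Should be close to sigma^2*r^2/(sigma^2 + r^2) = 4/5 = 0.8
print(simulate_var(a_opt, b_opt))
# Any other unbiased choice should do worse, e.g. equal weights give 1.25
print(simulate_var(0.5, 0.5))
```

With these parameters the optimal weights give variance about 0.8, while the naive equal-weight average of the two estimators gives about 1.25.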

I have no idea how to do this last part, any hints would be appreciated.



In the formula:
a^2*sigma^2 + b^2*r^2

put b = 1-a

a^2*sigma^2 + (1-a)^2*r^2

Take the derivative with respect to a and set it equal to zero to find the value of a that gives minimum variance:

2a*sigma^2+2(a-1)r^2 = 0

i.e. 2a(sigma^2 + r^2) = 2r^2, which gives a = r^2/(sigma^2 + r^2). Q.E.D.

(The second derivative, 2(sigma^2 + r^2), is positive, so this critical point is indeed a minimum.)

From there, you can probably find out the rest by yourself without much trouble.
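The derivative step can also be checked symbolically; this sketch uses SymPy (symbol names are just the ones from the thread):

```python
import sympy as sp

a, sigma, r = sp.symbols('a sigma r', positive=True)

# Variance of the unbiased estimator after substituting b = 1 - a
V = a**2 * sigma**2 + (1 - a)**2 * r**2

# Solve dV/da = 0 for a: should give r^2/(sigma^2 + r^2)
a_star = sp.solve(sp.Eq(sp.diff(V, a), 0), a)[0]
print(a_star)

# Second derivative is 2*(sigma^2 + r^2) > 0, confirming a minimum
print(sp.diff(V, a, 2))
```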