Upper Bound Error for Expectation Approximation?

arbab61

New Member
Dear all,

I have been stuck on a problem for the past few weeks and couldn't find the answer in books or on the internet. I know that in general E(g(X1, X2, ...)) is not equal to g(E(X1), E(X2), ...), where E() is the expectation operator and g() is a function. For example, E(XY) is not equal to E(X)E(Y), and we tolerate an error of Cov(X, Y) by making this approximation. What is the error in the general case I described above? I am forced to use such an approximation in a problem and I want to know how much error I am committing. I really appreciate your help and suggestions; I am really stuck!

Best Regards,
Mohammad

Dason

Ambassador to the humans
Cov(X, Y) = E[XY] - E[X]E[Y]

so

E[XY] = E[X]E[Y] + Cov(X,Y)

There is no theoretical upper bound on the covariance without additional information (with finite variances, Cauchy–Schwarz gives |Cov(X, Y)| ≤ sqrt(Var(X) Var(Y))), so I'm not sure where else to go from there.
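That identity is easy to verify numerically. Here is a minimal Monte Carlo sketch (my own toy example, not from the thread: Y = X + noise, so Cov(X, Y) = Var(X) = 1) showing that the gap between E[XY] and E[X]E[Y] matches the sample covariance:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Toy correlated pair: Y = X + independent noise, so Cov(X, Y) = Var(X) = 1.
x = rng.normal(0.0, 1.0, n)
y = x + rng.normal(0.0, 1.0, n)

e_xy = np.mean(x * y)               # Monte Carlo estimate of E[XY]
e_x_e_y = np.mean(x) * np.mean(y)   # E[X] times E[Y]
cov = np.cov(x, y)[0, 1]            # sample Cov(X, Y), close to 1 here

# The approximation error E[XY] - E[X]E[Y] equals the covariance.
print(e_xy - e_x_e_y, cov)
```

So the "error" of the product-of-means approximation is exactly the covariance, as the identity says; the simulation just confirms it.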

arbab61

New Member
Thanks for the response. What I mean is that if I approximate E[XY] with E[X]E[Y], I am tolerating an error of Cov(X, Y) (based on the equation you provided). My question is about the general case of E[g(X1, X2, ...)]: what is the error if we approximate it with g(E(X1), E(X2), ...)?


Dason

Ambassador to the humans
Do you want to keep it completely general or do you know anything else about g?

arbab61

New Member
It is interesting to know the answer in general, but in my particular problem I have:
E[XY/Z], and I want to approximate it with E[X]E[Y]/E[Z], since there is no way to simplify XY/Z and obtain the expectation in closed form. So in my problem g(X, Y, Z) = XY/Z.

Dason

Ambassador to the humans
What do you know about X, Y, Z? Are there any distributional assumptions you make? Are you assuming any independence?

arbab61

New Member
X and Y can be assumed to be independent, but each of them is dependent on Z.
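The size of the error depends heavily on how strongly Z is tied to X and Y, which is easy to probe by simulation. Here is a hypothetical toy setup (the lognormal distributions and the construction Z = X times noise are my own assumptions, not from the thread) where X and Y are independent but Z depends on X:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Hypothetical toy setup: X and Y independent lognormals;
# Z = X times independent lognormal noise, so Z depends on X.
x = rng.lognormal(0.0, 0.3, n)
y = rng.lognormal(0.0, 0.3, n)
z = x * rng.lognormal(0.0, 0.3, n)

exact = np.mean(x * y / z)                      # Monte Carlo E[XY/Z]
approx = np.mean(x) * np.mean(y) / np.mean(z)   # g(E[X], E[Y], E[Z])

print(exact, approx, exact - approx)
```

In this particular setup the plug-in approximation comes out around 1 while the true expectation is near exp(0.09) ≈ 1.09, so the approximation can be off by roughly 10% even with fairly mild dependence; with your own distributions the gap could be larger or smaller.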

arbab61

New Member
Thank you so much. I'll try it.

Maybe you can try a Taylor expansion:

http://en.wikipedia.org/wiki/Taylor_expansions_for_the_moments_of_functions_of_random_variables

I am not sure about the exact assumptions behind the method (e.g. does it require $$g$$ to be an analytic function).

When it holds, g(E(X1), E(X2), ...) is just the zeroth-order approximation for the expectation, and you may try to bound the remainder terms. I just doubt this method can be applied to very general distributions.
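To make the Taylor-expansion idea concrete for g(X, Y, Z) = XY/Z: expanding to second order around the means, the Hessian of g gives the correction E[XY/Z] ≈ μx μy / μz + (μx μy / μz³) Var(Z) + Cov(X, Y)/μz − (μy/μz²) Cov(X, Z) − (μx/μz²) Cov(Y, Z). A sketch below checks this against simulation, reusing a toy lognormal setup of my own choosing (the distributions are assumptions, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Toy setup (assumed): X, Y independent lognormals; Z = X * noise.
x = rng.lognormal(0.0, 0.3, n)
y = rng.lognormal(0.0, 0.3, n)
z = x * rng.lognormal(0.0, 0.3, n)

mx, my, mz = x.mean(), y.mean(), z.mean()
cov = np.cov(np.vstack([x, y, z]))  # 3x3 sample covariance matrix

# Zeroth-order (plug-in) approximation: g evaluated at the means.
approx0 = mx * my / mz

# Second-order Taylor correction from the Hessian of g(x,y,z) = x*y/z:
#   g_zz = 2xy/z^3,  g_xy = 1/z,  g_xz = -y/z^2,  g_yz = -x/z^2
approx2 = (approx0
           + mx * my / mz**3 * cov[2, 2]   # (1/2) g_zz * Var(Z)
           + cov[0, 1] / mz                # g_xy * Cov(X, Y)
           - my / mz**2 * cov[0, 2]        # g_xz * Cov(X, Z)
           - mx / mz**2 * cov[1, 2])       # g_yz * Cov(Y, Z)

truth = np.mean(x * y / z)                 # Monte Carlo reference
print(truth, approx0, approx2)
```

In this toy case the second-order correction recovers most of the plug-in approximation's error, which illustrates the point above: the plug-in value is only the zeroth-order term, and the covariance terms are the leading part of the remainder.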