# Variance of a sum of predictions

#### mmercker

##### Member
Hi,

I predict the spatial distribution of a species on a regular grid using a regression model.
Species numbers vary with environmental covariates. Now I want to predict the overall size of the population within a given area, which means technically that I sum up all predicted values. But how can I calculate the standard error / confidence interval for this sum? Just by summing up all pointwise standard errors?

#### Dragan

##### Super Moderator
If you sum up the predicted values (Yhats), this sum will equal the sum of the actual values of the dependent variable (Y). In short, the mean of the predicted values equals the mean of the actual values. (That is the purpose of the intercept term in a regression model.)
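This property follows from the normal equations: with an intercept in the model, the residuals sum to zero, so the fitted values sum to the observed values. A minimal numpy sketch (simulated data, not from the thread) that checks this for ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(size=n)

# Design matrix with an intercept column.
X = np.column_stack([np.ones(n), x])

# Ordinary least squares fit.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat

# With an intercept, the residuals sum to zero, so sum(y_hat) == sum(y).
print(np.isclose(y_hat.sum(), y.sum()))  # True
```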

#### mmercker

##### Member
Sorry, I can't relate this answer to my question. The question is: does it make sense to sum up the pointwise standard deviations of the predictions in order to get the standard deviation of the sum? In other words: my predict function gives me predicted values and standard deviations for different values of the covariates. How do I get the standard error of the sum of these values?

#### ondansetron

##### TS Contributor
> The question is: does it make sense to sum up the pointwise standard deviations of the predictions in order to get the standard deviation of the sum?

#### Dragan

##### Super Moderator
> But how can I calculate the standard error / confidence interval for this sum? Just by summing up all pointwise standard errors?

> How do I get the standard error of the sum of these values?

Your overall query is rather vague. That said, I do know this: you cannot get the standard error by simply summing the standard errors. The way to approach it is to sum the variance estimates and then take the square root of that sum (which gives the standard error you're looking for).
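As a concrete check of the arithmetic (hypothetical standard errors, assuming zero covariances between predictions), a short numpy sketch:

```python
import numpy as np

# Hypothetical pointwise standard errors of four predictions.
se = np.array([0.5, 0.8, 0.3, 0.6])

# Incorrect: summing the standard errors themselves.
naive = se.sum()  # about 2.2

# Correct (when covariances are zero): sum the variances, then take the root.
se_of_sum = np.sqrt(np.sum(se**2))  # about 1.158

print(naive, se_of_sum)
```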


#### ondansetron

##### TS Contributor
> Your overall query is rather vague. That said, I do know this: you cannot get the standard error by simply summing the standard errors. The way to approach it is to sum the variance estimates and then take the square root of that sum (which gives the standard error you're looking for).

Well, you spoiled my surprise for the OP! I was hoping he or she would see that the variances would be additive, rather than the standard deviations/standard errors.

#### Dragan

##### Super Moderator
> Well, you spoiled my surprise for the OP! I was hoping he or she would see that the variances would be additive, rather than the standard deviations/standard errors.

Well, yes, but someone needs to be proactive (with knowledge) to spell it out.

#### ondansetron

##### TS Contributor
> Well, yes, but someone needs to be proactive (with knowledge) to spell it out.

I'd err on the side of teaching a man to fish, I suppose. Different approaches are always good!

#### mmercker

##### Member
Hi, sorry for formulating it badly. It was clear to me that I have to sum up variances instead of standard errors; that was not my concern. My main concern is this: if we sum up the pointwise variances in order to get the variance of the sum, this is (as far as I know) only valid if the covariances are zero. I am used to the concept of covariance for different random variables. However, in the case of predictions from a regression model (where I have only the predicted values and their standard errors for each covariate value), can I always assume that the covariances between different predicted points are zero if I specified the regression model correctly? It is not even clear to me whether and how covariance is defined in this context.
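For ordinary least squares this covariance is well defined and generally not zero: all predictions share the same estimated coefficient vector, so Cov(y_hat_new) = sigma^2 * X_new (X'X)^(-1) X_new'. A minimal numpy sketch (simulated data and a hypothetical prediction grid, not from the thread) that computes this matrix and compares the variance of the sum with and without the covariance terms:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.uniform(-1, 1, size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
X = np.column_stack([np.ones(n), x])

# Ordinary least squares fit.
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - X.shape[1])  # residual variance estimate

# Hypothetical grid of new covariate values at which we predict.
X_new = np.column_stack([np.ones(5), np.linspace(-1, 1, 5)])

# Full covariance matrix of the predicted means:
# Cov(y_hat_new) = sigma^2 * X_new (X'X)^(-1) X_new'
cov_pred = sigma2 * X_new @ XtX_inv @ X_new.T

# Off-diagonal entries are generally nonzero, because every
# prediction is built from the same estimated beta_hat.
print(cov_pred.round(6))

# Variance of the sum of predictions: 1' Cov 1 (includes the covariances).
var_sum_full = np.ones(5) @ cov_pred @ np.ones(5)
# Naive version that ignores the covariances (sum of the diagonal only).
var_sum_naive = np.trace(cov_pred)
print(var_sum_full, var_sum_naive)
```

So summing the pointwise variances is generally not enough for within-sample or grid predictions from the same fitted model; the quadratic form above accounts for the shared coefficient uncertainty.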