I have a multiple **regression-through-the-origin** model. Working in SPSS, I get a highly significant F statistic and all t values are highly significant too, and all the model assumptions hold. In spite of this, I see strong evidence of multicollinearity: all VIFs are above 10 and the condition index exceeds 25.

I suspect that having no intercept creates a problem for the collinearity diagnostics, because when I run auxiliary regressions among the independent variables (including an intercept there) I get very low R-squared values in every case. SPSS computes the auxiliary regressions without an intercept, as in the original model, and reports high levels of multicollinearity.
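To see why the two sets of diagnostics can disagree, here is a minimal sketch (with made-up data, not my actual variables) of VIFs computed from auxiliary regressions with and without an intercept. The key assumption in the sketch is that the no-intercept case uses the *uncentered* R-squared, which is inflated whenever the predictors share a large common mean:

```python
import numpy as np

def vif(X, add_intercept):
    """VIF for each column of X via auxiliary least-squares regressions.

    If add_intercept is True, each auxiliary regression includes a constant
    and R^2 is computed about the mean (centered); otherwise the regression
    is through the origin and R^2 is uncentered.
    """
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        Z = np.delete(X, j, axis=1)
        if add_intercept:
            Z = np.column_stack([np.ones(n), Z])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        ss_res = resid @ resid
        # R^2 must match the model: centered with an intercept, uncentered without
        ss_tot = ((y - y.mean()) ** 2).sum() if add_intercept else (y ** 2).sum()
        r2 = 1.0 - ss_res / ss_tot
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

# Hypothetical data: two predictors that are nearly uncorrelated around
# their means but share a large common level (mean ~ 10).
rng = np.random.default_rng(0)
x1 = 10 + rng.normal(size=200)
x2 = 10 + rng.normal(size=200)
X = np.column_stack([x1, x2])

print(vif(X, add_intercept=True))   # near 1: the deviations are uncorrelated
print(vif(X, add_intercept=False))  # large: uncentered R^2 is close to 1
```

The without-intercept VIFs blow up purely because each predictor's level "explains" the other's level, which is the same pattern I seem to be seeing in SPSS.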

Theory suggests excluding the intercept from the original model, because it would make no sense there. However, when I include an intercept in the original model, the VIFs become quite acceptable (all below 2.00) and the condition index is low (though some variables have high variance proportions).

Also, when running the auxiliary regressions among the independent variables, an intercept is really needed, and theoretically it is acceptable there.

My question is: should I fit the model through the origin but run the auxiliary regressions separately, with an intercept?

I'm worried because I get a high level of collinearity whenever I exclude the intercept from the auxiliary regressions. Should I simply ignore that?