Hello everyone, I'm new to the site, and I'd appreciate your suggestions.

I have a multiple regression model through the origin (no intercept). Working in SPSS, I get a highly significant F statistic, all t-values are also highly significant, and all the model assumptions hold. In spite of this, I see strong evidence of multicollinearity: the VIFs are all very high (all above 10) and the condition index is greater than 25.
I suspect that having no intercept distorts the collinearity diagnostics, because when I run auxiliary regressions among the independent variables (including an intercept in those) I get very low R-squareds in every case. SPSS, however, computes the auxiliary regressions without an intercept, matching the original model, and reports high levels of multicollinearity.
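To see why the two sets of auxiliary regressions can disagree so sharply, here is a minimal NumPy sketch (fabricated data, not your SPSS model; the `vif` helper is written just for this example). Two predictors whose fluctuations are nearly uncorrelated, but which both have large means, produce enormous "VIFs" when the auxiliary regressions are forced through the origin, and VIFs near 1 when an intercept is included:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Two predictors with large means but essentially independent variation:
x1 = 50 + rng.normal(size=n)
x2 = 30 + rng.normal(size=n)
X = np.column_stack([x1, x2])

def vif(X, j, intercept):
    """VIF of column j, from the auxiliary regression of X[:, j]
    on the remaining columns (optionally plus a constant term).

    Without an intercept, R-squared is computed against the raw
    (uncentered) sum of squares, as no-intercept software does.
    """
    y = X[:, j]
    Z = np.delete(X, j, axis=1)
    if intercept:
        Z = np.column_stack([np.ones(n), Z])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum() if intercept else (y ** 2).sum()
    r2 = 1 - ss_res / ss_tot
    return 1 / (1 - r2)

# Through the origin: both "VIFs" come out far above 10.
print([vif(X, j, intercept=False) for j in range(2)])
# With an intercept: both VIFs are close to 1.
print([vif(X, j, intercept=True) for j in range(2)])
```

The point of the sketch: with no intercept, the uncentered R-squared mostly measures how well one column's mean level can reproduce the other's, so any variables with large means look "collinear" even when their actual variation is unrelated.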

Theory suggests excluding the intercept from the original model, because it would make no sense there. However, when I do include the intercept in the original model, the VIFs are quite acceptable (all below 2.00) and the condition index is low (though with high variance proportions for some variables).
Also, for the auxiliary regressions among the independent variables, including the intercept seems necessary, since it is theoretically acceptable there.
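The condition index behaves the same way: it depends heavily on whether the columns are centered, which is one reason the with- and without-intercept runs disagree so sharply. A small sketch (again with made-up data, using one common Belsley-style definition: the ratio of largest to smallest singular value of the design matrix after scaling each column to unit length):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Predictors with large means but essentially independent variation:
X = np.column_stack([50 + rng.normal(size=n),
                     30 + rng.normal(size=n)])

def condition_index(X):
    """Ratio of largest to smallest singular value of X after
    scaling each column to unit length (Belsley-style)."""
    Z = X / np.linalg.norm(X, axis=0)
    s = np.linalg.svd(Z, compute_uv=False)
    return s.max() / s.min()

# Raw columns: the large means make the scaled columns nearly
# parallel, so the condition index is far above 25.
print(condition_index(X))
# Centered columns: the condition index drops to near 1.
print(condition_index(X - X.mean(axis=0)))
```

This is only an illustration of the centering effect, not a reproduction of SPSS's exact diagnostic, but it shows why the same data can look severely ill-conditioned in one parameterization and perfectly fine in the other.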

My question is: should I fit the model through the origin but run the auxiliary regressions separately, with an intercept?
I'm worried because when I exclude the intercept from the auxiliary regressions I get a high level of collinearity. Should I just disregard that result?
You get statistically significant results, right? Why worry about multicollinearity at all? I thought the only problem multicollinearity creates is that it makes it harder to get statistically significant results.
The problem is that I'm very interested in interpreting the parameters.
Damodar Gujarati's book indicates that when the results are significant (t-tests and F-test) and multicollinearity persists, it may not be a serious problem. He also quotes a note from John Johnston's book (Econometric Methods, 1984 edition) saying that this can happen when the true parameters are overestimated (or underestimated) while the t-values remain significant. I have no access to that book.
But my problem is that I'm interested first of all in interpreting the parameters, and the results of the auxiliary regressions without an intercept suggest an extreme degree of multicollinearity.
Do the auxiliary regressions have to include the intercept in order to test the dependence among the independent variables? Or is this a matter of choice?