# Regression with no intercept - VIF

#### jmyles

##### New Member
Why did you remove the intercept? There's almost no good reason for ever doing that. You can introduce immense bias into the model.

#### bobo2

##### New Member
Yes, I know that usually, you shouldn't remove the intercept.
The question is about the rare cases when you do decide to remove the intercept: how do you calculate the VIF?

How do you interpret the results in this case?
What VIF value is suspicious for multicollinearity, and what value definitely indicates multicollinearity?

From the links above it seems that no method is good...?

#### jmyles

##### New Member
To answer your question, despite you not providing an answer to mine: you would calculate VIF the exact same way as you would with an intercept. The VIF for the i-th predictor is the ratio of the variance of its coefficient estimate in the full model to its variance in a model containing that predictor alone; equivalently, VIF_i = 1 / (1 - R_i^2), where R_i^2 comes from regressing the i-th predictor on all the others. The interpretation will not change, but I would not trust it, for the reasons I explained earlier. A common rule of thumb is that VIF > 10 is "high". You may also report the tolerance (1/VIF) instead.
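For concreteness, here is a minimal sketch (my own, not from this thread) of the standard VIF computation described above: regress each predictor on the others, take that fit's R^2, and report 1 / (1 - R^2). All names here (`vif`, `x1`, `x2`, `x3`) are illustrative.

```python
import numpy as np

def vif(X):
    """VIF for each column of X, via auxiliary regressions.
    An intercept is included in each auxiliary fit, matching the
    usual (with-intercept) definition: VIF_i = 1 / (1 - R_i^2)."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = []
    for i in range(p):
        y = X[:, i]
        others = np.delete(X, i, axis=1)
        A = np.column_stack([np.ones(n), others])  # intercept column
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        ss_res = resid @ resid
        ss_tot = ((y - y.mean()) ** 2).sum()
        r2 = 1.0 - ss_res / ss_tot
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.1 * rng.normal(size=200)  # nearly collinear with x1
x3 = rng.normal(size=200)             # independent of the others
print(vif(np.column_stack([x1, x2, x3])))
```

On data like this, the two collinear columns get VIFs well above the rule-of-thumb cutoff of 10, while the independent column stays near 1.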

#### bobo2

##### New Member
Thanks Jmyles,

Sorry, I'm not talking about a specific case; I'm trying to understand the statistics.

In R, for example:
model = lm(y ~ x1 + x2 + x3 + 0)
Should you do vif(lm(y ~ x1 + x2 + x3 + 0)) or vif(lm(y ~ x1 + x2 + x3))?

In https://stats.stackexchange.com/questions/231252/high-vif-after-removing-intercept-in-r they write:
"don't need to keep the intercept in the original model, but keep it for VIF computation"

They are not entirely clear, but since they also write that the VIF calculation is relative to the model, that suggests you should probably use the no-intercept model for the VIF as well:
"Multicollinearity is not an intrinsic property of a data set: it is relative to the model you specify"

So I'm just trying to understand what the common practice is...?