I have a dichotomous dependent variable (a clinical form of multiple sclerosis) and quite a few independent variables that are individually good predictors of it: 10 variables have AUC > 0.8, and each one is significant when used as the sole predictor in a binary logistic regression.

I want to build a regression model with the 4 variables that show the best AUC values individually. I would like the model to include all 4, because a model with more variables has a larger AUC and should therefore predict the outcome (clinical form of MS) better. However, if I enter all 4 variables into the equation, the p values and confidence intervals show most of them to be non-significant. I am fairly sure this is a multicollinearity issue, as the estimates change considerably when I remove one or more of the variables (even though the VIF values are no more than ~3). The largest number of variables for which every variable in the equation remains significant is two.

So my question is: is it possible to build a model with more than two independent variables in this case and somehow overcome the multicollinearity issue, or should I stick to two?

Thanks in advance!